A self-driving car being tested by Google struck a public bus on a city street, a fender-bender that appears to be the first time one of the tech company's vehicles caused an accident. The collision occurred on Valentine's Day and Google reported it to California's Department of Motor Vehicles in an accident report that the agency posted Monday. The car was rolling at 2 mph and the bus at 15 mph. No one was injured.
UNMANNED CARS ARE A STUPID IDEA ON SO MANY LEVELS.
So we have AUTO-nomous cars to give the driver the comfort of the car knowing how to navigate through traffic, yet he has to stay on guard in case it fails. Isn't that added stress? And if it does get into an accident, is the driver even aware of it before impact?
Here’s a good one from the article:
Google said its computers have reviewed the incident and engineers changed the software that governs the cars to understand that buses may not be as inclined to yield as other vehicles.
So now the tech guys are "adjusting" the car's reaction based on the fact that they "think" and "understand" that buses may not yield. So they know the instantaneous reactions of every bus driver across America??
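For what it's worth, "understand" in that quote almost certainly means a statistical prior, not mind-reading. Nobody is predicting the instantaneous reactions of every bus driver in America; software like this typically just assigns different vehicle classes different odds of giving way. Here's a toy sketch of what that kind of tweak might look like. Every name and number below is made up for illustration; none of it is from Google's actual code:

    # Hypothetical Python sketch of a per-vehicle-class "yield prior".
    # Nothing here comes from Google's software; the classes, numbers,
    # and function are invented to illustrate the idea.

    YIELD_PRIOR = {
        "car": 0.90,    # assumed: most cars give way to a merging vehicle
        "truck": 0.75,
        "bus": 0.40,    # lowered prior: buses are "less inclined to yield"
    }

    def should_merge(vehicle_class: str, gap_is_tight: bool) -> bool:
        """Pull into the lane only if the oncoming vehicle will likely yield."""
        p_yield = YIELD_PRIOR.get(vehicle_class, 0.50)  # default for unknown types
        threshold = 0.80 if gap_is_tight else 0.60      # more caution in tight gaps
        return p_yield >= threshold

    # With the lowered bus prior, the car now waits instead of creeping out:
    print(should_merge("car", gap_is_tight=True))   # True
    print(should_merge("bus", gap_is_tight=True))   # False

Whether a single class-wide probability is good enough for a safety decision is exactly the question the comments below are raising.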
Another one:
The car's test driver - who under state law must be in the front seat to grab the wheel when needed - thought the bus would yield and did not have control before the collision, Google said.
I'm sorry, only human beings can have "thoughts," not plastic circuits, therefore you can't program a "thought process." More reasons this is a stupid idea.
So WHO is at fault? The article says it's a "he-said, she-said" even though we are dealing with an inanimate vehicle. If the fender-bender offender was the car, who is liable for insurance purposes: the car, the owner, or the driver? If a car racks up enough self-driving traffic points, is it banned from the roadways? What if it goes rogue??
So many questions for a dumb idea.
Two words in Google's response scare the hell out of me: "inclined" and "yield." Neither word indicates an absolute. Both indicate a choice by others. Putting the words together into "inclined to yield" makes them even scarier if Google is trying to program a response to a possibly harmful, or even deadly, traffic situation.
The best that can be hoped for, and legally allowed, is computer systems that assist, rather than replace, human drivers.
I was about to complete that sentence with "... unless the grid is 100% robotic, as is done in some factory and warehouse situations", but then I remembered that accidents sometimes occur even in those "foolproof" environments.
As the comments above say, it's a dumb idea whose only actual beneficiary is Google itself. Let them buy a few thousand acres somewhere and build some roads so they can play with their toy robotic cars, far away from public roads. If another car clips my fender, my only recourse should not be connecting to a Google app, thus eliminating the satisfaction of looking into the offending driver's eyes as I exclaim, "WTF?!" ...unless he or she is bigger than me. Then I'll remain in the car and just whimper.
I like your style, Chuck. This is just a bunch of techno-egos trying to convince everyone that just because you can develop technology, it MUST be good for us. They should have had a road-race set as kids, or gone slot-car racing, to get this out of their system. As I said before, you will have to pry my cold, dead hands off the steering wheel before I turn my car over to an inanimate circuit board. Hey, let 'em build their alien clown-car tracks at Area 51!!