“Self-driving cars? Madness! What happens when the cars get hacked? Or the electric circuitry gets all weird? Or, even worse, they become sentient evil beings? These cars must be banned before they even hit the road!”
Those are the lines we imagine running through many people’s heads when they first heard Google was working on self-driving vehicles, but Google’s cars have actually proven to be much safer than many people could have anticipated. Chris Urmson, head of the project, revealed that Google’s cars have been involved in only 11 accidents in the past 6 years.
While that number alone should be low enough to quell fears, this fact should drive the point home: none of the accidents were caused by fault of the vehicles, and all resulted in zero injuries and only minor damage. The most common cause of accidents was other drivers rear-ending the driverless cars, which Google says accounted for 8 of the 11 accidents.
Those are some pretty strong numbers to be able to tout, and Google is sure to keep them in its back pocket should it ever face strong resistance from road regulators over questions of safety.
Other choice statistics from Urmson’s latest blog post:
- Google self-driving cars have traveled a combined 1.7 million miles in testing
- Of the 1.7 million miles, nearly 1 million were autonomous, the rest being manually controlled by the test drivers
- Google is averaging 10,000 self-driven miles per week, just under what a typical American drives in a year
- Driver error causes 94% of road crashes
- In America, 660,000 people at any given moment have their attention split while behind the wheel (texting, calling, eating, etc.)
- Meanwhile, Google’s self-driving cars have 100% of their attention focused on all 360 degrees of their surroundings
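If you want to sanity-check those figures yourself, here’s a minimal back-of-the-envelope sketch in Python. It uses only the numbers quoted above; nothing comes from an official dataset:

```python
# Back-of-the-envelope check on the figures quoted above.
total_miles = 1_700_000   # combined test miles (autonomous + manual)
accidents = 11            # all minor, zero injuries
rear_endings = 8          # caused by other drivers, per Google

miles_per_accident = total_miles / accidents
print(f"~1 accident per {miles_per_accident:,.0f} miles")    # ~154,545
print(f"{rear_endings / accidents:.0%} were rear-endings")   # 73%
```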
You can read Urmson’s full post over at Medium right now if you’re interested in a deeper look at how the self-driving car project has evolved into one of the safest driving experiences on the road today.
we love google, we love google, we love google, we love google~~~!!!!
All praise our Google overlords
my car drives itself. I call it ghost riding the whip.
“The most common cause of accidents was other drivers rear-ending the driverless cars.” So… Google’s driverless cars are randomly slamming on their brakes in traffic?
That’s safe. /s
(The assumption they are expecting us to make means they don’t have to state any actual facts regarding safety. Don’t make the assumption they want you to, and suddenly it doesn’t look like such a happy statement.)
See below. “rear-ended” only signifies who was “legally at fault”, but does not in any way speak to how safely the vehicle was or was not operating.
I had a similar thought when I read the article. The Google car must be applying the brakes in an unnatural fashion. It’s called brake checking when a person does it, and it’s not cool.
There are literally millions of rear-end collisions each year. I don’t think 8 over 6 years is necessarily an indication of a problem in operation.
People only brake check when someone is riding their a$$. Don’t ride people’s a$$ and you won’t get checked.
This doesn’t negate the point that “brake-checking” is an unsafe act.
I wasn’t commenting about the safety of the act, only the usual cause.
I suppose this is one less thing to worry about as driverless cars emerge: They won’t brake check. Also, riding their a$$ to get them to speed up will have no effect.
“They won’t brake check”
Well, we can hope. ;-)
Why would it possibly require “hope”? Tailgating is something these cars will do on purpose for the sake of efficiency. The only reason “tailgating” is something we consider bad is that humans don’t have the reaction speed to travel so closely together at speed without colliding. Not only does the use of a computer driver eliminate the need to play psychological driving games (like brake checking), it eliminates the entire situation in which such a practice would take place.
Jokes. They’re not meant to be taken seriously.
Hope? Why would they be programmed to do that?
Sheesh…settle down, folks.
Is your PC programmed to bluescreen? Is your phone programmed to FC?
Lighten up.
tailgating is also an unsafe act
I only do it if you’ve been tailgating me for a long time and are literally so close that you’d hit me if I actually had to stop for something.
8 rear-end collisions in 1.7 million miles (a very low total) implies the exact opposite… they’re braking properly. If they were brake checking, there’d be far more accidents.
How does one apply the brakes “in unnatural fashion”?
Have you ever driven behind a vehicle running on ACC? The way they speed up, especially in heavier traffic, is very unnatural (to me).
Based on my observation, when a human is controlling the speed, he tends to base it on the overall flow and adjust accordingly (this feels natural to me). However, vehicles running on ACC seem to consider only the speed of the vehicle directly in front, and the way they accelerate/decelerate isn’t always natural.
When other vehicles do something unexpected (unnatural), it can lead to a dangerous situation.
I’m not going to “assume” anything. Every driver is responsible for driving as safely as possible. If someone hits you from behind, it is virtually never your fault, regardless of why you stopped. A basic rule of the road requires a vehicle to be able to stop safely if traffic is stopped ahead of it. If it cannot stop safely, the driver is not driving as safely as the person in front.
took the words right out of my mouth
“If someone hits you from behind, it is virtually never your fault, regardless of why you stopped.”
This is completely disingenuous. We’re talking about creating a dangerous situation, not about “who’s at fault”. Fault doesn’t matter here. This is exactly the kind of reaction Google is expecting.
“Oh, if they aren’t at fault, it’s totally safe.”
This is not only an incorrect correlation, but demonstrably false. You can be the most hazardous vehicle on the road while obeying the law. Google saying they were rear-ended completely skirts any possible safety issues by directing the discussion to “fault” rather than “safety”.
You got played.
FWIW:
It is entirely possible to be legally not at fault and still to have caused the rear-ending.
“If it cannot stop safely, the driver is not driving as safely as the person in front.”
The driving safety of the person behind has no bearing on how safely the person in front is driving. I’ve been backed into in parking lots and even had a delivery truck back into a pickup truck, which then got pushed into me. My driving skills didn’t enter into it.
I don’t think you understood fully, so let me clarify and nit-pick it for you. In the very specific example of a car driving forward and hitting another car in the rear, the car that drove into the rear of the other car is at fault, in that specific example only. Of course, if somebody else backs into you, they’re at fault, not you. And if somebody else rear-ends you, forcing your car to rear-end the car in front of you, then fault lies with the person who started the chain reaction. In Google’s case, 8 of the 11 accidents were not Google’s fault because the car was rear-ended by some human driver who should have been driving safely. It’s a common-knowledge rule of the road that you should ALWAYS keep a safe enough distance to be able to stop in time before colliding with the car in front of you.
I was talking about general driver safety, which is not specific to Google. When a police officer or insurance company gets involved, from a legal standpoint someone must be at fault. Whether you feel that’s correct or not doesn’t matter. Both scenarios you were involved in are not the same as a car colliding with the car in front of it while both vehicles are traveling in the same direction.
Which is all aside from the point of vehicle safety in relation to self-driving cars.
Being rear-ended in no way implies the vehicles were operating safely. That’s the point missed when we read “rear-ended” and start discussing blame. Not the point.
Google is, in essence, using the fact that mentioning “rear-end” will derail any discussion of safety by turning it into a blame game.
…and it’s apparently working.
tl;dr
Slamming on the brakes on the highway, at highway speeds, may be 100% legal and “not your fault” if you get hit.
Not the point.
Slamming on the brakes at highway speeds is not safe.
That’s the point.
Claiming the accidents were “rear-endings” does not in any way imply that the vehicle that was rear-ended didn’t do something wrong/stupid/unexpected (which is what Google is trying to imply). While that may not be a legal issue regarding fault, it is most certainly a safety issue.
…
Simply:
From the statement: “They were rear-ended.”
We can easily determine which vehicle was legally at fault.
We cannot determine the self-driving vehicle was, in fact, operating safely.
This is basic logic.
You are absolutely one hundred percent correct. However, it does go both ways. We cannot determine the self-driving vehicle was, in fact, operating unsafely.
Heh.
That was my point, though perhaps not stated clearly (sarcasm never seems to carry over well).
They made a claim that basically didn’t mean anything, as an attempt to imply the vehicle was safe.
One of the accidents in question was a car stopped at a red light. A human driver sideswiped the vehicle. Ignoring the legal jargon surrounding the “rear-ending” situations, you can’t deny that most human drivers are idiots, and a fully autonomous road is a safer road.
“You can be the most hazardous vehicle on the road while obeying the law.” This would be my dad. His claim to fame is that he’s never had a ticket but I have to remind him that it’s because there were no cops around when he was driving in the oncoming traffic lane.
I was thinking more along the lines of my sister doing 50 in a 55 while the rest of traffic blazes by her at 70. (She has issues with people who read “limit” and see “minimum”.)
At this point, if she were to change lanes, she would create a situation where the normal flow of traffic would have to adjust ~20 MPH to avoid collisions. The actions of one law-abiding citizen are putting all of the people around her at risk.
Legal is the opposite of safe in this instance.
These self-driving cars are going to have to be able to tell the difference between safe speeds and the speed limit (while they share the road with human drivers). The two rarely jibe on highways these days.
(I know that in some municipalities officers will pull people over for driving 20 above or below the current flow and give them warnings… which is a start, I suppose.)
This makes me think of the “reasonable and prudent” law in Arizona. Regardless of what the actual speed limit is, if the surrounding traffic is moving at a much faster or slower speed, you have a legal obligation to adjust your own speed to reduce the risk of just what you describe. What’s so stupid about it is that there’s no practical way to prove you were being “reasonable and prudent”, so if you get a ticket for speeding, good luck arguing that you were following the law.
Yep. This gets even more complicated when you add autonomous vehicles into the mix.
It will definitely be interesting to see how it’s handled. I, being ever the pessimist, am quite certain “badly” will describe it quite well. ;-)
In Australia, L and P1 license holders are limited to 90 km/h (roughly 56 mph), while speed limits can often be up to 110 km/h (almost 70 mph). Merging onto a freeway at 90 is a horrifically unsafe experience, but going any faster is unlawful. I’d rather break the law than cause an accident in those situations.
I can’t imagine that the Google Cars have a habit of cutting people off and stopping immediately.
I can’t either – I was simply remarking on the uselessness of the “but they were rear-ended” bit of their spiel.
*shrug*
I found it amusing. I didn’t expect a sort of Spanish Inquisition. ;-)
8 out of 11 were rear-end collisions. Hah. And I wonder how many of those were because the driver was staring at the radar contraption on the roof of the robot car.
1.7 million miles driven and only 11 accidents (roughly 1 accident per ~150,000 miles driven) is even more impressive when you consider the crazy look of the cars. My experience as a driver leads me to think that those 8 rear-end collisions were the self-driving car actually stopping for red lights while the following car assumed the standard human-driver behavior of speeding up to beat the light. Another thing I could believe is that the drivers were distracted by Google… but by the Google on the phone they were staring at, rather than paying attention to the car (with the equipment on the roof that they also didn’t notice) slowing down.
Sure, and rushing through yellow/red lights would be a thing of the past when almost all cars are self-driving, since traffic could interleave without stopping at the intersection instead of taking turns.
Absolutely. I eagerly look forward to the day when this technology takes hold. I can see the day when the traffic news shows that the biggest traffic delays are the result of the self-driving cars having to dramatically slow down to accommodate the presence of a human-piloted vehicle.
Even though I thoroughly enjoy driving, I’d support banning human-piloted vehicles on most roads so that we can take full advantage of the efficiency, safety, and convenience of this technology. I’m not sure that a ban will be needed, however. It won’t take long for insurance companies to see the enormous benefit and start jacking the premiums for human-piloted vehicles through the roof. Only the most absurdly wealthy will even be able to afford to drive, and these are the people who already have people drive them around and can also afford to spend money on track-only cars, an activity they will still be able to engage in for sport.
With your logic, that would still be the fault of the driver and not the autonomous car. By the VTL you’re supposed to slow down, not speed up, at a yellow signal. If they’re so busy looking at the equipment on top of the car ahead of them, it’s still their fault. They could’ve been looking anywhere: at a billboard, at some girl/guy walking on the street, etc.
Just yesterday I told a buddy of mine that I think it would be much safer if only robotic cars were on the road. No more fokers would cut off other people, force them to switch lanes, etc., etc. I would love the roads of tomorrow.
The issue is that you will likely never have that. There will always be people who want control of their own vehicle, and there will be antique collectors who will want their older (non-robotic) cars to be able to go on the road. The real issue is whether or not Google’s cars can handle that. So far so good, but I would also want to know where that 1.7 million miles was driven. Was it in LA, DC, NY, or Atlanta? I doubt it. Other drivers would have likely forced the robo-car off the road or run it over.
Astro Boy.
Johnny Cab?
Are we sure that the accidents weren’t caused by the autopilot? Because it sounds like the MOST IMPORTANT thing Google would want to cover up. Even one accident could be enough to ban this thing. I wouldn’t put it past Google’s employees to cover something like that up if it were to happen.
That being said, even if all of them were caused by the software, it is still safe enough at this point of testing for me to want to buy one when it finally makes it into an actual product.
I’m not sure of anything, but the numbers suggest a lot. 11 accidents in 1.7 million miles, and 8 of those were the auto-piloted car getting rear-ended. The accident rate is already well below that of human-piloted cars.
True, but given how bad publicity practically killed the first version of Google Glass, I think Google won’t risk any bad publicity with this. This is too big and costly; it can’t fail… to the point where I think they WOULD lie/cover it up if that were the case.
Not saying that this is a cover up, just being skeptical is all
Regardless of whether they were caused by the autopilot or not, 11 accidents with no injuries in that many miles driven is very low, something Google would want to shout from the rooftops, not cover up.
It’s like shouting that your OS has very few bugs… as few as they may be, having any is just wrong.
My new OS has only a 1 in 10000000000000000000000000000000000000000000 probability to turn into SKYNET, so it’s safe enough. Load up those nuclear codes!
Does your tinfoil hat need adjusting?
yes
“Even one accident could be enough to ban this thing.” This is an unreasonable standard by which to judge the safety of a self-driving car. We would not apply the same stringent standard to human drivers.
Self-driving cars should be judged by a more reasonable standard, perhaps similar to how we would judge the safety of human drivers. See how one of these self-driving cars does on a standard driver’s exam and compare it to how human drivers do.
Nah, with human drivers you know who to blame/sue (driver or car manufacturer, and usually it is very clear which of the two is at fault).
With automatic cars, do you sue the manufacturer of the OS, the manufacturer of the software that runs on the OS, the driver of the car, or the manufacturer of the car?
I am also not saying this is the RIGHT course of action for judging a car’s safety, just saying that competitors/haters/protesters will blow it up in the media, killing the concept before it reaches general acceptance (coughGoogleGlasscough).
There will always be people who will resist self driving cars regardless of the benefits over human driven cars. I can understand the liability issue when it comes to insurance, but I do expect that these legal issues will be sorted out before they are certified and can be sold to the public.
I expect self driving cars will hit the roads at some time in the future. I believe that they can be made safer than human driven vehicles. Any protests will at best delay their implementation.
True, but with an investment you want to see the return on it ASAP.
So if car manufacturers who don’t want it lobby the government, an accident caused by the auto-car can be blown out of proportion and delay the roll-out. That is why I think Google and other auto-car manufacturers have a big reason to cover up IF such an accident (regardless of severity) were to occur.
That’s all I am saying. Personally, I hate driving (I like being driven), so I can’t wait till I get to own/lease/rent one.
By the logic of “one accident would be enough to ban this thing”, we should ban all human drivers after their first accident. I’m not entirely opposed to this, as it would mean there’d be about thirty drivers left on the planet. Getting to work would be so much easier.
Telling me how many accidents happened without telling me how many cars are on the road = not telling me anything. But it sounds like a good sign.
That wouldn’t matter. The unit that’s important is miles logged. All these vehicles are equal, using the same system. Whether it’s one car out there or 1k, it doesn’t matter. Distance covered, or actual use of the system, is what’s important.
So by your reasoning, one car driving 100 miles all by itself has the same chance of getting into an accident as 100 cars all driving one mile on the same road? Except in one scenario there would be nothing to hit. So, no, I think the number of cars out there driving is quite important.
I thought you meant autonomous vehicles on the road.
What you’re asking will never be known.
Ok, I’ll try this another way since you seem to be having trouble with this. Take Google’s 1 million miles driven for 11 accidents. Would that stat mean the same thing to you if I said there were a total of 5 Google cars in existence as it would if I told you there were 10,000 Google cars in existence? I would certainly give more weight to the larger sample size of vehicles.
The one that seems to be having trouble is you. Make up your mind if you’re talking about autonomous vs human driven vehicles because you keep flip flopping.
If you’re referring to total Google autonomous vehicles, that is still not a valid factor. Why? Because they’re testing the system itself, not the actual vehicle. The cars DON’T MATTER; it’s the autonomous system that matters. The hardware is being used by numerous car manufacturers and it works; it’s the coupling of the software with the given hardware that matters.
Similar to testing an OS on a device. We’re testing the OS, not the device.
“…would certainly give more weight to the larger sample size of vehicles.”
Obviously, a larger sample size of system use is more representative, but you can’t disregard the given data completely.
And to quote your original statement
“without telling me how many cars are on the road = not telling me anything.”
In regards to your prior statement:
“So by your reasoning, one car driving 100 miles all by itself has the same chance of getting into an accident as 100 cars all driving one mile on the same road? Except in one scenario there would be nothing to hit. So, no, I think the number of cars out there driving is quite important.”
If you’re referring to vehicles AROUND the autonomous vehicle, still not a factor. Humans get into accidents with obstacles and other vehicles all the time.
It’s similar to the question of ‘if a tree falls in the forest does it make a sound’. An accident is an accident, and a sound is a sound, no matter the surroundings.
“So by your reasoning, one car driving 100 miles all by itself has the same chance of getting into an accident as 100 cars all driving one mile on the same road?”
All other things being equal, yes.
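For what it’s worth, here’s the point as a minimal Python sketch. It assumes a constant per-mile accident hazard, which is a deliberate simplification (real risk varies with traffic, weather, and location):

```python
# Minimal sketch of the "exposure" argument: under a constant
# per-mile hazard, the expected number of accidents depends only
# on total miles driven, not on how the miles are split across cars.

def expected_accidents(cars, miles_per_car, rate_per_mile):
    # Expected count is linear in total exposure (vehicle-miles).
    return cars * miles_per_car * rate_per_mile

# Per-mile rate implied by the article's figures (11 in 1.7M miles).
rate = 11 / 1_700_000

print(expected_accidents(1, 100, rate))   # one car, 100 miles
print(expected_accidents(100, 1, rate))   # 100 cars, 1 mile each: same
```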
So, the one car by itself, who will it collide with?
How do we know Google’s cars are safe? Because they said so.
I would expect that at some point, the self-driving cars will need to be independently tested to determine their safety and road worthiness. I would be interested to see how well these cars performed compared to human drivers.
Oh there will be scandals.
Until they start letting them out in anything besides nice sunny weather, these statistics mean very little. Collisions go way up any time adverse weather conditions come into play. And since I have yet to see a vehicle whose anti-lock brakes actually work right in ice and snow, I’m not going to be trusting all the rest of the car’s controls to a computer anytime soon.
I’d trust a computer driving in ice and snow more than the current crop of morons on the road. I think a computer would exercise common sense and NOT drive 70 mph on a wet or icy road, unlike 99.9% of expensive car owners, who seem to drive more recklessly the more they paid for their car.
I don’t know where you are getting this flip-flopping from. You try to dissect me too much. I want to know how many Google cars are generating that statistic they are touting, and in what conditions they are driving. I like the stat they dangle in front of me, sure, but I can’t conceive what it really means without knowing those basics. Lies, damned lies, and statistics. And mileage is not the same everywhere. As for the rest of your nonsensical yapping: Google is testing its system, great, but what they’re doing here is advertising to you through this story and trying to build public confidence in their product. You seem to be eating up what they’re selling, which is fine, and you may be right. I would just like a little more info before I take that stat to mean anything.
If it can take me to Vegas and back to LA… I want it now!!!!
Google needs to stop. Self driving cars will not be a thing for decades.