GOOGLE'S SELF-DRIVE TECHNOLOGY CANNOT ERASE HUMAN FLAWS
In the news (yet again), Google's pioneering self-drive car has had its first injury-involved accident. The car involved was a Lexus SUV fitted with the self-drive sensors and cameras. On 1st July it was travelling through traffic in Google's home town of Mountain View when it was rear-ended on the approach to an intersection. The two cars in front of the Google car stopped at the junction; the Google car, travelling at 15mph behind them, also stopped, while a fourth car travelling at 17mph failed to stop and collided with the Google Lexus. In-car telematics showed that the fourth car, which was responsible for the crash, did not brake. Its driver clearly cannot have been paying due attention to the traffic in front, and although no police accident report was filed, Google did file a report with the California Department of Motor Vehicles.
In this incident three passengers in the Google car were injured. One was the driver that California law requires to be present in order to take control of the self-drive car in an emergency, the second was a passenger Google employs to take reports and observations on its cars' journeys, and the third was another employee. All suffered only minor whiplash and were cleared for work.
Google has been in the media before regarding its new self-drive technology. It has a lot to prove and is very keen to ensure that none of the accidents we see in the media are blamed on its technology failing. Indeed, it is not wrong: so far there have been 14 collisions, 11 of which involved its self-drive cars being rear-ended. Another collision was a Google car rear-ending another car, but that turned out to be an employee driving at the time, not the self-drive technology.
Chris Urmson, the head of Google's self-drive technology, recently blogged that Google's SUVs are being hit "surprisingly often" by distracted drivers, perhaps people looking at their phones. What Google believes is that these crashes are doing nothing to damage the reputation and credibility of its self-drive programme; quite the contrary, they are actually proving an argument in favour of its technology.
URMSON STATES: "THE CLEAR THEME IS HUMAN ERROR AND INATTENTION. WE'LL TAKE THIS AS A SIGNAL THAT WE'RE STARTING TO COMPARE FAVOURABLY WITH HUMAN DRIVERS"
In a telephone interview Urmson commented that his team were currently exploring whether their cars could do anything to alert distracted drivers before a collision. He said they had considered honking, but worried this would annoy the residents of Mountain View.
Quite clearly Mr Urmson is enjoying a humorous take on the events in Google's home town, and on the media's frenzy over the overstated collisions. After all, he does seem to make a good point: Google has tested its cars over 1.9 million miles with 20 prototypes, and 14 collisions is possibly not a lot, especially when they have so clearly been due to human error in other cars.
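For what it's worth, a quick back-of-envelope check of the figures quoted above (1.9 million test miles, 14 collisions, 11 of them rear-endings) gives a sense of scale. The arithmetic below is just an illustration of those reported numbers, not an official Google statistic:

```python
# Back-of-envelope collision rates from the figures quoted in this post.
test_miles = 1_900_000
collisions = 14
rear_endings = 11

# Collisions per million miles driven, and the share that were rear-endings.
per_million = collisions / (test_miles / 1_000_000)
rear_share = rear_endings / collisions

print(f"Collisions per million miles: {per_million:.2f}")   # about 7.37
print(f"Share that were rear-endings: {rear_share:.0%}")    # about 79%
```

Roughly seven collisions per million miles, four out of five of them from behind, which is exactly the pattern the rest of this post takes issue with.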
BUT THAT'S JUST THE POINT…
The roads are full of other drivers who do not plug into a laptop or sync with a Google control system; drivers with a better battery life and unprogrammed thoughts! And the issue here is not that the Google cars are better drivers at all. We as humans can very often predict and react to the mistakes of others and adjust our driving to avoid collisions.
WE CALL IT DEFENSIVE DRIVING.
AIM: TO REDUCE THE RISK OF COLLISION BY ANTICIPATING DANGEROUS SITUATIONS, DESPITE ADVERSE CONDITIONS OR THE MISTAKES OF OTHERS
I'm sure that Google cars will work perfectly on a road populated only by other self-drive vehicles. But that is not what they will be doing… well, I personally hope not for a very long time! So the question is not "can they drive?", but rather: can they drive safely with other road users?
Let's imagine a scenario where a self-drive car encounters a learner driver. It is likely that the learner will at some point make an error of judgement. With normal road users we would already (here's being hopeful again!) expect this might happen, and we would give the learner more space and consideration. We might predict that as they come towards us they could cut the corner and enter our own carriageway as they struggle to master the steering, so we adjust our own course to fit around them. We might not be happy about it, but we know these things can and will happen and that we all make mistakes… well, all of us of the non-robot variety! Examples of such scenarios are not hard to imagine: we've all seen the car in front behaving hesitantly, suspected it might try to change lanes without indicating, and treated it with caution. Most of these suspicions are right; we can pretty much rely on each other to make errors of judgement. If we couldn't, insurance companies wouldn't be able to charge young drivers more.
THIS PREDICTIVE SENSE IS OUR HUMAN INSTINCT.
The number of times I will say to a passenger whilst driving "I knew they were going to do that!", as someone (very often in a white van) tries to cut into an impossible space or a young lad pulls out at the last minute, causing me to slam my brakes on, is considerable. As an experienced driver I could see the warning signs and anticipated their error. This is human instinct, and what Google calls "a collision that was not our fault". Yes Google, your driverless car may not have caused the collision… but it didn't avoid it either. When all other road users have to take a driving test assessing not only their ability to get from A to B but to do it safely, it seems crazy that a driverless car doesn't. If they cannot drive safely with other road users, then I'd say that is a fail!
If we look at the majority of the collisions that the driverless cars have so far suffered, they have mostly involved other road users hitting them from behind. In defensive driving, had we been in the Google car's position, we would be checking our rear-view mirror, and if it looked like a car coming towards us could not stop in time, we would move forward or to the side, into the extra space left between ourselves and the car in front, thereby avoiding such an accident. You would have thought that after so many rear-end collisions the driverless car would have learned from his (her? its?) mistakes. It could learn and adapt, building up road experience so it could read the road better and avoid such accidents in the future… but I guess it's just not that clever, and after all, you could just add a horn!
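The mirror-check described above is really just a simple decision rule, and it's not hard to sketch. The function below is a hypothetical illustration, not anything from Google's actual software: it estimates the approaching car's stopping distance from basic physics (distance = speed² / (2 × deceleration)) and says whether to use the space ahead. The 6 m/s² braking figure is an assumed typical hard-braking deceleration.

```python
# A minimal sketch of the defensive-driving rule described above.
# All names and numbers here are illustrative assumptions.
def should_make_space(closing_speed_ms, gap_behind_m, space_ahead_m,
                      max_decel_ms2=6.0):
    """True if the car behind likely cannot stop in the gap and we have room ahead.

    closing_speed_ms: speed of the approaching car relative to us, in m/s.
    gap_behind_m: distance between our rear bumper and that car, in metres.
    space_ahead_m: free space between us and the car in front, in metres.
    """
    # Idealised stopping distance under constant deceleration: v^2 / (2a).
    stopping_distance = closing_speed_ms ** 2 / (2 * max_decel_ms2)
    return stopping_distance > gap_behind_m and space_ahead_m > 0

# The crash above: ~17 mph (about 7.6 m/s) closing, with (say) 4 m behind us
# and 3 m free in front. Stopping needs ~4.8 m, so the rule says move.
print(should_make_space(7.6, 4.0, 3.0))  # True
```

Of course a real system has to weigh far more than this (the car ahead, pedestrians, whether moving makes things worse), but the point stands: the information to anticipate a rear-ending was available.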
For the moment at least, Google is happy to report on collisions; after all, it can say they were all human error, and that suits its cause too. But I will always be in favour of a human-driven vehicle. What can I say? "I prefer to keep my bumper intact!"
Image: The India Times (picture inserted at top of blog)