Current self-driving cars have a weakness

You have probably heard the news of the death of Joshua Brown, a popular Tesla Model S owner and enthusiast, who died in a freak accident recently. Autopilot failed him at a critical moment. As his car approached a junction on Autopilot, a trailer made a sudden pull in front of the car. Perhaps it was the suddenness of the move, perhaps the car was fooled by the colour of the trailer against the sky, but Autopilot failed to steer away from it. The car ended up underneath the trailer and off the road. The accident revealed that self-driving cars have a weakness.

Cool, Not Perfect

The Autopilot feature is a cool, even highly useful, thing to have in a car. Think of the convenience. Yes, it also enhances safety. Yet, I have always suspected that there would be scenarios in which self-driving cars would be more limited than a human driver. And those scenarios are the critical ones. Let me explain.

On a road where every other vehicle is self-driving, driving around would be 100% safe, or close to it. Each of those vehicles would be operated by pure computer logic. However, as long as there are other cars out there driven by humans, the Autopilot car is suddenly at risk in some situations.

Self-driving cars have a weakness: Humans

The reason: human beings do not always act logically or rationally, and it is at those points that the self-driving car is at a disadvantage.

The Tesla Model S uses a combination of a forward-facing camera, forward radar, ultrasonic sensors and GPS to be aware of the road and the vehicles ahead of and around it. But it cannot (yet?) read human expressions and nuances. Its huge weak point is its inability to read humans.

Right Of Way Issues

Have you ever driven in Lagos or taken a road trip from Lagos to Zaria? Many times, it is eye contact with the driver of another vehicle that gives you a hint of what he is about to do, without warning. Sometimes, it is a quick glance at the front wheels of the truck ahead of you that hints that something is wrong or about to go badly wrong.

If the self-driving car has the right of way, but a grumpy driver of a danfo or BRT (both commercial bus types in Lagos) suddenly acts on an impulse to pull a James Bond, what happens? If that driver pulls out from a side lane into the path of the Autopilot car, I suspect that the automation will fail to read all the signs that a human would have read in that situation. Humans act irrationally everywhere.

Maybe self-driving technology will get to that point of proficiency sometime soon. Or maybe not. Perhaps at some point in time, human driving will be phased out totally. But there can be no doubt that the greatest threat to the safety of a self-driving car is other cars driven by humans.


  1. Perfect timing.

    A few minutes ago, I read a post about the arrival of the first driverless car in Nigeria. The first thing that came to mind was: how on earth is it going to run smoothly in Lagos traffic?

  2. The accident revealed that self-driving cars have a weakness.

    No, I don’t think this accident revealed any weakness. The guy who died was going at a terrific speed, watching a DVD. If a vehicle swerved into your lane while you were going at 180 km/h, there would be an accident whether a human is driving or a robot is in control.

    It is amusing that, given the number of fairly accident-free kilometres racked up with the help of driver-assist, a single accident is being blown out of proportion to make it look like there is something inherently wrong with driverless technology.

    No human-made system can be perfect, but it is clear that humans are better off away from the steering wheel.

    Accidents will happen. Even a human driver can’t avoid some kinds of accidents. Why do we think even the best automated driving technology would be able to do so?

    If a self-driving car with a human on board has to choose between crashing off a hill and crushing a jaywalker to death, which would it choose? Would a human driver even be trusted to do the right thing (and what is the right thing)?
