There are plenty of reasons to applaud the innovations behind autonomous vehicles, or so-called ‘self-driving’ cars. Visually impaired individuals could, in theory, gain greater independence in daily tasks like getting to work or to the market. Commercial transit stands to improve as well, with long-haul truck drivers no longer covering massive distances on very little sleep while operating a potentially deadly vehicle.


Unfortunately, too many people seem to be gleefully rubbing their hands in anticipation of the day when a self-driving car gives them free rein to drink and drive, or to operate a car while otherwise under the influence. Since even self-driving cars still require at least one sober, reliable driver ready to react and switch off the autopilot, the DUI idea is frightening, to say the least.

But there’s a danger that too many giddy consumers are overlooking, and that’s the very real threat of drivers who intentionally employ aggressive driving tactics. Whether it’s speeding, tailgating, weaving in and out of traffic, or other deliberate and dangerous behaviors, the autonomous car can only do so much. Once the autopilot is switched off because the driver doesn’t want to travel at 30 mph in a school zone, the great innovation is once again no safer than any other car on the road.

And that’s where software comes in, according to an op-ed by Janusz Zalewski, Ph.D., a professor in the software engineering department at Florida Gulf Coast University. Dr. Zalewski’s research focuses on safety-critical systems and security mechanisms, and he argues that without software to curb aggressive driving, the autonomous car is just another cool gadget.

“What is at stake is how various kinds of disturbances affect the drive, to which the software must appropriately respond. Among those often dangerous elements one can include interacting with pedestrians, who are very unpredictable, sudden changes in driving conditions (especially weather), vehicles entering traffic unexpectedly or unexpectedly stopping in the middle of a road, drivers with malicious intentions, zombie drivers, and “hit-and-run” incidents. How do you write software to respond to something which is so unpredictable? It’s not the usual but definitely the unusual which makes it difficult.”

Unfortunately, Dr. Zalewski concludes that we’re decades away from resolving the safety and security concerns that mixing self-driving cars with human drivers raises. And given the startling statistics on aggressive driving and road rage, it appears the real work should focus on the humans who have to release control of their cars to a computer, not the other way around.