Tesla Model S and Chevy Bolt incidents prove autonomous vehicles still have a way to go.

Automakers GM and Tesla are once again under fire over safety concerns, this time involving two separate collisions that occurred while their vehicles' "autopilot" features were enabled. The Tesla slammed into a parked, government-owned emergency vehicle, while the GM car struck a motorcycle, injuring its rider.

Details of both incidents remain murky. The Tesla crash, in which the car struck a parked vehicle at 65 miles per hour, is being investigated by the National Transportation Safety Board; investigators hope to determine why the Tesla failed to brake while its self-driving mode was engaged. Further complicating matters is the update Tesla pushed to all of its vehicles following a traffic fatality in 2016: self-driving models are now supposed to come to a complete stop if the driver's hands leave the wheel. Either that safeguard failed in this instance, or the vehicle failed to stop for a large object in its path even though the driver was still interacting with the controls.

Bolt from the blue

As for the GM crash, a Chevy Bolt changed lanes and sideswiped a motorcycle. The police report said the motorcyclist was at fault for changing lanes too suddenly. The injured rider, however, disputes that account, saying the Chevy simply changed lanes into his path for no apparent reason.

New regulations

These incidents come at a fragile moment for self-driving technology. Regulators are already considering rules that would allow a vehicle to operate on public roads with no one in the driver's seat. GM, for its part, has announced plans to launch a vehicle next year with no driver controls at all: no steering wheel, no pedals.

Assist, not full auto

Some of the "user error" in these and similar crashes may stem from the technology's name. Experts worry that the term "autopilot" misleads customers, since the feature functions more like "driver assistance." The issue echoes a courtroom urban legend about an RV owner who set the cruise control at 70 mph, got up to make a sandwich, and crashed; as the story goes, the owner sued Winnebago for failing to warn him that he couldn't leave the driver's seat with cruise control engaged, and won millions. One would hope that drivers with enough disposable income to buy vehicles at this price point would know better.