A report suggests states should test and certify future autopilot cars’ mettle before turning them loose on the roads
By Jim Gorzelany
CTW Features
Every state in the union requires motorists to pass a battery of tests in order to be licensed to drive, so why not hold tomorrow’s autopilot vehicles to the same standards? That’s a notion recently discussed in a white paper written by Michael Sivak and Brandon Schoettle of the University of Michigan’s Transportation Research Institute.
Just as human drivers vary from person to person in eyesight, knowledge of the rules of the road, psychomotor skills and driving experience, so too can the performance of autonomous vehicles be expected to differ from one model to another. Already, the website Autoblog reports that within a few days of the software’s release, the luxury-electric Tesla Model S sedan’s rudimentary autopilot caused its owner to be issued a speeding citation in Florida for doing a lead-footed 75 mph in a 60-mph zone.
While Sivak and Schoettle say self-driving vehicles should have no problems passing even the strictest visual-acuity test, they argue that autopilot vehicles could be expected to perform differently depending on the quality of their sensing hardware, special maps and software algorithms.
Just as humans’ visual acumen varies, road-monitoring cameras and sensors might perform better or worse under rainy or snowy conditions, depending on the model. Onboard sensors and cameras can easily become mucked up by snow or mud, for example. On that point, the report quotes a Google spokesperson as admitting the company “doesn’t intend to offer a self-driving car to areas where it snows in the near term.” (Google continues to test a small fleet of self-driving cars extensively in preparation for their introduction in the coming years.) Potential problems could likewise arise with the sensors’ visual pattern recognition, a skill the authors say human drivers excel at but one that’s difficult for computers to master.
While we’d like to hope self-driving cars would come fully programmed to obey state and local traffic laws, there’s the possibility they might adhere to the regulations a tad too strictly for their own good. For example, it would be perilous for a self-driving car to merge onto a highway while steadfastly adhering to the posted speed limit if prevailing traffic is whizzing by at a much faster pace.
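To make that trade-off concrete, here is a minimal sketch of how a car’s planner might weigh the posted limit against prevailing traffic when merging. It is purely illustrative: the function name and its tolerance parameter are invented assumptions, not anything drawn from the report or any real vehicle’s software.

```python
def choose_merge_speed(posted_limit_mph, prevailing_speed_mph, tolerance_mph=5):
    """Illustrative only: pick a merge speed that favors safety over strict legality.

    If surrounding traffic is moving well above the posted limit, merging at
    the limit itself creates a dangerous speed differential, so this sketch
    allows matching traffic up to a small, bounded overage above the limit.
    """
    if prevailing_speed_mph <= posted_limit_mph:
        # Traffic is at or below the limit: simply match the flow.
        return prevailing_speed_mph
    # Traffic is faster than the limit: close the gap, but cap the overage.
    return min(prevailing_speed_mph, posted_limit_mph + tolerance_mph)

# Example: 60-mph zone with traffic moving at 75 mph -> merge at 65 mph.
print(choose_merge_speed(60, 75))  # 65
```

The design choice in this toy version is simply that a large speed differential is treated as more dangerous than a small, bounded amount of speeding.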
Autopilot cars are further likely to face critical ethical dilemmas in emergencies, when they may be forced to choose among several more or less perilous courses in a split second. An autonomous car might have to decide whether it would be best to slam on the brakes but ultimately crash into an obstruction, risking injury, or worse, to the car’s occupants; to swerve in one direction to avoid a collision and possibly veer into oncoming traffic; or to steer in the other direction and perhaps run over a pedestrian standing at curbside.
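One way to picture such a split-second decision is as a minimum-expected-harm choice among a few candidate maneuvers. The sketch below is purely illustrative: the maneuvers, probabilities and severity scores are invented assumptions, not figures from Sivak and Schoettle.

```python
# Illustrative only: pick the maneuver with the lowest expected harm.
# All probabilities and severity scores below are invented for this example.
maneuvers = {
    "brake hard into obstruction": {"p_harm": 0.5, "severity": 5},     # risks the occupants
    "swerve left into oncoming lane": {"p_harm": 0.4, "severity": 9},  # risks a head-on crash
    "swerve right toward the curb": {"p_harm": 0.3, "severity": 10},   # risks a pedestrian
}

def expected_harm(outcome):
    # Expected harm = probability of a harmful outcome times its severity.
    return outcome["p_harm"] * outcome["severity"]

best = min(maneuvers, key=lambda m: expected_harm(maneuvers[m]))
print(best)  # -> "brake hard into obstruction" (0.5 * 5 = 2.5, the lowest here)
```

Even this toy version shows why the dilemma is ethical rather than merely technical: someone must decide, in advance, how those severity scores are assigned.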
Ultimately, the authors conclude, the technology driving autopilot cars is still not perfect, and one model can be expected to perform differently than another under certain circumstances, whether by design or by default. A given car’s self-driving acumen could be better or worse during daytime or nighttime hours, over smooth or broken pavement, or in clear versus rainy or snowy conditions.
Sivak and Schoettle certainly make a strong case for rigorous testing and licensing of autonomous cars, albeit according to a series of still-to-be-written standards. And if this does in fact come to pass, two things are certain: Like human motorists, autonomous cars will be subject to an intolerably long wait at the local DMV office, and their driver’s license pictures will look nothing like them.
© CTW Features