California is one of the states where autonomous vehicles are being tested, but according to one professor, there is a flaw in how these vehicles learn to drive. He says that self-driving cars should not be taught to drive like humans, because they then make the same errors that humans do.
The professor, who teaches at Arizona State University, says this is the reason an autonomous car killed a pedestrian in March. The car was traveling on a dark stretch of road, and the pedestrian stepped in front of it where there was no crosswalk. However, the professor says that self-driving cars should never travel at a speed that exceeds their ability to stop if an obstacle appears in front of them. In this case, he points out, the car was behaving like a human driver: it assumed the road ahead was clear even though it had no visual confirmation. Autonomous vehicles, he says, must instead proceed as if there are obstacles in the areas they cannot see.
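The professor's rule can be expressed as a simple stopping-distance calculation: total stopping distance (reaction distance plus braking distance) must not exceed the distance the car can actually see. Below is a minimal sketch of that reasoning; the specific numbers (sight distance, deceleration, reaction time) are illustrative assumptions, not figures from the article.

```python
import math

def max_safe_speed(sight_distance_m, decel_mps2, reaction_time_s):
    """Highest speed (m/s) at which a car can still stop within
    the distance it can see.

    Stopping distance = reaction distance + braking distance:
        v * t_r + v**2 / (2 * a) = d
    Solving this quadratic for v gives the speed returned below.
    """
    disc = reaction_time_s ** 2 + 2 * sight_distance_m / decel_mps2
    return decel_mps2 * (math.sqrt(disc) - reaction_time_s)

# Illustrative values: headlights reveal ~30 m of dark road,
# braking at 7 m/s^2, 0.5 s between detection and full braking.
v = max_safe_speed(30.0, 7.0, 0.5)
print(f"max safe speed: {v:.1f} m/s ({v * 3.6:.0f} km/h)")
# -> max safe speed: 17.3 m/s (62 km/h)
```

Under these assumed conditions, any speed above roughly 62 km/h means the car is outdriving its sensors, exactly the human-like behavior the professor criticizes.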
He works on computer systems that control physical objects with guaranteed response times. For example, he says, his team examines how an autonomous vehicle could stop within a millisecond of detecting an obstacle. Autonomous cars must not make errors, he points out, because an accident could destroy the industry.
Because human error, such as drowsy driving or driving under the influence, causes most car accidents, autonomous cars are still expected to make roads safer even if they do not have a perfect record. However, it will be years before they are widespread, and in the meantime, human drivers will continue to make mistakes that lead to accidents. People who are injured in a car accident and who are struggling to obtain compensation from the responsible driver’s insurance company might want to consult an attorney.