Self Driving Uber Accident
The first gasoline-powered automobile was invented by Karl Friedrich Benz in 1885, and the first recorded automobile accident occurred soon after, in 1891, when a car ran into a tree. Legend has it that in 1895 the only two automobiles in the entire state of Ohio collided with each other, a story that, while unproven, makes a pretty good joke. The fact is, new technology inevitably brings new risks.
New vs. Old Technology
Modern cars have become ever more autonomous. Backup cameras, collision avoidance systems, and parallel parking assist have been around long enough to be almost commonplace, but the ultimate step in automotive technology, the fully self-driving car, is still a recent innovation. Twenty-two states have enacted legislation related to self-driving vehicles, and the governors of another ten states have issued executive orders on the subject. Road tests of self-driving vehicles have been quietly taking place in California, Ohio, and Arizona since 2015, although the tests have required a human driver to be in the vehicle in case of emergency or malfunction. The governor of Arizona, Doug Ducey, recently signed an executive order allowing road tests without a human driver. After a fatal accident involving a pedestrian on March 18, however, road tests in Arizona have been suspended.
A self-driving Uber vehicle with a human safety driver in the driver’s seat struck and killed a 49-year-old woman as she crossed the road; investigators are still trying to determine how both the autonomous vehicle and the safety driver could have failed to see her. Apparently, the safety driver was distracted immediately before the accident and did not notice the woman until it was too late. It is still unclear how the vehicle malfunctioned, but investigators have speculated that the car’s software registered the pedestrian as a “false positive.”

The object avoidance software in an autonomous vehicle can be set to different levels of sensitivity. At the highest level, the car would stop whenever an object was detected in or near the vehicle’s path. That would, of course, cause the vehicle to stop or slow constantly for objects that pose no danger, such as a plastic bag or a pine cone in the road or a bicycle on the sidewalk, producing a very slow, jerky ride. At lower sensitivity levels, the ride is faster and smoother, but the risk of failing to stop or swerve for a dangerous object increases. The vehicle must be programmed to stop or swerve only when a detected object actually poses a danger.

If a vehicle fails to identify a person or another vehicle, the question of liability arises: who bears the responsibility for such a mistake? The programmer? The manufacturer? The owner of the vehicle? The human safety driver, if there is one?
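The sensitivity trade-off described above can be pictured as a simple confidence threshold: the perception system assigns each detected object a hazard score, and the vehicle brakes only when some score clears a tunable cutoff. The sketch below is a purely illustrative simplification; the function name, the scores, and the scene are invented for this example and do not reflect Uber’s actual software.

```python
# Illustrative sketch only: a real autonomous-vehicle perception stack is far
# more complex. Here each detection is a (label, hazard_score) pair, and the
# planner brakes when any score clears a tunable threshold.

def should_brake(detections, threshold):
    """Brake if any detected object's hazard score clears the threshold."""
    return any(score >= threshold for _, score in detections)

# Hypothetical sensor readings: a windblown bag scores low, while a
# pedestrian crossing mid-block scores only moderately high.
scene = [("plastic_bag", 0.35), ("pedestrian", 0.55)]

print(should_brake(scene, threshold=0.3))  # True: sensitive setting brakes, even for the bag
print(should_brake(scene, threshold=0.7))  # False: permissive setting misses the pedestrian
```

A low threshold yields the slow, jerky ride described above; a high threshold yields a smooth ride at the cost of missed hazards, which is the "false positive" failure mode investigators speculated about.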
The Rise of Technology
As tragic as the Arizona incident was, tests of autonomous vehicles will continue, and eventually, self-driving cars may become a common sight on America’s roads. Human drivers cause millions of accidents every year, and proponents of autonomous vehicle technology believe that self-driving cars will actually be safer than human-driven cars. That may eventually be true, but no technology is foolproof, and accidents will occur. Insurance companies will have to develop guidelines for insuring autonomous vehicles, courts will have to determine which parties are responsible when an autonomous vehicle does cause an accident, and personal injury attorneys will have to keep up with the latest developments in autonomous vehicle technology in order to represent clients who have been injured in such accidents.
If you or a loved one has been hurt in an auto accident, please contact us immediately at 704-769-2316.