Automakers have been testing their own self-driving cars on public roads and racing to launch them as soon as possible to gain a first-mover advantage. Governments have been granting permission to test autonomous vehicles, which are widely perceived as a promising next-generation technology. That technology is now being questioned after recent incidents that cost human lives. The fatal accident involving Uber’s self-driving car in Arizona killed a pedestrian. As a result, the company’s technology came under scrutiny, and Uber halted its tests in other locations as well. The incident drew criticism from around the world and raised questions about the deployment of the technology.
Less than two weeks after the Uber incident, another accident took place. The circumstances were different, but the technology in question is similar. A Tesla Model X electric crossover SUV crashed in Mountain View, California. The accident killed the driver, Walter Huang, an Apple engineer, according to the Bay Area’s ABC 7 News. The vehicle struck a concrete lane divider at high speed and caught fire after the crash, killing the driver and destroying the front of the vehicle.
In a blog post on Friday, Tesla disclosed that its Autopilot partial self-driving system was engaged at the time of the crash. The acknowledgement prompted an inquiry by the National Transportation Safety Board, which sent two investigators to examine the matter. According to the vehicle’s logs, the driver’s hands were not detected on the steering wheel for six seconds before the impact. Tesla, however, did not explicitly say that its Autopilot system was at fault. The incident raises fresh questions about Autopilot, which was also engaged last year when a Model S sedan crashed into a truck, killing the Tesla driver.
“Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists,” Tesla said in its post, defending the system.
Though Tesla’s Autopilot is only a partial self-driving system, better understood as an enhanced cruise control, the idea of cars driving themselves has come into question, and user trust in autonomous-car technology is wavering. The technology deployed in the Tesla SUV was an advanced driver-assistance system capable of handling various aspects of a journey, such as lane keeping on highways. Tesla’s Chief Executive Officer Elon Musk has promoted the technology as being at the forefront of autonomous driving.
The technology’s failure, in Uber’s case as well, shows that human intervention is still needed. These incidents indicate that the technology is not yet ready for deployment on public roads, and an imperfect version of it should not be launched there. They also put regulators in question, as regulators granted permission to test self-driving cars on public roads. Regulators need to scrutinize every aspect of the technology before permitting such tests; otherwise, more lives will be lost to its failures. Though automakers say safety is their topmost criterion, some of these incidents suggest otherwise. So, instead of rushing to launch the technology as soon as possible, automakers need to ensure that every aspect works in accordance with their stated priority: safety.