Tesla, founded in 2003, has become the benchmark in the production of viable electric vehicles. The rollout of the Autopilot feature to many Tesla cars worldwide, which essentially allows the cars to drive themselves, has generated multiple headlines, with automotive regulators and governments worldwide questioning how safe and mature the technology is. "Can a car really be trusted to drive itself?" they ask.
However, in recent months, a number of accidents involving Teslas in Autopilot mode (and even some that weren't) have also made the news, casting widespread doubt on whether this technology should have been rolled out in the first place. Tesla describes the technology as "beta", that is, not fully complete and still evolving, and users must accept this via a warning message when they first activate the system. Critically, the message warns users that "you need to maintain control and responsibility of your vehicle while enjoying the convenience of Autosteer". The system also continuously monitors for the presence of the driver's hands on the steering wheel, and will slow the car down, after an audible warning, if the driver goes hands-free.
In essence, whilst the system may be able to function relatively autonomously in the right conditions, ultimate responsibility and control remain with the driver at all times, who can override the system simply by resuming normal driving. Joshua Brown, who was sadly killed in a car accident on 7th May 2016 whilst relying on Autopilot, would likely have been able to prevent the collision had his attention been fully focused on the road ahead, in the same way that any driver using standard Cruise Control is expected to stay alert and take preventative action before their vehicle collides with anything.
What we must remember is that Tesla's Autopilot feature is nothing more than glorified Cruise Control, and that whilst we can label the functionality as "semi-autonomous", the car is by no means entirely self-driving.
Therefore we cannot pin liability for such accidents solely on Tesla. No self-driving system that exists today is free of flaws, and many only work in the right conditions. Autopilot, for instance, will only work where the lane markings are clear and visible, and should not be used on any route with sharp turns. Even Google's self-driving cars, which are fundamentally aimed at being entirely driverless, have limitations and have likewise been involved in road accidents.
It will take decades to establish whether or not self-driving or semi-autonomous vehicles are truly safer than those piloted manually by us. In the meantime, some attention should be drawn to the fact that only a very small number of Tesla vehicles have been involved in Autopilot accidents, fatal or otherwise, compared with the average of five people who die each day in road accidents on UK roads in manually controlled vehicles.