According to Ashok Elluswamy, Tesla’s director of Autopilot software, the car in Tesla’s much-hyped 2016 video of its Autopilot driver assistance system “driving by itself” wasn’t actually driving itself.
In a recent statement, Elluswamy said the video, titled “Full Self-Driving Hardware on All Teslas,” was intended to show “what was possible to build” into the system rather than what customers could actually expect from it.
In the video, which Tesla CEO Elon Musk linked to in a tweet saying “Tesla drives itself”, a Tesla is shown driving and parking itself, dodging obstacles and obeying red and green traffic lights. The video begins with a title card stating that “the person in the driver’s seat is only there for legal reasons” and that “he’s not doing anything. The car will drive itself.”
But according to Elluswamy, the demo was “specific to a predetermined route,” unlike the production version of the technology, which relied solely on input from cameras and sensors. “It used additional pre-mapped information to drive,” he said, after telling lawyers that the route the car followed had previously been mapped in 3D. At the time the video was made, Elluswamy was an engineer on the team that produced it.
In other words, Tesla’s Autopilot was not capable of dynamic route planning, but required the company’s engineers to map out the route it would take for the purposes of the promotional video.
The New York Times had previously reported the premapping, pointing out that consumers using the system would not have that luxury, but now we have it on the record from a Tesla official. The statement, which you can read in full below, was made as part of a lawsuit filed by the family of Wei “Walter” Huang, who died in 2018 when his Model X crashed into a highway barrier with Autopilot engaged.
Elluswamy also said that the version of Autopilot available when the video was produced was “incapable of handling traffic lights,” despite what the video shows. What isn’t clear is exactly how the video was made; Elluswamy says he can’t remember whether the person in the driver’s seat was controlling acceleration and braking or whether the car was. It is also not clear whether the car had software that could recognize traffic lights.
The confession isn’t the only part of Elluswamy’s testimony that raises eyebrows. Mahmood Hikmet, Head of Research and Development at Ohmio Automation, flagged parts of the transcript where Elluswamy said he knows nothing about basic safety considerations such as Operational Design Domain, otherwise known as ODD.
The phrase refers to the conditions, such as geography or weather, under which an autonomous vehicle is designed to operate. For example, if an autonomous vehicle can only drive in a certain city in ideal weather conditions, then a rainy day in another city falls outside its ODD.
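To make the idea concrete, here is a minimal, purely illustrative sketch of how a system might gate engagement on its ODD; the class, field names, and city/weather values below are hypothetical and are not drawn from Tesla, SAE, or any real autonomous-driving stack:

```python
from dataclasses import dataclass

# Purely illustrative ODD check; all names and values here are hypothetical.
@dataclass
class OperationalDesignDomain:
    allowed_cities: set[str]   # geography the system was validated for
    allowed_weather: set[str]  # weather conditions it may operate in

    def permits(self, city: str, weather: str) -> bool:
        """Return True only if the current conditions fall inside the ODD."""
        return city in self.allowed_cities and weather in self.allowed_weather

# Mirrors the prose example: validated for one city, clear weather only.
odd = OperationalDesignDomain(allowed_cities={"Palo Alto"}, allowed_weather={"clear"})

print(odd.permits("Palo Alto", "clear"))  # True  -> inside the ODD
print(odd.permits("Seattle", "rain"))     # False -> outside the ODD, do not engage
```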
While you wouldn’t expect the phrase to come up in everyday conversation or marketing materials, it’s certainly something you’d expect the person running the Autopilot program to know about. The Society of Automotive Engineers (SAE), the organization behind the levels of autonomy that Tesla itself has referenced, calls ODD “the key to the safety of autonomous vehicles,” and Waymo has published an entire study evaluating the performance of its software within a specific domain.
Musk seemed to show contempt for thinking about ODD during a podcast appearance with Lex Fridman. He said the acronym “sounds like ADD,” then answered a question about the philosophy behind Tesla’s expansive ODD (compared to other systems like GM’s Super Cruise, which will only work under certain circumstances) by saying it’s “pretty crazy” to let humans drive cars instead of machines.