Tesla reaffirms computer vision commitment despite regulatory encroachment

The news: At its “AI Day” event, Tesla announced that it is developing a “D1” chip for its Dojo supercomputer, reaffirming its commitment to using only computer vision, rather than lidar, for driverless vehicles.

  • The “D1” chip sifts through massive troves of data collected by Tesla’s camera sensors to train the company’s AV neural networks.
  • Tesla director Ganesh Venkataramanan said the company is moving towards owning and creating as much of its tech stack as possible.

More on this: More quixotically, Tesla also said it’s developing a bipedal humanoid AI robot codenamed “Optimus.” Tesla did not share a use case for the robot, but said it could perform tasks either unsafe or too “repetitive” for humans.

  • The company has experience in robotics, deploying self-navigating Autonomous Indoor Vehicles in its Gigafactory.
  • It’s part of a push to be seen as “much more than an electric car company,” per TechCrunch.

How we got here: In June of this year, Tesla announced it was building a supercomputer to train its AV neural networks.

The elephant in the room: Tesla’s commitment to computer vision-based autonomous driving occurred under the shadow of mounting regulatory and legal pressure.

  • Earlier this week, the National Highway Traffic Safety Administration (NHTSA) launched a probe into Tesla’s Autopilot driver assistance feature.
  • Just days later, two senators sent a letter to FTC Chair Lina Khan asking the agency to determine whether the company engaged in deceptive advertising of the feature.

What’s next: Tesla could be on a collision course with regulators over its decision to continue training its neural networks on open roads without broad public buy-in.

  • Safety advocates have sounded the alarm on Tesla’s Full Self-Driving (FSD) feature, saying it’s not ready for mass use and endangers others who have not consented to Tesla testing the feature on public roads.
  • Meanwhile, CEO Elon Musk has essentially confirmed these concerns, recently advising Tesla drivers to “be paranoid,” saying FSD “may do the wrong thing at the worst time.”

With all this in mind, the NHTSA may ramp up efforts to regulate where and how Tesla can test its new driver assistance updates, which could reduce the volume of raw data the company can collect to train its neural networks.
