This is why Elon Musk is right about Tesla's 'ChatGPT moment'

Tesla Model 3

Are we just a year away from 3 million fully autonomous, self-driving cars joining us on highways, city streets, and suburban roads? If you believe Tesla CEO Elon Musk, absolutely.

Musk told CNBC in a recent post-earnings interview that "I think Tesla will have sort of a ChatGPT moment maybe if not this year, I’d say no later than next year."

And that moment could lead to "suddenly 3 million cars will be able to drive themselves with no one," said Musk.

Those cars, the roughly 3 million Teslas currently on the road worldwide, all running AI trained by Tesla's Dojo supercomputer, would then make the leap from what is arguably Level 2 or 3 autonomy to Level 5: hands-free, sit-in-the-back-of-the-vehicle-and-let-it-do-the-work driving.

Always the optimist

Granted, the self-described "pathological optimist" Musk has been promising full autonomy via a software upgrade for years. Still, there is something to the concept of a ChatGPT moment.

AI development has this habit of going slow and then experiencing a rapid acceleration. ChatGPT's transformation from a small and maybe poorly understood OpenAI project into an online sensation will not be the last time AI makes that kind of fundamental leap.

Tesla's work in AI, mostly used to train its self-driving systems on an untold number of driving situations, has produced one of the most advanced autonomous systems on the road. A Tesla can use Autopilot to drive on a highway and, according to Musk, navigate a busy city like San Francisco. As the name suggests, Summon lets you call the car to you from a relatively short distance (probably your phone's Bluetooth range). Teslas also use AI for battery management and for tying Autopilot to the navigation system.

In other words, most Teslas are well on their way to full autonomy. However, while the EVs seem well-equipped to do all the driving for you, they may be more like an over-eager teenager who insists they're ready to drive alone but lacks the defensive driving skills that come with time and experience.

Is it safe?

Earlier this year, the National Highway Traffic Safety Administration (NHTSA) announced a recall of more than 350,000 Teslas, warning that the Full Self-Driving Beta system might pose a safety risk. From the recall report:

"...with FSD [Full Self-Driving] Beta engaged, certain driving maneuvers could potentially infringe upon local traffic laws or customs, which could increase the risk of a collision if the driver does not intervene."

A recall, by the way, does not mean that anyone is bringing their EVs back to Tesla. In fact, the fix is another software update. From the recall report: "Tesla will deploy an over-the-air (“OTA”) software update at no cost to the customer. The OTA update, which we expect to deploy in the coming weeks, will improve how FSD Beta negotiates certain driving maneuvers during the conditions described above."

That's the thing about these autonomous systems. Most semi-autonomous vehicles have all the hardware necessary to drive without human intervention. It's a combination of software, self-imposed limits, and local laws that keep Teslas and other self-driving cars from driving themselves.

I have no doubt that the software is quite close to delivering the FSD Musk has been promising, but I can also see the need for a true ChatGPT moment, one that adds an almost human-like contextual awareness to self-driving automobiles. That, as far as I can tell, is the difference between you driving and an AI doing it.

Autonomous driving systems can be fooled by bikes, cones, and overly reflective semi-trucks. Humans who are paying attention aren't fooled. More training will help, but the reality is that a successful FSD Tesla has to think like a person, or at least a person who is trained to drive carefully. Tesla's ChatGPT moment could be the trigger that makes it all possible. Or it could be just another empty Elon Musk promise.