In theory, we are getting closer and closer to a wide release of Tesla’s beta feature-complete “Full Self Driving” package. If you’re familiar with all of the background on this, skip to the third paragraph. If not, here’s the short story: Tesla has been working for years to expand its vehicles’ autonomous driving capabilities. From not running over cats & dogs, to being able to change lanes on an Interstate highway without playing bumper cars, to being able to navigate through parking lots solo, Tesla cars have been getting more and more capable of autonomous driving. Two years ago, CEO Elon Musk indicated that Tesla vehicles with the “Full Self Driving” (FSD) package would have “feature complete” autonomous driving abilities by the end of that year (2019). They wouldn’t yet be ready to roam the streets as robotaxis, but they’d be able to drive from door to door on their own — with human supervision. It turns out, though, that Tesla’s method at the time had some software ceilings that just weren’t going to make that practical.
Tesla implemented a thorough rewrite/restart of its software approach to autonomous driving that took several months, if not closer to a year. And then more hurdles popped up. In order to get to the level Tesla needed for a wide release of the feature-complete FSD package, as Elon recently put it, it’s been a matter of two steps forward, one step back — over and over again. The running joke among many Tesla fans is that wide release of the door-to-door FSD features has been ~2 weeks away for a year, or nearly two years now. At the moment, based on the latest update, it’s ~3 weeks away. In the meantime, a few thousand special Tesla owners have had the capabilities in their cars for the past several months and have been sending feedback to Tesla to help it refine the system. With each update, some of them have been posting videos on YouTube and Twitter. That’s where we can bring everyone else back in.
A couple of weeks ago, Tesla rolled out the v9.2 update of its FSD Beta system. Curious about how the self-driving tech is coming along, inspired by AI Day, and eager to potentially get the door-to-door FSD features in my Tesla Model 3 in a few weeks, I’ve been watching some of the recent videos. Below are a few good ones, with some notes underneath them for anyone who is morally opposed to clicking play on 15-minute YouTube videos or even a sped-up 2:19 Twitter video.
In this first one, Chuck Cook starts off by tackling one of the biggest challenges he’s seen FSD struggle with. He tries to get his Model 3 to take a left turn across three lanes of traffic, through an opening in the median, and into the traffic flowing left. The car doesn’t feel comfortable trying this on two attempts and decides to turn right instead. I can understand deciding to skip that turn — I would typically avoid making a turn like that — but the thing I found odd was that after making the right turn, instead of getting over into the left lane to turn around and get back on the desired route, the car reroutes into a loop that would just put it right back at the spot where it failed to make the unprotected left turn. Odd.
Chuck then goes and finds a similar spot, where he says the visibility is a bit better. He gets the car to make the left turn after much hesitation, but he has to tap the accelerator a couple of times to give it the encouragement to go. He goes back to the spot again and the car makes the turn without any nudging when it sees a nice opening. He also indicates in the video that he likes how quickly the car accelerates once it makes the turn, but he wonders what the car would do if there were traffic on the far side of the median but not before it — whether the car would cross over and wait in the median area until there was an opening on the other side, or whether it would just decide the turn was too challenging and avoid crossing altogether. On the third attempt at the same spot, the car again has an opening in both directions and makes the turn.
Next up, Chuck decides to see if the car will safely leave this partially divided highway and turn left across three lanes of traffic. It’s drizzling a bit and the car starts driving quickly into the turn. It doesn’t cross any lines, but it is moving quite fast and there’s fast oncoming traffic. Chuck doesn’t like what he’s seeing and decides to intervene to make sure the car doesn’t drive into traffic. (I would have done the same.) Even if Chuck hadn’t intervened and the car had stopped itself before causing an accident, it seems that it would have stopped too suddenly for the driver’s/passengers’ comfort. He goes back to make the attempt again and the car waits for traffic and makes the turn pretty much perfectly. In a third attempt, the car goes back to the tendency it had in the first scenario and seems intent on zipping across the lanes when it’s not safe to do so, then it apparently sees the oncoming vehicles and makes a sudden stop before crossing any lane markings. That maneuver freaks out an oncoming pickup driver enough that the truck quickly starts swerving into the lane to its right, then swings back once the driver sees the Tesla stop. Did the pickup driver check carefully before starting to move into that lane? I hope so, but the whole segment comes across as a bit sketchy.
Then Chuck returns to the unprotected left turn attempts. Visibility is bad on the left, but a big gap eventually appears and the car cautiously starts to make the turn. Lines on the touchscreen indicate it’s going to turn left, but then the car gets a bit spooked (presumably about cars coming from the right) and decides to bail on the navigation’s plan and turn right instead. Whoops. Chuck believes the system thinks it needs clear traffic in both directions to make these turns, even though it could stop in the middle. Perhaps it just sees that middle space as inadequate, but it’s not clear exactly why it won’t go sit in that median area and wait for a second opening. In yet another attempt, the car waits too long and traffic builds up behind Chuck, so when he sees an opening, he taps the accelerator, which encourages the car to go ahead and make its move. The car then executes the move perfectly — the maneuver through the median as well as the acceleration onto the highway once it reaches its lane.
The car then makes a nice, smooth left turn off the highway across three lanes of traffic going in the opposite direction. And then it executes that same turn well again. The third time, it makes the turn but in a clumsy stop-and-start way, a bit like a teenager learning to drive while making their parents sweat like a pro tennis player at the US Open.
In this video, published the same day as the first video but recorded a little earlier, the first big highlight is that Chuck says the car accelerates much better when getting onto a particular highway than it did before. That makes him very happy.
The next tricky scenario he tests is a red light at a bit of a messy intersection, where a green right-turn light previously convinced his car to go through the red even though it shouldn’t have. With the v9.2 update, the car better distinguishes between the lights and stops at the red even as the right-turn light is green.
Okay, there’s another 20 minutes in the video, but this stuff is really better watched than read about, so jump into the video and watch a bit!
This last short snippet is sped up to keep it brief, but you can see in the tweet that the owner’s Model 3 went from San Francisco to SFO with just one disengagement. Not too shabby. Not quite robotaxi ready, as Waymo’s vehicles apparently are in San Francisco, but very impressive.