Introduction

In recent months, Tesla has been offering a free, one-month trial of their full self-driving (FSD) system to all current owners. It looks like they are rolling this out in stages, because I only got mine a few days ago.

I’ve had a Model Y for more than 3 years now, well before Elon revealed himself as the kind of person he really is, and I’ve been happy with it. The odometer is now well above 50,000 miles, a significant part of which was spent on I-80 driving between the SF South Bay and the Lake Tahoe area.

For long-distance interstate driving, autopilot (in other words: lane centering and adaptive cruise control) has been amazing. I use it all the time, and have little to complain about. Early on, I had one case where the autopilot started slowing down for no good reason, but since I distrust these kinds of systems, and since phantom braking has been reported quite a bit in the press, I try to pay attention to what the car is doing at all times. I immediately pressed the accelerator, and that was that.

I don’t know how prevalent phantom braking really is. One time is still too many, and disconcerting. It doesn’t help that you can’t anticipate it; there must be a bunch of different factors that trigger it: version of the car, weather, light conditions, etc. All I can say, after so many miles, is that autopilot has been amazing for me.

When I buy my next car, my requirements will be simple: I want an EV, an extensive charger network along I-80, and an autosteer system that’s at least as good as what I have today. Let’s hope there’ll be decent Tesla alternatives by then.

But let’s get to FSD.

During his last financial conference call, Musk claimed that he wants to focus more on robo-taxis. With no driver in the car at all, such a system had better be pretty much flawless. But given the YouTube videos that are out there, often posted by Tesla fans, showing the system making ridiculous errors, I highly doubt that it is close to ready. I would never pay for FSD: not only do I not trust it, I also don’t really see the point. But with a free trial, I couldn’t resist checking it out.

Here are my impressions.

Rules of Engagement

During 3 short tests, I watched FSD the way a helicopter parent watches a toddler exploring the world for the first time: let it do what it wants to do, but intervene the moment you feel things are not going the way you’d like.

It’s common on social media to see comments like this: “if the driver had waited a bit longer, FSD would still have corrected itself.” I’m having none of that. The moment I sense it’s about to do something, anything, wrong, I intervene.

While it’s possible that benign cases get dinged as interventions, I don’t think anything I describe below can be considered as such. They were real mistakes that should never have happened.

Test Ride 1: from Kings Beach to Truckee (11 miles)

I first switched on FSD for an 11-mile drive from a mountain biking trailhead in Kings Beach to the I-80 entrance in Truckee, with a stop at a gas station to get some snacks.

Map from Kings Beach to Truckee (click to open in Google Maps)

This is not a complicated task. Other than the gas stop, it’s just driving straight along State Route 267 with 3 traffic lights. What could possibly go wrong? Well, FSD managed to make 2 mistakes.

Mistake 1: selecting the wrong exit lane

For the first mistake, instead of turning right at the gas station, FSD decided to prepare to exit one street early: it switched on its indicator and started moving into the right exit lane. Note that there is no way to get to the gas station through that first exit.

Mistake one: go right too early

Mistake one: streetview

If I hadn’t immediately interrupted that maneuver (see Rules of Engagement), I assume it would have corrected itself eventually and gone back onto the main lane. But if I had been the driver behind, I’d have questioned the antics of the driver in front of me.

FSD managed to screw up its very first maneuver. Not a good look.

Mistake 2: selecting the right turn lane when going straight

The second mistake happened less than a mile later.

At the intersection with Old Brockway Rd, the car was supposed to continue straight. There are 3 lanes at the traffic light: left, middle, and right, and only the middle lane can be used to go straight.

Mistake two: streetview

For whatever reason, FSD initiated a move to go to the right lane. Another case where I’m sure it would have corrected itself eventually, but it’s clear that the system had no clue about the traffic situation in front of it.

While neither case was a life-or-death situation, it’s truly impressive that FSD managed to make 2 easily avoidable mistakes within my first 10 miles of using it!

Test Ride 2: I-80 from Truckee to Blue Canyon (36 miles)

For the second test, my wife reluctantly gave me permission to try out FSD for interstate driving, which should be its best-case scenario.

Map from Truckee to Blue Canyon (click to open in Google Maps)

It’s a bit disconcerting to see the car make a decision to change lanes to pass someone, but I guess that’s something you’ll get used to.

But what was baffling was how it behaved worse than autopilot. There were two nearly identical cases where the 2-lane road was dead straight, with excellent lane markings and cars to my right, yet FSD made nervous left-right oscillating corrections that I have never experienced in autopilot mode. It was not a case of FSD wanting to change lanes; the right indicator was never switched on.

Mistake three: streetview

The first time, my wife questioned what was going on. The second time, on a section just past the Whitmore Caltrans station near Alta, she ordered me to switch off FSD. In the past 3 years, she never once asked me to switch off autopilot.

One would think that autopilot and FSD share the same core lane-tracking algorithms, but one way or another the experience was radically different. I switched back to autopilot. The remaining 3 hours were uneventful.

Test Ride 3: from West Valley College to Route 85 Entrance (1 mile)

The final test happened yesterday, while driving home from the Silicon Valley Electronics Flea Market.

These are always held on a Sunday and start very early, at 6am; I’m usually out before 9am, so there’s almost nobody on the road.

Map from West Valley College to Route 85 (click to open in Google Maps)

FSD managed to turn right out of the parking lot just fine and get past the first traffic light.

Mistake four: streetview

The second traffic light has a no-turn-on-red sign. The light was red, the Tesla came to a full stop, and then it pressed on the accelerator to move on while the light was still red. (According to my colleague, police often lie in wait at this location to catch violators.)

By then I fully expected it to make that mistake, so I was ready to press the brake.

Conclusion

The way it currently behaves, FSD is a system that can’t be trusted to make the right decisions. It makes the most basic mistakes and it makes many of them.

Without FSD, you pay attention to the road and everything else is within your control. With FSD, you still need to pay attention, but now there’s the additional cognitive load of monitoring an unpredictable system over which you don’t have direct control. Forget about just being focused: you need to be hyper-focused, and you need to pay $99 per month or a one-time fee of $12,000 for the privilege. With the limited functionality of autopilot, you hit the sweet spot: adaptive cruise control and lane centering work reliably, and you don’t need to worry about any other mischief.

Maybe one day I’ll be able to drive to Lake Tahoe by typing in the address and then sitting back to take a nap or play on my phone. Until then, FSD is just a fancy technology demo with little practical value.