Incident 145: Tesla's Autopilot Misidentified the Moon as a Yellow Traffic Light

Description: A Tesla owner's video showed Autopilot mistaking the moon for a yellow traffic light, allegedly causing the vehicle to repeatedly slow down.
Alleged: Tesla developed and deployed an AI system, which harmed Tesla drivers.

Suggested citation format

Lam, Khoa. (2021-07-23) Incident Number 145. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
145
Report Count
3
Incident Date
2021-07-23
Editors
Sean McGregor, Khoa Lam

Incident Reports

Hey @elonmusk you might want to have your team look into the moon tricking the autopilot system. The car thinks the moon is a yellow traffic light and wanted to keep slowing down. 🤦🏼 @Teslarati @teslaownersSV @TeslaJoy

Video showing the moon interpreted as a yellow traffic light

A viral tweet showed a Tesla mistaking the moon for a yellow traffic light.

The company's Full Self-Driving tech has made similar mistakes before.

Owners say their cars have been fooled by billboards and Burger King signs.

Tesla CEO Elon Musk has been promising for years now that fully autonomous vehicles are just over the horizon.

But his company's Full Self-Driving technology still doesn't live up to its name. And despite Musk's goal to eventually make the system safer than a human driver, it currently fails at things any human driver wouldn't think twice about.

For example, one Tesla owner posted a clip to Twitter on Thursday that showed his vehicle mistaking the moon for a traffic light over and over again as he cruised down the highway.

The clip shows a moon that's unusually yellow and low in the sky, so one can see why the car might register it as a yellow light and apply the brakes. Another owner replied with their own video showing the same scenario.

Still, the moon is a constant — not an extreme case — and it's something any self-driving system worth its salt should identify as being hundreds of thousands of miles away, not just up ahead. Plus, having a car slow down unexpectedly at highway speeds could create a dangerous situation for the driver and surrounding traffic.
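
For a sense of why a vision system might make this mistake, here is a minimal toy sketch of a naive color-and-shape heuristic that would flag a low, yellow moon the same way it flags an amber signal. To be clear, this is not Tesla's method; Tesla's perception stack is a trained neural network, and the HSV thresholds, upper-image crop, and circularity cutoff below are illustrative assumptions only.

```python
# Toy illustration only -- NOT Tesla's perception stack.
# A naive detector that keys on "round yellow blob in the upper part of
# the frame" cannot tell an amber signal from a low moon.
import cv2
import numpy as np

def naive_yellow_light_detector(frame_bgr: np.ndarray) -> bool:
    """Return True if a round yellow blob sits in the upper half of the frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough hue/saturation/value band for an amber lamp (assumed values).
    mask = cv2.inRange(hsv, (15, 80, 150), (35, 255, 255))
    # Traffic signals hang overhead, so ignore the lower half of the image --
    # but the upper half is also where a moon low on the horizon appears.
    mask[mask.shape[0] // 2 :, :] = 0
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        area = cv2.contourArea(contour)
        if area < 50:  # skip tiny specks of noise
            continue
        perimeter = cv2.arcLength(contour, True)
        circularity = 4 * np.pi * area / perimeter**2  # 1.0 for a perfect circle
        if circularity > 0.8:  # round enough: a lamp, or the moon
            return True
    return False
```

A heuristic like this has no notion of distance, which is the point of the criticism above: a depth cue such as stereo or motion parallax would report the moon at effectively infinite range and rule it out immediately.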

To be fair, Tesla doesn't claim that Full Self-Driving can perfectly react to traffic lights and stop signs yet. The "Traffic Light and Stop Sign Control" feature, released via a software update in early 2020, is still in beta, and owners have to switch it on manually.

But it's not just the moon that's fooling Teslas. Owners have also reported their vehicles mistaking the sun for a red light. And one odd clip shows a Tesla get bamboozled by a truck hauling traffic lights.

In April, a Tesla owner noticed his car would always come to a full stop in the same spot in the middle of a road. A video he posted shows his vehicle mistaking an image of a stop sign on a billboard for the real thing.

Soon after Tesla launched the traffic light-sensing feature in April 2020, one Tesla owner posted a video of his car mistaking a roadside Burger King sign for a stop sign. The car begins to slow down on the 40-mph road before realizing there's no stop sign and carrying on.

Tesla solved the Burger King bug in a subsequent software update. Still, the fast food chain attempted to capitalize on the glitch by launching a free Whopper promotion for any Tesla owner whose car accidentally stopped at a Burger King sign.

In July, the carmaker launched a subscription option for Full Self-Driving, knocking its price from a $10,000 upfront payment to as little as $99 per month. But, as these bugs show, the system doesn't make cars autonomous and it's still not clear when it will.
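
As rough context for those reported prices, here is a back-of-the-envelope comparison, assuming both figures stay constant and ignoring taxes, financing, and later price changes:

```python
# Hypothetical break-even comparison of the two reported prices.
upfront_usd = 10_000  # one-time "full self-driving" purchase price
monthly_usd = 99      # lowest reported subscription tier at launch
months_to_match = upfront_usd / monthly_usd
print(f"Subscribing costs less for the first ~{months_to_match:.0f} months "
      f"(about {months_to_match / 12:.1f} years) of use.")
# ~101 months, i.e. roughly 8.4 years before the subscription costs more
```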

In the meantime, it's arguably safer for Teslas to recognize too many traffic lights and stop signs than too few.

Tesla's Full Self-Driving tech keeps getting fooled by the moon, billboards, and Burger King signs

Washington, DC (CNN) When a dozen small children crossed in front of our Tesla with "full self-driving," I had good reason to be nervous.

I'd spent my morning so far in the backseat of the Model 3 using "full self-driving," the system that Tesla says will change the world by enabling safe and reliable autonomous vehicles. I'd watched the software nearly crash into a construction site, try to turn into a stopped truck and attempt to drive down the wrong side of the road. Angry drivers blared their horns as the system hesitated, sometimes right in the middle of an intersection. (We had an attentive human driver behind the wheel during all of our tests, to take full control when needed.)

The Model 3's "full self-driving" needed plenty of human interventions to protect us and everyone else on the road. Sometimes that meant tapping the brake to turn off the software, so that it wouldn't try to drive around a car in front of us. Other times we quickly jerked the wheel to avoid a crash. (Tesla tells drivers to pay constant attention to the road, and be prepared to act immediately.)

I hoped the car wouldn't make any more stupid mistakes. After what felt like an eternity the kids finished crossing. I exhaled.

We were clear to make our turn. The car seemed overly hesitant initially, but then I noticed a bicyclist coming from our left. We waited.

Once the bicyclist crossed the intersection, the car pulled up and made a smooth turn.

Over the past year I've watched more than a hundred videos of Tesla owners using "full self-driving" technology, and I've spoken to many of them about their experiences.

"Full self-driving" is a $10,000 driver-assist feature offered by Tesla. While all new Teslas are capable of using the "full self-driving" software, buyers must opt into the costly addition if they want to access the feature. The software is still in Beta and is currently available to only a select group of Tesla owners, though CEO Elon Musk has claimed that a wider rollout is imminent. Musk promises "full self-driving" will be totally capable of getting a car to its destination in the near future.

But it doesn't do that. Far from it.

Tesla owners have described the technology as impressive but also flawed. One moment it's driving perfectly, the next moment it nearly crashes into something.

Jason Tallman, a Tesla owner who documents his "full self-driving" trips on YouTube, offered to let me experience it first-hand.

We asked Jason to meet us on Brooklyn's Flatbush Avenue. It's an urban artery that funnels thousands of cars, trucks, cyclists and pedestrians into Manhattan. For even experienced human drivers, it can be a challenge.

City driving is chaotic, with vehicles running red lights and pedestrians on nearly every block. It's a far cry from the suburban neighborhoods and predictable highways around Tesla's California offices, or the broad streets of Arizona, where Alphabet's Waymo operates fully autonomous vehicles.

Cruise, GM's self-driving company, recently completed its first fully autonomous rides in San Francisco. But they were conducted after 11 p.m., when traffic is light and few pedestrians or cyclists are present.

Brooklyn offered us a chance to see how close Tesla's autonomous driving software was to replacing human drivers. It's the sort of place where humans drive because they have to, not a proving ground chosen for its convenience to a corporate headquarters. It's where self-driving cars might have the biggest impact.

At one point we were cruising along in the right lane of Flatbush. A construction site loomed ahead. The car continued full speed ahead toward a row of metal fencing.

I felt deja vu as I recalled a video in which a Tesla owner slammed on the brakes after his car appeared set on crashing headlong into a construction site.

But this time I was sitting in the back seat. I instinctively threw up my right arm like the Heisman Trophy, as if to protect myself in a collision.

That was a moment when I wished "full self-driving" would be quicker to change lanes. In other cases, I wished it would chill out on its aggressive turns.

"Full self-driving" sometimes makes jerky turns. The wheel starts to turn, but then shifts back, before again turning in its intended direction. The staggered turns generally don't seem to be a bother on sweeping suburban curves, but in a dense city largely built before cars, it's uncomfortable.

There's also the braking, which can feel random. At one point a car came close to rear-ending us after our Tesla braked in a way that surprised me. Getting honked at was common. I never quite felt like I knew what "full self-driving" would do next. Asking "full self-driving" to navigate Brooklyn felt like asking a student driver to take on a road test they weren't yet ready for.

What "full self-driving" could do well was impressive, but the experience was ultimately unnerving. I can't imagine using "full self-driving" regularly in a city. I noticed I was reluctant to ever look down at the Model 3's dashboard, such as for checking our speed, because I didn't want to take my eyes off the road.

Tesla owners routinely tell me how Autopilot, the highway-focused predecessor to "full self-driving," makes their trips less stressful. They arrive at destinations feeling less fatigued. Some have told me they're more likely to go on long road trips because of Autopilot.

But "full self-driving" felt like the inverse. I felt like I needed to be constantly on guard to prevent the car from doing something wrong.

Ultimately, seeing "full self-driving" in Brooklyn reminded me of the importance of the finer points of driving, which are tough for an AI-powered car to master. Things like pulling slightly into the intersection on a narrow road to make a left turn, so traffic behind you has room to pull around. "Full self-driving" just sat in place as frustrated drivers behind us honked.

For now, "full self-driving" seems closer to a party trick to show friends than a must-have feature.

We tried Tesla's 'full self-driving.' Here's what happened
