I hate to break it to the lowest common denominator of TikTok, but drunk-driving and open-container laws still apply even if Autopilot takes the wheel…with no one behind the wheel. Thanks for posting the evidence on the internet, though!
That’s bad enough, but they one-upped their own dumb behavior by leaving the driver seat empty as the car went 65 mph down the highway. Yikes.
We can’t embed it on our site, but you can view the full video on TMZ here, which has the bonus of not giving the original evidence video the attention-clicks. That’s what this is to The Man, you know: evidence.
Other TikTokers thankfully know better and are giving this the extreme side-eye it clearly deserves thanks to the app’s split-screen “duet” feature. A personal injury lawyer, Attorney Tom, chimed in to say that if you get in an accident while using Autopilot, you will be sued anyway. (Tom’s stare pretty much said it all.) Some joked that the dudes roped in a recently deceased celebrity like Kobe Bryant or Juice Wrld to drive for them. Others just posted the obvious: a photo of a cop staring at the screen, and a photo of an article about a guy who got arrested for drunk driving after falling asleep with Autopilot on.
“Natural selection,” commented Billy Mueller on another TikTok roasting the drunk-ghost-ride-the-Tesla vid. “This is how God makes us smarter as a species. People drink White Claw and let Elon take the wheel.”
Others on TikTok simply wondered who would get the ticket if they got pulled over, but the answer is clearly “everybody,” thanks to laws in the U.S. against open containers of alcohol in the car.
This isn’t the first time blurrblake has posted reckless behavior with the Tesla on his TikTok account, which now appears to have changed its name to blurr.tv. He has another video up showing a teddy bear behind the wheel with a dude reclining in the front passenger seat. “#viral!”
Someone’s mom needs to take Blake’s toys away for a while.
You shouldn’t need other TikTokers to tell you this, because Tesla has said it themselves: Autopilot, despite a name that implies otherwise, is a driver-assist feature more than anything, and it requires a person behind the wheel while it’s in use who can take over if Autopilot malfunctions or makes a mistake.
That warning doesn’t just apply to Autopilot. There are no actual self-driving systems on the road today. They all require you to pay attention, and you will get in big legal trouble if you’re caught not paying attention, much less with an empty driver’s seat and a bunch of open hard seltzers. (Sheesh, both their judgment AND their taste in booze are terrible.)
Unfortunately for the responsible drivers out there, this kind of reckless behavior puts everyone on the road in danger. There have already been many crashes where drivers put too much faith in Autopilot’s driving skills. Please, just call someone sober (an actual, live person who isn’t Kobe, Jesus, or Elon’s army of programmers) to drive you home.
Reported by TMZ.
Want to buy a Tesla Model 3, Model Y, Model S, or Model X? Feel free to use my referral code to get some free Supercharging miles with your purchase: http://ts.la/guanyu3423
You can also get a $100 discount on Tesla Solar with that code. Let’s help accelerate the advent of a sustainable future.