Starlink and T-Mobile’s partnership will be revolutionary for cellular service, and Smarter AI CEO Chris Piche had some thoughts on how the new partnership will impact 5G capability for the automotive industry.
Chris, who has created services including AT&T TV, BBM Video, Poly Video, and STUN/TURN/ICE, shared his thoughts on the effect of 5G on vehicles and telecommunications in an interview with Teslarati.
AI CAMERAS, TESLA, STARLINK & AUTONOMOUS VEHICLES
Before founding Smarter AI, the Top 40 Under 40 entrepreneur’s previous company created a technology that BlackBerry licensed to enable voice and video calling. This gave Chris a front-row seat to the speed at which technology can transform markets.
Smarter AI is a software platform for artificial intelligence cameras.
“Smarter AI is to cameras as Android and iOS are to phones,” he told me. The company’s first vertical market is transportation, which covers vehicle camera systems such as dash cams as well as camera systems for larger vehicles.
“The connection here with Tesla, Starlink, and T-Mobile is all around autonomous transportation. Today’s autonomous transportation, whether it’s in a Tesla or another kind of vehicle, all relies on line-of-sight situational awareness. In Tesla’s case, they rely in some cases exclusively and in other cases primarily on cameras and computer vision to try to understand what’s happening around the car.”
“Many of their competitors use LiDAR and don’t rely on cameras. But in both cases, it’s all based on line of sight: what they can actually see in a straight line.”
SEEING BEYOND THE LINE OF SIGHT
Chris told me that one of the new technologies that Smarter AI and other companies are developing is called vehicle-to-vehicle (V2V) or vehicle-to-everything (V2X).
“These technologies enable cars to see beyond line of sight. Imagine you’re coming to an intersection and are planning to take a turn.”
Instead of waiting to see what’s ahead of you on the street you’re turning onto, the technology will tell you exactly what is ahead. There could be a stopped car, a pedestrian about to jaywalk, or some type of temporary obstruction that you are unaware of.
“Imagine if there was a camera system located at the intersection. Imagine that as your vehicle is approaching that intersection, your vehicle could communicate with the camera and the camera could tell your vehicle that there’s some sort of obstacle.”
An autonomous vehicle would use this information to determine whether or not it can make that turn. This technology, Chris told me, relies on high-capacity and high-availability communications networks such as 5G.
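As a rough illustration of the flow Chris describes, here is a minimal Python sketch: an intersection camera reports obstacles, and an approaching vehicle decides whether its planned turn is clear. The message fields, names, and distance threshold are hypothetical and are not drawn from any real V2X standard or from Smarter AI’s platform.

```python
from dataclasses import dataclass

@dataclass
class ObstacleAlert:
    """Hypothetical alert an intersection camera might broadcast."""
    intersection_id: str
    obstacle_type: str   # e.g. "stopped_vehicle", "pedestrian"
    distance_m: float    # distance past the intersection along the target street

def plan_turn(alerts: list[ObstacleAlert], intersection_id: str,
              clear_distance_m: float = 30.0) -> str:
    """Decide whether the vehicle can proceed with its planned turn.

    The vehicle proceeds only if no reported obstacle sits within the
    clearance window on the street it is turning onto.
    """
    for alert in alerts:
        if alert.intersection_id == intersection_id and alert.distance_m < clear_distance_m:
            return f"hold: {alert.obstacle_type} reported {alert.distance_m:.0f} m past the turn"
    return "proceed: turn path reported clear"

# Example: the camera at a hypothetical intersection reports a stopped car just past the turn.
alerts = [ObstacleAlert("5th-and-Main", "stopped_vehicle", 12.0)]
print(plan_turn(alerts, "5th-and-Main"))  # -> hold: stopped_vehicle reported 12 m past the turn
```

In a real deployment the alert would arrive over the cellular network, which is why the coverage and reliability of 5G matter so much to the approach.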
STARLINK & T-MOBILE’S PARTNERSHIP COULD HELP WITH THE CHALLENGES OF IMPLEMENTING V2V AND V2X
“One of the challenges with implementing technologies like V2V or V2X on top of 5G is that 5G deployments tend to be pretty good and getting better in large urban areas.”
5G is pretty spotty in Baton Rouge, and for me personally, 4G LTE works faster than 5G even though there’s a tower across the street from me. Chris, who is in Las Vegas, said that coverage is pretty good for a friend of his on AT&T; he isn’t on AT&T, and his coverage is about as spotty as mine.
“But this agreement with Starlink and T-Mobile has the promise or the potential to either eliminate or significantly reduce the spottiness in the 5G coverage and that will enable technologies that are designed on top of 5G such as V2V and V2X to work either more reliably in urban areas where 5G is already available but is a little bit spotty,” he said.
“It would also enable these technologies to work in other areas where there is no 5G. We think this is a really significant announcement in terms of the promise of autonomous transportation and bringing it much closer to being a reality.”
HOW V2V AND V2X COULD IMPROVE TESLA’S AUTOPILOT
Chris told me he’s been using Tesla’s Autopilot for around five years.
“It’s so good. It’s to the point that for the things it can see, it’s a way better driver than I am,” he said, adding that whenever he drives for more than a couple of minutes, he engages Autopilot. However, there are a couple of things it lacks.
“It can’t see that far ahead and it lacks context. Sometimes, if there’s a car making a turn in front of my car, the Autopilot won’t understand the context that maybe this other car is momentarily in front of mine. And if I was driving, I’d keep driving. I wouldn’t take my foot off the accelerator or slam on the brakes unless I could see that something was going wrong with the turn that the other car was making.”
One way to improve Autopilot is through V2V or V2X, Chris explained.
“In V2V, my car would talk to the car that’s making the turn in front of me and they would orchestrate the speed and direction of both of the cars so that the car in front of me could make its turn and my car could continue driving without slamming on the brakes.”
“With V2X, that would enable my car to talk to the cameras, traffic lights, and intersections to gain situational awareness about either other cars that aren’t equipped with the same technology or about other objects such as bicycles, pedestrians, or other obstacles on the street.”
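To make the V2V side of that concrete, here is a minimal sketch of the kind of speed orchestration Chris describes, assuming a hypothetical message in which the turning car shares how long it needs to clear the lane. The function and its parameters are illustrative only, not an actual V2V protocol or Tesla’s Autopilot logic.

```python
def coordinate_with_turning_car(my_speed_mps: float, gap_m: float,
                                turn_clear_time_s: float) -> float:
    """Advisory speed (m/s) for my car, given a V2V message from the car turning ahead.

    gap_m: distance to the point where the turning car crosses my lane
    turn_clear_time_s: time the turning car reports it needs to clear that point

    The idea from the interview: ease off just enough that the other car can
    finish its turn, instead of coasting or slamming on the brakes.
    """
    # At the advisory speed, my car reaches the conflict point no sooner than
    # the moment the turning car has cleared it.
    max_safe_speed = gap_m / turn_clear_time_s
    return min(my_speed_mps, max_safe_speed)

# Example: travelling 15 m/s, 40 m behind a car that says it needs 4 s to finish its turn.
print(coordinate_with_turning_car(15.0, 40.0, 4.0))  # -> 10.0 m/s advisory
```

The V2X case Chris mentions works the same way, except the information comes from roadside cameras and traffic lights rather than from the other vehicle, covering cars, bicycles, and pedestrians that aren’t equipped with the technology themselves.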