Tesla drops radar for Autopilot in Model 3, Model Y.

The interior of a Tesla Model S is shown in autopilot mode in San Francisco, California, the United States, on April 7, 2016.

Alexandria Sage | Reuters

Tesla announced Tuesday that it is dropping radar for its driver assistance features like autopilot.

In a blog post titled “Transitioning to Tesla Vision,” the company said its best-selling Model 3 and Model Y vehicles, built for customers in the US and Canada starting this month, would instead use a camera-based system to enable Autopilot functions such as traffic-aware cruise control and automatic lane keeping.

Radar sensors are relatively expensive, and processing their data requires significant computing power in a vehicle. Tesla previously told shareholders that “a vision-only system is ultimately all that is needed for full autonomy,” and that it planned to transition the US market to Tesla Vision. CEO Elon Musk also said in a tweet earlier this year that the company would move to a “pure vision” approach.

Tesla said these will be the first Tesla vehicles to rely on camera vision and neural network processing to provide “autopilot, full self-driving and certain active safety features.”

The company also warned that Autopilot and FSD systems would be less capable during this period of engineering adjustments.

“During this transition, vehicles with Tesla Vision may briefly ship with some features that are temporarily limited or inactive, including: Autosteer will be limited to a maximum speed of 75 mph and a longer minimum following distance. Smart Summon (if equipped) and Emergency Lane Departure Avoidance may be disabled at delivery.”

Customers who have already ordered a Model 3 or Model Y but were not aware of this change will be informed before they accept delivery of their vehicles.

All new Tesla vehicles come with a standard set of advanced driver assistance system (ADAS) features known as Autopilot.

Tesla also sells a $10,000 premium software package marketed as “Full Self-Driving,” or FSD. Tesla offers selected drivers early access to a beta version of FSD, effectively turning thousands of customers into software testers on US public roads.

According to the company’s website, Autopilot currently enables a Tesla vehicle to “automatically steer, accelerate, and brake within its lane,” and FSD adds features such as automatic lane changes and Summon, which lets a driver call the car to come pick them up, maneuvering it through a parking lot via the Tesla app like a remote control.

Tesla advises in its owner’s manual and on its website that Autopilot and FSD require active driver supervision. However, some drivers mistakenly believe a Tesla is safe to operate hands-free, while asleep at the wheel, or even from the back seat.

A Tesla owner who posted social media videos of himself riding on Autopilot with no hands on the steering wheel died in a collision in Southern California earlier this month. Another was arrested by the California Highway Patrol after repeatedly riding in the back seat of his Tesla on public highways with no one behind the wheel.

Most use radar and lidar

Other companies are taking a different approach to developing, introducing, and marketing automated driving systems. GM-backed Cruise, Alphabet’s Waymo, Aurora, and others have incorporated radar and lidar into their systems alongside cameras.

While cameras record video that human analysts can label and machine learning software can interpret, radar and lidar sensors provide additional data that cars can use to more robustly detect and avoid obstacles on the road, especially when visibility is poor, such as at night or in bad weather.

Musk has called lidar a “crutch” and a “fool’s errand,” saying it is too expensive and difficult to use. But he has not dismissed radar as completely.

Tesla intends to keep radar in its more expensive Model S and Model X vehicles, as well as in Model 3 and Model Y vehicles made in China or shipped to markets outside North America.

According to Phil Koopman, CTO of Edge Case Research and professor of electrical and computer engineering at Carnegie Mellon University, Tesla may be able to offer some functionality via vision alone today, but could need to reintroduce radar later to deliver more advanced automated features.

“The sensors used for an SAE Level 2 system (in which a human driver is responsible for monitoring safety at all times) are at the manufacturer’s discretion. So they may be able to deliver at least some features with cameras alone, with the caveat that the human driver is responsible for handling whatever the cameras cannot,” Koopman said.

“Tesla’s features are currently limited to SAE Level 2. If Tesla wants to reach SAE Level 4 (an automated vehicle that does not rely on a human driver to monitor safety, which it does not currently offer), it would be well advised to use every type of sensor it can get, including cameras, radar, lidar, and possibly others.”
