Tesla in May began delivering Model 3 and Model Y vehicles equipped with a driver-assist system that relies on eight cameras mounted around the car, with no radar. The cameras, like eyes, send images to neural networks which, like a brain, recognize and analyse objects. Tesla’s view of radar has shifted over the years.
In May 2016, a Tesla car crashed, killing the driver, when Autopilot failed to detect a white semi-truck crossing in front of it.
Later that year, Tesla announced plans to give radar a primary role in navigation, while acknowledging a false-alarm problem with some radar systems that needed to be fixed.
“Good thing about radar is that, unlike lidar … it can see through rain, snow, fog and dust,” Musk tweeted in 2016. Tesla also said radar “plays an essential role in detecting and responding to forward objects.” Tesla does not use lidar, a more expensive sensor that captures an object’s shape more precisely than radar does.
Tesla drivers complained of “phantom braking,” in which their cars stopped abruptly on highways while passing under an overpass or a bridge.
Musk said the new camera-only system will likely be safer than one that relies on radar because it has less “noise,” or fewer confusing signals, industry news site Electrek reported.
Since the May 2016 crash, Tesla vehicles have been involved in similar accidents, striking semi-trucks as well as stationary police cars and fire trucks. The National Highway Traffic Safety Administration is currently investigating 24 accidents involving Tesla cars.
HOW DO OTHER SELF-DRIVING TECHNOLOGIES WORK?
Most automakers and self-driving vehicle companies such as Alphabet Inc’s Waymo use three types of sensors: cameras, radar and lidar.
Radar systems, like cameras, are relatively inexpensive and work in poor weather, but they lack the resolution to accurately determine the shape of objects. Lidar offers higher resolution but is vulnerable to bad weather.
“You need to use all the different kinds of sensors and then combine them,” said Raj Rajkumar, professor of electrical and computer engineering at Carnegie Mellon University, reflecting a common industry view.
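Rajkumar is describing what the industry calls sensor fusion. As a rough illustration only, and not a description of Tesla’s or Waymo’s actual software, the sketch below shows one simple way that noisy distance estimates from hypothetical camera, radar and lidar readings could be merged, with each sensor weighted by how confident it is; the sensor names and noise figures are assumptions made up for the example.

```python
# Illustrative only: a minimal inverse-variance fusion of range estimates
# from three hypothetical sensors. Real automotive stacks use far more
# sophisticated probabilistic tracking; the values here are assumptions.

from dataclasses import dataclass


@dataclass
class RangeReading:
    sensor: str        # e.g. "camera", "radar", "lidar"
    distance_m: float  # estimated distance to the object ahead, in meters
    variance: float    # how noisy this sensor's estimate is (m^2)


def fuse(readings: list[RangeReading]) -> float:
    """Combine several noisy distance estimates into one.

    Each reading is weighted by the inverse of its variance, so the
    more confident sensors dominate the fused result.
    """
    weights = [1.0 / r.variance for r in readings]
    fused = sum(w * r.distance_m for w, r in zip(weights, readings)) / sum(weights)
    return fused


if __name__ == "__main__":
    readings = [
        RangeReading("camera", distance_m=41.0, variance=4.0),  # good at shape, noisier range
        RangeReading("radar", distance_m=39.5, variance=0.5),   # precise range, poor shape
        RangeReading("lidar", distance_m=39.8, variance=0.2),   # precise, weather-sensitive
    ]
    print(f"fused distance: {fuse(readings):.1f} m")
```

Dropping radar, in this simplified picture, means the system must rely entirely on the camera estimate and whatever the software can infer from it.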
Tesla’s camera-centric system is “much harder to design, but it is also much cheaper” than Waymo’s laser-based lidar approach, enabling the electric car maker to scale up and further improve its technology, Tesla’s artificial intelligence director, Andrej Karpathy, said on the “Robot Brains” podcast in March.
WHAT DOES TESLA LOSE BY GIVING UP RADAR?
There is a lot of debate on this issue.
The loss of radar degrades driver-support features enough “to render them less usable to unusable in adverse weather conditions,” said Steven Shladover, a research engineer at the University of California, Berkeley.
“It makes no sense whatsoever technologically – only a way of reducing cost of components,” he said.
Because radar measures distance accurately, its loss could affect emergency braking designed to avoid collisions with slowed vehicles, said Ram Machness, chief business officer of advanced radar maker Arbe Robotics.
“If you drop radar without proving vision alone does that job as well, then you’re compromising safety,” Telanon, a developer of a driver-support system, said on Twitter.
Tesla said some driver-assist features, including the ability to maintain speed at the pace of the car ahead, may be temporarily limited or inactive upon delivery. It said it would start restoring the features via software updates in the weeks ahead.
Musk told Electrek that the vision system had improved so much that it was better off without radar.
Last week, NHTSA withdrew its advanced safety features label for new Model 3 and Model Y vehicles, and Consumer Reports dropped its “top pick” label. Both intend to test the vision-only system.
WHO KNOWS BEST?
Tesla’s plan runs counter to most of the self-driving industry, but it is difficult to say who is right. No company has yet deployed a fully functional self-driving system at scale, and the entire industry is years behind its initial projections.