A federal highway safety agency announced Friday it has opened an investigation of accidents involving automated driving systems in cars made by Tesla.
When announcing the probe, the National Highway Traffic Safety Administration revealed that since January 2018, Tesla models with either Autopilot or Traffic Aware Cruise Control engaged have been involved in 11 crashes with vehicles at first responder scenes.
The NHTSA noted that most of the incidents took place after dark, and the crash sites included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones.
The probe, to be conducted by the NHTSA’s Office of Defects Investigation, will evaluate the Autopilot systems in Tesla Model Y, X, S, and 3 vehicles from the 2014 through 2021 model years.
Investigators will scrutinize the technologies and methods used to monitor, assist, and enforce a driver’s engagement with driving during Autopilot operation, the NHTSA explained.
They will also assess Autopilot’s object and event detection and response while engaged, and include an examination of the contributing circumstances for the confirmed crashes.
Stock Drop
Tesla stock took a hit Monday on news of the probe, dropping 4.32 percent to US$686.17 a share, but there could be greater ramifications for the company and for automated vehicles in general.
“The probe introduces some uncertainty and doubt around Tesla’s transparency and technology roadmap,” said Roger C. Lanctot, director for automotive connected mobility at Strategy Analytics, a global research, advisory and analytics firm.
“Tesla has failed to meet many of its own deadlines,” he told TechNewsWorld. “That raises questions as to whether the system is functioning to its own satisfaction — which it does not appear to be doing, especially given the many ‘use at your own risk’ messages in its own user guides.”
The NHTSA investigation could lead to Tesla paying fines and being forced to stop marketing Autopilot as autonomous driving technology, maintained Rob Enderle, president and principal analyst at Enderle Group, an advisory services firm in Bend, Ore.
“Tesla seems to be marketing their Level 2 system as a Level 4, and most accidents seem to be because drivers think the systems are more capable than they are,” he told TechNewsWorld.
The Society of Automotive Engineers has defined six levels of automation for motor vehicles. At Levels 0 through 2, a driver must supervise the automated technology at all times. At Levels 3 through 5, the automated technology can control the vehicle without human intervention. No vehicle sold in the United States currently has a Level 3 through Level 5 system.
“The NHTSA has taken a largely hands-off approach to Tesla so far, which has not sat well with the National Transportation Safety Board, which doesn’t have the enforcement powers of NHTSA,” Enderle continued.
“Tesla has gotten away with far more than I’d expect until now,” he observed. “Still, I think the NTSB — which was investigating earlier and appears to be partially behind this NHTSA investigation — is losing patience.”
Basis for New Standards?
“The investigation is probably not a great thing for Elon Musk and Tesla shareholders, but it’s a good thing for the public and Tesla owners,” contended Sam Abuelsamid, principal analyst for e-mobility at Guidehouse Insights, a market intelligence company in Detroit.
“Hopefully, the results of this will be a better understanding of why these vehicles keep crashing into emergency vehicles,” he told TechNewsWorld.
“Maybe we’ll also get some clarity from NHTSA about setting boundaries for driver assistance technology and get some performance standards,” he added.
Abuelsamid explained that there are performance standards for many other systems in vehicles, such as air bags and seat belts, but there aren’t any standards for driver assistance systems, which are supposed to make vehicles safer.
“This investigation and the NHTSA order in June to require reporting of crashes involving a partial- or fully-automated driving system will provide more data for judging the effectiveness of these systems and help set up parameters about what is an effective system,” he said.
“This is such an important issue that it calls for a public/private partnership,” said J. Gerry Purdy, principal analyst with Mobilocity, a mobile advisory firm in Boca Raton, Fla.
“The government needs to set standards or goals,” he told TechNewsWorld. “The companies need to reach the goals. That would require a suite of tests that can be performed anywhere to produce standard reports on how automated systems behave. The NHTSA investigation can help do that.”
Establishing such standards could be welcomed by the industry. “Most people would like some kind of benchmark to put themselves up against,” Abuelsamid explained.
“Early in the crash testing program, the industry wasn’t too thrilled about it,” he observed. “Over time, the industry realized that by having independent bodies evaluate their systems, it gave them something by which they could sell their vehicles.”
“Tesla won’t like it because Tesla doesn’t like any rules,” he added. “They’ve built their reputation on Autopilot being the beginning of full self-driving and these vehicles will eventually be able to operate as robo-taxis. That’s utter nonsense. Those vehicles will never be able to function as robo-taxis.”
Disputing the Hype
As more information comes to light through the NHTSA probe, it could help tamp down the hype surrounding autonomous vehicles.
“I believe the objective of the investigation is to increase transparency and understanding among regulators regarding how the system is meant to function versus how it is functioning and the degree to which it is actually reliant upon an observant driver,” Lanctot said.
“It merely highlights the gap between the hype and the reality of autonomous vehicle technology deployment on mass produced vehicles,” he continued. “Even the most basic semi-autonomous operation still requires human vigilance. Full autonomous is five to 10 years away.”
Enderle added that Tesla’s problems with Autopilot have begun to raise concerns about autonomous driving in general.
“Most drivers are already skeptical about the technology, and Tesla’s issues are making them more so,” he said.
“The continuing failures of Tesla’s Autopilot point to the limitations of combined radar/camera systems generally and reveal the shortcomings of both technologies for solving autonomy on the cheap,” Lanctot added.
Quality Control Questioned
Lanctot also questioned Tesla’s quality control of its automated systems.
“Given the use-at-your-own-risk guidance from Tesla — something no other auto maker would ever use in describing a safety system — it is safe to say that Tesla is not taking adequate steps to ensure the safety of its systems,” he said.
“Each new crash elicits a response from Tesla’s HQ of ‘Oops, we did not anticipate that circumstance,’” he continued. “Such a statement is anathema to current automotive safety system development criteria.”
“Auto makers develop their systems through a process designed to anticipate all potential circumstances,” he added.
Tesla’s attempt to combine high-tech culture with automotive safety can be dangerous, Abuelsamid maintained.
“The concept of move fast and break things is fundamentally flawed when it comes to the safety of critical systems,” he said.
“It’s fine if you’re building a social network or photo sharing app where the consequences of failure are trivial — if your photo sharing app crashes, it isn’t going to kill anybody. But if your driver assist system ignores a fire truck stopped in front of you, it’s going to kill somebody.
“That approach to building driving systems is reckless and irresponsible,” he asserted.