Monday, August 16, 2021

US investigates Autopilot after 11 Teslas crashed into emergency vehicles

Regulator worried Autopilot can't spot parked vehicles or keep driver engaged.



TIM DE CHANT - 8/16/2021

A 2014 Tesla Model S driving on Autopilot rear-ended a Culver City fire truck that was parked in the high-occupancy vehicle lane on Interstate 405.

Culver City Firefighters Local 1927 / Facebook

US government regulators are opening an investigation into Tesla’s Autopilot system after cars using the feature crashed into stopped emergency vehicles.

The National Highway Traffic Safety Administration announced the investigation today, and it encompasses 765,000 Teslas sold in the US, a significant fraction of all of the company’s sales in the country. The agency says the probe will cover 11 crashes since 2018; the crashes caused 17 injuries and one death.

The NHTSA is looking at Tesla’s entire lineup, including Models S, X, 3, and Y from model years 2014–2021. It’s investigating both Autopilot and Traffic Aware Cruise Control, a subset of Autopilot that does not steer the vehicle but allows it to match traffic speeds.

In each of the 11 crashes, a Tesla hit a first responder’s vehicle that was parked and marked with flashing lights, flares, illuminated arrow boards, or road cones.

The investigation will cover the entire scope of the Autopilot system, including how it monitors and enforces driver attentiveness and engagement, as well as how the system detects and responds to objects and events in or near the roadway.

Driver attention questioned

Tesla has faced scrutiny over the way Autopilot verifies drivers’ attentiveness while the system is engaged. In an assessment of advanced driver-assistance systems (ADAS), Autopilot received middling marks from the European New Car Assessment Programme, with its score held back by the system’s relative inability to keep drivers engaged with the road.

Like many other driver-assistance systems, Autopilot requires a driver to keep their hands on the wheel, though such systems can be easily fooled by draping a weight over one of the steering wheel’s spokes. A recent investigation by Car and Driver found that it took anywhere from 25 to 40 seconds, depending on the model, for the vehicle to flash a warning after drivers took their hands off the wheel. If drivers didn’t respond, the car would drive for another 30 seconds before starting to brake. At highway speeds, this could result in the system operating without driver engagement for up to a mile.
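As a rough check of that last figure, the sketch below converts the Car and Driver timings into distance traveled. This is back-of-the-envelope arithmetic only; the 65 mph highway speed is an assumed value for illustration, not a number from the investigation.

    # Distance covered between hands leaving the wheel and the car braking,
    # using the warning delays reported by Car and Driver plus the 30-second
    # grace period. The 65 mph speed is an assumption for illustration.
    MILES_PER_SEC_PER_MPH = 1 / 3600

    def hands_off_distance_miles(speed_mph, warning_delay_s, grace_s=30):
        """Miles traveled from hands-off to the start of braking."""
        return speed_mph * MILES_PER_SEC_PER_MPH * (warning_delay_s + grace_s)

    for delay_s in (25, 40):  # fastest and slowest observed warning delays
        d = hands_off_distance_miles(65, delay_s)
        print(f"{delay_s} s warning: {d:.2f} miles")
    # 25 s warning: 0.99 miles
    # 40 s warning: 1.26 miles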

In the wake of a January 2018 crash in California, the National Transportation Safety Board criticized the way that Tesla attempts to keep drivers engaged. In that incident, which is also part of the NHTSA probe, a 2014 Model S rear-ended a fire truck in the high-occupancy vehicle (HOV) lane of Interstate 405 in Culver City. The Tesla’s driver had Autopilot engaged and was following another vehicle in the HOV lane when the lead vehicle changed lanes to avoid the parked fire truck. Autopilot did not swerve or brake, and the driver, who was eating a bagel, did not take control of the vehicle. The Tesla hit the fire truck at 31 mph, according to the accident report.

The National Transportation Safety Board found that the crash was likely caused by the driver’s “inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla Autopilot design, which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer.”

Autopilot changes


Tesla recently began changing the way Autopilot works, ditching the radar sensor in Models 3 and Y in favor of additional cameras. (Models S and X will retain radar for the foreseeable future.) As the crashes that are part of the NHTSA probe show, radar data doesn’t guarantee that ADAS systems will properly sense obstacles in the roadway, though additional sensors generally help the systems build a more complete picture of the scene. Because radar and lidar return direct range measurements, they make it easy to determine how far the vehicle is from an object. ADAS systems can extract the same information from camera images, but doing so requires considerably more computation. It’s unclear whether the NHTSA investigation includes Tesla’s new camera-only models.
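The difference in principle can be made concrete with a toy sketch. Neither function below reflects Tesla’s actual pipeline, and all numbers are illustrative assumptions: radar range falls directly out of a time-of-flight measurement, while a camera system has to solve a matching or recognition problem before any geometry yields a distance.

    # Toy contrast between a direct range measurement (radar) and a range
    # computed from image geometry (stereo cameras). Illustrative only; this
    # is not Tesla's implementation, and Tesla's cameras are not stereo pairs.
    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def radar_range_m(round_trip_time_s):
        """Radar: range is a direct function of the echo's time of flight."""
        return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

    def stereo_range_m(focal_length_px, baseline_m, disparity_px):
        """Cameras: range must be inferred, here from stereo disparity,
        which first requires matching the same object across two images."""
        return focal_length_px * baseline_m / disparity_px

    print(radar_range_m(4.0e-7))           # ~60 m from a 400 ns echo
    print(stereo_range_m(1000, 0.3, 5.0))  # 60 m, but only after matching

A camera-only system can also estimate depth with learned models instead of stereo pairs; either way, the distance is computed rather than measured directly.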

Nor is it clear whether the probe will affect Tesla’s so-called Full Self-Driving feature, beta versions of which have been released to a group of drivers. Videos of the system in action show that it’s very much a work in progress, and it needs driver attention at all times.

While Full Self-Driving does make some decisions that closely emulate a human driver, in other cases, it makes more questionable choices. In one video, a Full Self-Driving car brakes only after passing a disabled vehicle on the shoulder. On the same trip, it suddenly swerves right into another lane before taking a left. In another video, the car creeps forward into intersections despite cross traffic, and later, it nearly drives into a hole in the street surrounded by construction cones. At times, Full Self-Driving can’t tell whether the human driver has control of the vehicle, and it will drive for more than a minute between prompts to confirm driver attention.

So far, automakers have been largely free to develop ADAS features without significant regulatory oversight. The NHTSA has been relatively hands-off, to the point that the NTSB has been critical of its laissez-faire attitude. This new investigation suggests the agency may be considering a less lenient approach.

US to investigate Tesla's Autopilot following emergency vehicle crashes

Cyrus Chan
The US National Highway Traffic Safety Administration will investigate Tesla’s driver assistance system, Autopilot, after repeated crashes with emergency vehicles. (Photo by Justin Sullivan/Getty Images)

The US National Highway Traffic Safety Administration (NHTSA) said today it plans to launch a probe into Tesla’s driver assistance system Autopilot after repeated crashes with emergency vehicles.

The investigation will cover Tesla Model Y, X, S, and 3 vehicles released between 2014 and 2021, an estimated 765,000 vehicles in the US.

The NHTSA will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.

The auto safety agency could opt to take no action, or it could upgrade the preliminary investigation to an engineering analysis before ordering a recall. The two-step investigation could take a year or more.

Tesla shares slipped as much as 5 per cent after the investigation was announced.


Autopilot handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods in Tesla vehicles.

NHTSA said it had identified 11 crashes since January 2018 in which Tesla models “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.”

Most of the reported crashes took place after dark, and the crash scenes encountered included measures like emergency vehicle lights, flares or road cones.

It reported 17 injuries and one death in those crashes.

The US National Transportation Safety Board has criticised Tesla’s lack of system safeguards for Autopilot, as well as NHTSA’s failure to ensure the system’s safe use.

Tesla’s chief executive Elon Musk has repeatedly defended Autopilot, saying, “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle” in a tweet this April.

More Tesla troubles: U.S. safety agency investigating Autopilot problems on 765,000 vehicles

Aug 16, 2021
Tom Krisher
Tesla 2018 Model 3 sedans on display outside a Tesla showroom in Littleton, Colo. AP

The U.S. government has opened a formal investigation into Tesla’s Autopilot partially automated driving system after a series of collisions with parked emergency vehicles.

The investigation covers 765,000 vehicles — nearly everything Tesla has sold in the United States since the start of the 2014 model year.

In the crashes identified by the National Highway Traffic Safety Administration as part of the probe, 17 people were injured and one was killed.

NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles at scenes where first responders have used flashing lights, flares, an illuminated arrow board or cones warning of hazards. The agency announced the action Monday.

The probe is another sign that NHTSA under President Joe Biden is taking a tougher stance on automated vehicle safety than under previous administrations. Previously the agency was reluctant to regulate the new technology for fear of hampering adoption of the potentially life-saving systems.

The investigation covers Tesla’s entire current model lineup, the Models Y, X, S and 3 from the 2014 through 2021 model years.

The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.

Last year the NTSB blamed Tesla, drivers and lax regulation by NHTSA for two collisions in which Teslas crashed beneath crossing tractor-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crashes by failing to make sure automakers put safeguards in place to limit use of electronic driving systems.

The agency made the determinations after investigating a 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was driving on Autopilot when neither the driver nor the Autopilot system braked or tried to avoid a tractor-trailer crossing in its path.

Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or even riding in the back seat while a car rolled down a California highway.

Tesla, which has disbanded its media relations office, didn’t immediately respond to a message seeking comment.

NHTSA has sent investigative teams to 31 crashes involving partially automated driver assist systems since June of 2016. Such systems can keep a vehicle centered in its lane and a safe distance from vehicles in front of it. Of those crashes, 25 involved Tesla Autopilot in which 10 deaths were reported, according to data released by the agency.

Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. In addition to crossing semis, Teslas using Autopilot have crashed into stopped emergency vehicles and a roadway barrier.

The probe by NHTSA is long overdue, said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University who studies automated vehicles.

Tesla’s failure to effectively monitor drivers to make sure they’re paying attention should be the top priority in the probe, Rajkumar said. Teslas detect pressure on the steering wheel to make sure drivers are engaged, but drivers often fool the system.

“It’s very easy to bypass the steering pressure thing,” Rajkumar said. “It’s been going on since 2014. We have been discussing this for a long time now.”
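Rajkumar’s point about bypassing the check can be illustrated with a toy torque monitor. This is a generic threshold check assumed for illustration; Tesla’s actual detection logic is not public, and both the threshold and the sample values below are made up.

    # A naive hands-on-wheel check: any torque above a threshold counts as
    # driver input. The threshold and sample values are assumptions.
    TORQUE_THRESHOLD_NM = 0.3

    def wheel_seems_held(torque_samples_nm):
        """Return True if any sample exceeds the detection threshold."""
        return any(abs(t) > TORQUE_THRESHOLD_NM for t in torque_samples_nm)

    no_hands    = [0.02, 0.01, 0.03, 0.02]  # sensor noise only
    human_hands = [0.60, 0.10, 0.90, 0.40]  # varying torque from steering
    hung_weight = [0.50, 0.50, 0.50, 0.50]  # constant torque from a weight

    print(wheel_seems_held(no_hands))     # False: a warning would fire
    print(wheel_seems_held(human_hands))  # True
    print(wheel_seems_held(hung_weight))  # True: the weight defeats the check

Because a static weight produces the same above-threshold signal as a hand, a check like this cannot distinguish the two without examining how the torque varies over time.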

The crashes into emergency vehicles cited by NHTSA began on Jan. 22, 2018, in Culver City, California, near Los Angeles when a Tesla using Autopilot struck a parked firetruck that was partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.

Since then, the agency said there were crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.

“The investigation will assess the technologies and methods used to monitor, assist and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” NHTSA said in its investigation documents.

In addition, the probe will cover object and event detection by the system, as well as where it is allowed to operate. NHTSA says it will examine “contributing circumstances” to the crashes, as well as similar crashes.

An investigation could lead to a recall or other enforcement action by NHTSA.

“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles.”

The agency said it has “robust enforcement tools” to protect the public and investigate potential safety issues, and it will act when it finds evidence “of noncompliance or an unreasonable risk to safety.”

In June NHTSA ordered all automakers to report any crashes involving fully autonomous vehicles or partially automated driver assist systems.

Tesla, based in Palo Alto, California, uses a camera-based system, a lot of computing power, and sometimes radar to spot obstacles, determine what they are, and then decide what the vehicles should do.

But Carnegie Mellon’s Rajkumar said the company’s radar was plagued by “false positive” signals and would stop cars after determining overpasses were obstacles.

Now Tesla has eliminated radar in favor of cameras and a neural network trained on thousands of images to determine whether objects are in the way. The system, he said, does a very good job on most objects that would be seen in the real world. But it has had trouble with parked emergency vehicles and perpendicular trucks in its path.

“It can only find patterns that it has been ‘quote unquote’ trained on,” Rajkumar said. “Clearly the inputs that the neural network was trained on just do not contain enough images. They’re only as good as the inputs and training. Almost by definition, the training will never be good enough.”

Tesla also is allowing selected owners to test what it calls a “full self-driving” system. Rajkumar said that should be investigated as well.
