SAN FRANCISCO, July 18, 2016 /PRNewswire-USNewswire/ -- Consumer Watchdog today called on Tesla to require, through a software update, that the driver's hands remain on the steering wheel when Autopilot is engaged, and for the Obama administration to slow its push to deploy self-driving robot car technology. The group also called on Tesla and other carmakers to accept legal responsibility when their self-driving technology causes a crash.
The nonpartisan nonprofit public interest group brought a large white truck to circle outside the Union Square Hilton Hotel where the Automated Vehicles Symposium 2016 is being held this week. Department of Transportation Secretary Anthony Foxx and National Highway Traffic Safety Administration Administrator Mark R. Rosekind are to speak at this week's symposium and are expected to release industry-friendly "guidance" for robot cars that gives the public no opportunity for comment or scrutiny.
The white truck highlighted the shortcomings of Tesla's Autopilot, which could not distinguish between a white truck making a left turn in front of the car and a bright sky in a fatal crash in Florida that killed former Navy SEAL Joshua Brown. Consumer Watchdog's truck had large signs reading "Tesla Don't Hit Me!" and "Obama Speed Kills."
View and download a photo of the truck here: http://www.consumerwatchdog.org/images/teslatruck071816sm.jpg
Consumer Watchdog proposed reprogramming Tesla's Autopilot software to require the driver's hands on the wheel in a letter sent today by Consumer Watchdog President Jamie Court and former NHTSA Administrator Joan Claybrook to Tesla Chairman Elon Musk and NHTSA Administrator Mark Rosekind. The letter also called for automakers to accept legal responsibility and for NHTSA to abandon its voluntary guidelines and adopt enforceable standards. Read the letter here: http://www.consumerwatchdog.org/resources/ltrmuskrosekind7-18-16_final.pdf
"We are deeply concerned about the failure of Tesla and NHTSA to accept responsibility for the death of Joshua Brown, a death that is not the result of human error, but of the failure of technology and acceptance of a voluntary industry agreement by the government in lieu of minimum mandatory safety performance standards," Court and Claybrook wrote.
"The tragic fatal Tesla crash in Florida demonstrates the need for Federal Motor Vehicle Safety Standards rather than voluntary guidelines. Not only did the Tesla S's video camera fail to distinguish a white truck from white sky, but its automatic emergency braking system failed to apply the brakes with a tractor-trailer stretched across the road right in front of it." The letter noted that NHTSA had denied a rulemaking proceeding on safety standards for automatic emergency braking that could have set enforceable requirements for radar range and might have prevented the death.
"The failure of Tesla's Autopilot, a system never reviewed by federal regulators for safety, and of Tesla's AEB are a poster child for why enforceable safety standards are needed, not useless voluntary guidelines," wrote Court and Claybrook.
Court and Claybrook went on to demand that Tesla and other developers of self-driving technology assume legal liability when their self-driving technology causes a crash, as Volvo and Mercedes have done.
"If the manufacturers lack the confidence in their products to stand behind them and assume responsibility and liability when the systems they design are in control, and innocent people are injured or killed as a result, those vehicles do not belong on the road," Court said.
Consumer Watchdog said that Tesla has ducked responsibility for recent crashes involving Autopilot, instead blaming the victims. Tesla wants to have it both ways, Consumer Watchdog said, hyping the image of Autopilot as self-sufficient, but walking back any promise of safety by saying drivers must pay attention all the time.
"By releasing Autopilot prematurely in Beta mode, Tesla is unconscionably using our public highways as a test lab and its customers as human guinea pigs," said Court.
Another safety feature, had it been mandated, could have helped mitigate the crash, Court and Claybrook said in the letter. Tractor-trailers are required to have rear-end crash underride guards, so a car that rear-ends the trailer won't slide under it with devastating consequences. Safety advocates have long called for side truck underride guards as well, but NHTSA has not started a rulemaking to require them.
Court and Claybrook added, "Had NHTSA taken the advice of trucking safety advocates years ago, the tractor-trailer that ended Mr. Brown's life would have been equipped with side underride guards that might have saved his life, and might also have been detected by AEB radar, triggering the braking system.
"A side underride guard would almost certainly have decreased the damage in the Florida crash," the letter said. "We call on NHTSA to begin a rulemaking to enact an FMVSS to require them."
Visit our website at www.consumerwatchdog.org
To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/consumer-watchdog-exposes-robot-car-weaknesses-at-symposium-calls-on-tesla-to-update-software--carmakers-to-accept-legal-responsibility-asks-feds-to-go-slow-300300071.html
SOURCE Consumer Watchdog