Autonomous car in fatal accident

The anti-autonomous lobby is busy saying, “I told you so. That’s the end of Tesla.” However, when you look back at the history of the motor car, millions of people have died in conventional cars, and that didn’t stop the development of the technology, did it?

Tesla Motors Inc. says the self-driving feature suspected of involvement in the May 7 fatal crash is experimental, yet it has been installed on all 70,000 of the cars the company has built since October 2014.

Tesla S.

For groups that have lobbied for stronger safety rules, that’s precisely what’s wrong with U.S. regulators’ increasingly anything-goes approach.

“Allowing automakers to do their own testing, with no specific guidelines, means consumers are going to be the guinea pigs in this experiment,” said Jackie Gillan, president of Advocates for Highway and Auto Safety, a longtime Washington consumer lobbyist who has helped shape numerous auto-technology mandates. “This is going to happen again and again and again.” (Give me strength!)

The May crash under investigation involved a 40-year-old Ohio man who was killed when his 2015 Model S drove under the trailer of an 18-wheeler on a highway near Williston, Florida, according to the Florida Highway Patrol. The truck driver told the Associated Press that he believes the Ohio man may have been watching a movie. Authorities recovered a portable DVD player but don’t know whether it was playing at the time of the crash.

The National Highway Traffic Safety Administration said Thursday that it is investigating the crash, which comes as the regulator says it is looking for ways to collaborate with the industry. The agency negotiated an agreement to speed the introduction of automatic emergency braking earlier this year, frustrating safety groups, which said they had no input and that carmakers’ pledges to install the technology couldn’t be enforced by law.

NHTSA is also expected to announce guidelines that will set some parameters for self-driving cars on U.S. roads. Transportation Secretary Anthony Foxx told reporters Wednesday the agency would be as exact as it could without being overly prescriptive.

In January, Foxx and NHTSA chief Mark Rosekind announced in Detroit that they’d allow automakers to demonstrate the safety of autonomous vehicles and apply for exemptions to existing safety rules. They said the government shouldn’t stand in the way of technological progress.

In the Florida crash, Tesla’s “Autopilot” semi-autonomous driving feature failed to detect the white side of the tractor-trailer against a brightly lit sky, so it didn’t hit the brakes, according to the company.
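To make that failure mode concrete, here is a toy sketch in Python. This is not Tesla’s actual vision pipeline; it just shows how a detector that relies on luminance contrast can miss a bright object against an equally bright background. The luminance values and threshold are made up for illustration.

```python
# Toy illustration (not Tesla's actual vision code): a detector that
# relies on luminance contrast will miss a bright object against an
# equally bright background.

def obstacle_detected(object_luminance: float,
                      background_luminance: float,
                      contrast_threshold: float = 0.2) -> bool:
    """Flag an obstacle only if it stands out from its background.

    Luminance values are normalized to [0, 1]; the threshold is a
    made-up parameter for illustration only.
    """
    contrast = abs(object_luminance - background_luminance)
    return contrast >= contrast_threshold

# A dark truck against a bright sky stands out clearly.
print(obstacle_detected(object_luminance=0.15, background_luminance=0.95))  # True

# A white trailer against a brightly lit sky barely registers,
# so no braking would be triggered.
print(obstacle_detected(object_luminance=0.90, background_luminance=0.95))  # False
```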

The company says the cars are safer than conventional ones. Tesla said the May accident was the first known fatality in more than 130 million miles of Autopilot driving. That compares with one fatality in every 94 million miles among all U.S. vehicles, according to Tesla.
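To put the two figures on a common scale, they can be converted to fatalities per 100 million miles, a standard safety metric. A couple of lines of Python do the arithmetic, using only the mileage numbers Tesla cites above:

```python
# The mileage figures are the ones Tesla cites above; this simply
# converts both to fatalities per 100 million miles for comparison.

autopilot_miles_per_fatality = 130e6   # one fatality in 130M Autopilot miles
us_average_miles_per_fatality = 94e6   # one fatality in 94M miles, all U.S. vehicles

autopilot_rate = 100e6 / autopilot_miles_per_fatality
us_rate = 100e6 / us_average_miles_per_fatality

print(f"Autopilot:    {autopilot_rate:.2f} fatalities per 100M miles")  # ~0.77
print(f"U.S. average: {us_rate:.2f} fatalities per 100M miles")         # ~1.06
```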

“Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” the company said. “Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”

BMW announced its own self-driving-car venture in partnership with Intel Corp. and Mobileye, aiming to put cars on the road by 2021. Even on the day of the announcement, company executives were cautious about the limits of technology that allows people to drive hands-free.

In February, a Lexus-model Google self-driving car hit the side of a bus near the company’s Silicon Valley headquarters. The vehicle was in autonomous mode going about 2 miles per hour around sandbags in the road. Google’s software detected the bus but predicted that it would yield, which it did not, according to a company report about the incident. There were no injuries reported at the scene, the company said. “In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision,” Google said in its report. (That is as silly as the statement that accidents with farangs wouldn’t have happened if the farang hadn’t come to Thailand. Certainly the software covering autonomous cars is not yet foolproof, but human control is not foolproof either!)
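Google’s report describes a predict-then-act decision: the car moved because its software judged the bus likely to yield. A minimal sketch of that kind of logic, again not Google’s actual planner, with an entirely hypothetical probability and threshold, might look like this:

```python
# Minimal sketch (not Google's actual planner) of the decision the
# report describes: proceed around an obstacle only if the other
# vehicle is predicted to yield. The probability and threshold are
# entirely hypothetical.

def proceed_around_obstacle(p_other_yields: float,
                            yield_threshold: float = 0.8) -> bool:
    """Proceed only when we are confident the other vehicle will give way."""
    return p_other_yields >= yield_threshold

# The software judged the bus likely to yield, so the car moved --
# and the prediction turned out to be wrong.
print(proceed_around_obstacle(p_other_yields=0.9))  # True: the car proceeds
```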