Tesla and Google Face Regulatory Scrutiny After Self-Driving Car Crashes

Google's self-driving car project has appointed its first general counsel after a number of crashes involving the company's vehicles caught the attention of regulators (via Reuters).

The National Highway Traffic Safety Administration (NHTSA) said it was collecting information after a minor incident in March when a Google self-driving car struck a municipal bus in California. On that occasion, it did not open a formal probe.

Tesla, however, is under more intense pressure after one of its own cars was recently implicated in a fatal road accident. The NHTSA has opened a formal investigation into the May 7 death of a Tesla Motors Model S driver in Florida whose car was operating in "Autopilot" mode when it crashed into a semi-trailer.

Tesla's Autopilot system uses cameras and radar, but not lidar, a laser-based sensor that identifies obstacles in the environment more precisely. The company said its system would have had trouble distinguishing a white semi-trailer positioned across the road against a bright sky.

Reuters reports that the United States Securities and Exchange Commission (SEC) is also looking into whether Tesla breached securities laws by not telling investors about the fatal May 7 Autopilot crash.

The SEC investigation aims to determine whether the accident should have been disclosed by Tesla as a "material event," meaning one that investors would likely consider important, when the company sold $2 billion in stock on May 18.

In a blog post written in response to a Fortune article on the subject, Tesla explained that all it knew when it notified the NHTSA of the accident was that the driver had died, not that Autopilot was involved. The SEC investigation continues.

Industry executives and analysts told Reuters they expect the Tesla crash will spur investment in self-driving vehicle systems that combine multiple kinds of sensors, including lidar.
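To illustrate the case for combining sensor types, here is a minimal, purely hypothetical sketch in Python. It does not reflect Tesla's or Google's actual software; the sensor names, confidence values, weights, and thresholds are invented for illustration. The idea is simply that an additional lidar reading can tip a borderline camera/radar detection over a braking threshold.

```python
# Hypothetical sketch of multi-sensor fusion (not any manufacturer's real code).
# Each sensor reports whether it sees an obstacle and how confident it is;
# a simple averaged vote decides whether to brake. All numbers are made up.

from dataclasses import dataclass

@dataclass
class SensorReading:
    name: str
    detected: bool      # did this sensor report an obstacle?
    confidence: float   # how sure it is, from 0.0 to 1.0

def fuse(readings, threshold=0.5):
    """Return True if the combined evidence says an obstacle is present."""
    if not readings:
        return False
    score = sum(r.confidence for r in readings if r.detected)
    return score / len(readings) >= threshold

# A white trailer against a bright sky: the camera barely registers it and
# radar alone is ambiguous, so camera + radar stay below the threshold...
camera_radar = [
    SensorReading("camera", detected=False, confidence=0.0),
    SensorReading("radar", detected=True, confidence=0.6),
]
print(fuse(camera_radar))  # False -> no automatic braking

# ...whereas a lidar return measuring the trailer's shape directly
# pushes the fused score over the threshold.
with_lidar = camera_radar + [SensorReading("lidar", detected=True, confidence=0.9)]
print(fuse(with_lidar))    # True -> brake
```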

Goldman Sachs forecasts the market for advanced driver assistance systems and autonomous vehicles will grow from about $3 billion last year to $96 billion in 2025 and $290 billion in 2035. More than half of that revenue in 20 years will come from radar, cameras and lidar, Goldman estimates.

Meanwhile, U.S. regulators are lagging behind in issuing written regulations for autonomous vehicles. The rules were meant to be unveiled by July 14, but U.S. Transportation Secretary Anthony Foxx announced last month that they might not be released until later this summer.

Apple has met with California DMV officials regarding the state's self-driving car laws, and multiple reports from The Wall Street Journal indicate that the Cupertino company is exploring self-driving functionality for possible inclusion in a later iteration of the much-rumored Apple Car.

The bulk of Apple's car research and development is thought to be taking place in secretive buildings in Sunnyvale, California, where late night "motor noises" have been heard in recent months.

Multiple sources have indicated that the Apple Car could be finalized by 2019 or 2020, but a more precise timeframe remains unclear due to possible internal setbacks and other unforeseen circumstances. Tesla CEO Elon Musk recently called the Apple Car an "open secret," as his company aims to fulfill more than 325,000 pre-orders for its lower-priced Model 3 by late 2017.



Top Rated Comments

41 months ago
Yep. There's nothing like relying on good old human senses. Those only screw up enough to cause 6,400,000 accidents and kill 30,000 people a year with mature technology. Three accidents and a single fatality by early versions of 2 totally different systems definitely closes the book on all this technology.

Back to horses and buggies everyone!
Rating: 25 Votes
41 months ago

Exactly the kind of stuff I was expecting to happen from day one with this idiotic technology. It was especially inevitable after the various references to "fast tracking" and "waivers" of standard regulatory requirements being granted to these companies.

I hope this self-driving BS dies on the vine faster than Google Glass. We do not have artificial intelligence. Even the most irresponsible human operator with a license has more capability in terms of vision and judgment than any computer system possible today.


I have absolutely no doubt that the next decade will prove this statement to be inaccurate.
Rating: 10 Votes
41 months ago
I can just imagine what it sounded like when the first people tried to make a fuel engine, which blew up in their faces, and people said WE JUST CAN'T DO IT...

Or the first planes which dropped out of the sky...

Wow what is up with you guys
Rating: 6 Votes
41 months ago
People said the same thing about computers playing chess, and now they are virtually unbeatable.

In the future (within 15 years), I guarantee self-driving cars will be safer than human-driven cars. Thinking otherwise really shows a lack of understanding of neural network AI. These algorithms use big data to learn and have proven able to eventually exceed human capacity.

As the tech leaders have said, the next decade will be explosive in terms of AI.
Rating: 5 Votes
41 months ago
Google said early on that they learned people come to trust the self-driving features far too much, and that you can't have a car drive itself perfectly for weeks/months/years and expect the driver to be fully alert and ready to take over at any time.

For that reason, they said they do not consider the technology ready until the human driver does not ever need to take over.

I have always thought that Tesla rolling out their Autopilot features was dangerous for exactly the reasons Google said. It works well enough that a human driver will not be prepared to take over if necessary.

Edit:

After doing some research, I tried to find the blog post where Google explains why they don't believe Level 3 autonomy (Level 3 means the human driver has to be ready to take over) is safe. Every other major car company has apparently come to the same conclusion: Level 3 gives a false sense of safety, and it isn't realistic to expect a human driver to be attentive enough to take over in an emergency.

Tesla claims Autopilot is Level 2, but most others think it falls under Level 3.

http://www.theverge.com/2016/4/27/11518826/volvo-tesla-autopilot-autonomous-self-driving-car
Rating: 5 Votes
41 months ago
Having auto-pilot in cars is tough. It can only get better if more people use it, but I know that I definitely don't want to be using it in one of its earliest implementations.

"It's a gen 1 product, of course there's issues. By gen 3 or 4 it will be amazing" has often been said here. But that kind of logic doesn't make me comfortable when it comes to autononomous vehicles.
Rating: 4 Votes
41 months ago

I'm a big fan of autonomous driving and an even bigger fan of the Tesla electric car project so it was greatly disappointing and saddening to hear about the first fatality and the unusual aspects of why the system failed.


Isn't the Tesla system just assisted driving and not self-driving? I've read that the person is still doing the driving and it's supposed to help with things like staying in a lane, crash avoidance, etc. Sounds like the biggest issue is calling it "Autopilot," which sounds like self-driving but isn't.
Rating: 4 Votes
41 months ago
Exactly the kind of stuff I was expecting to happen from day one with this idiotic technology. It was especially inevitable after the various references to "fast tracking" and "waivers" of standard regulatory requirements being granted to these companies.

I hope this self-driving BS dies on the vine faster than Google Glass. We do not have artificial intelligence. Even the most irresponsible human operator with a license has more capability in terms of vision and judgment than any computer system possible today.
Rating: 4 Votes
41 months ago

I'm a big fan of autonomous driving and an even bigger fan of the Tesla electric car project so it was greatly disappointing and saddening to hear about the first fatality and the unusual aspects of why the system failed.

Tesla will need to ensure safety is a #1 priority and I'm certain Elon Musk and team are on to this.

I still wait in great anticipation of receiving the Model 3.


IMO the "system" that failed wasn't Tesla's, it was the one that allowed the highway-entering truck that cut him off to drive without under-ride guards. They're required in most other first-world countries, and would likely have saved this person's life (as well as about 250 others' this year). But sure, let's investigate the victim instead.
Rating: 3 Votes
41 months ago
I'm a big fan of autonomous driving and an even bigger fan of the Tesla electric car project so it was greatly disappointing and saddening to hear about the first fatality and the unusual aspects of why the system failed.

Tesla will need to ensure safety is a #1 priority and I'm certain Elon Musk and team are on to this.

I still wait in great anticipation of receiving the Model 3.
Rating: 2 Votes