Google's self-driving car project has appointed its first general counsel after a number of crashes involving the company's vehicles caught the attention of regulators (via Reuters).
The National Highway Traffic Safety Administration (NHTSA) said it was collecting information after a minor incident in March when a Google self-driving car struck a municipal bus in California. On that occasion, it did not open a formal probe.
Tesla, however, is facing more intense pressure after one of its own cars was implicated in a recent fatal road accident. The NHTSA has opened a formal investigation into the May 7 death of a Tesla Motors Model S driver in Florida whose car, operating in "Autopilot" mode, crashed into a semi-trailer.
Tesla's Autopilot system uses cameras and radar, but not lidar – a sensor that uses lasers to identify environmental obstacles more accurately. The company said its system had trouble distinguishing the white semi-trailer, positioned across the road, against a brightly lit sky.
Reuters reports that the United States Securities and Exchange Commission (SEC) is also looking into whether Tesla breached securities laws by not telling investors about the fatal May 7 Autopilot crash.
The SEC investigation aims to determine whether the accident should have been labeled a "material event" by Tesla, or one that investors are likely to consider important, when the company sold $2 billion in stock on May 18.
In a blog post written in response to a Fortune article on the subject, Tesla explained that all it knew when it notified the NHTSA of the accident was that the driver had died, not that Autopilot was involved. The SEC investigation continues.
Industry executives and analysts told Reuters they expect the Tesla crash will spur investment in self-driving vehicle systems that combine multiple kinds of sensors, including lidar.
Goldman Sachs forecasts the market for advanced driver assistance systems and autonomous vehicles will grow from about $3 billion last year to $96 billion in 2025 and $290 billion in 2035. More than half of that revenue in 20 years will come from radar, cameras and lidar, Goldman estimates.
Meanwhile, U.S. regulators are currently lagging behind in issuing written regulations for autonomous vehicles. Regulations were meant to be unveiled by July 14, but U.S. Transportation Secretary Anthony Foxx announced last month they might not be released until later this summer.
Apple has met with California DMV officials regarding the state's self-driving car laws, and multiple reports from The Wall Street Journal indicate that the Cupertino company is exploring the functionality for possible inclusion in a later iteration of the much-rumored Apple Car.
The bulk of Apple's car research and development is thought to be taking place in secretive buildings in Sunnyvale, California, where late night "motor noises" have been heard in recent months.
Multiple sources have indicated that the Apple Car could be finalized by 2019 or 2020, but a more precise timeframe remains unclear due to possible internal setbacks and other unforeseen circumstances. Tesla CEO Elon Musk recently called the Apple Car an "open secret," as his company aims to fulfill more than 325,000 pre-orders for its lower-priced Model 3 by late 2017.
Top Rated Comments
Back to horses and buggies everyone!
Or the first planes which dropped out of the sky...
Wow what is up with you guys
I have always thought that Tesla rolling out their Autopilot features was dangerous for exactly the reasons Google said. It works well enough that a human driver will not be prepared to take over if necessary. For that reason, they said they do not consider the technology ready until the human driver never needs to take over.
Edit:
After doing some research, I found the blog where Google says they don't believe Level 3 autonomy (Level 3 means the human driver has to be ready to take over) is safe. Every other major car company has apparently come to the same conclusion: Level 3 gives a false sense of safety, and it isn't realistic to expect a human driver to be attentive enough to take over in an emergency.
Tesla claims Autopilot is Level 2, but most others think it falls under Level 3.
http://www.theverge.com/2016/4/27/11518826/volvo-tesla-autopilot-autonomous-self-driving-car
In the future (within 15 years), I guarantee self-driving cars will be safer than human-driven cars. Thinking otherwise really shows a lack of understanding of neural network AI. These algorithms use big data to learn and have been proven able to eventually exceed human capacity.
As the tech leaders have said, the next decade will be explosive in terms of AI.
I hope this self-driving BS dies on the vine faster than google glass. We do not have artificial intelligence. Even the most irresponsible human operator with a license has more capability in terms of vision and judgment than any computer system possible today.