The rolling disaster called the “Google Self-Driving Car” is another way for Google to spy on you and to profiteer from war in Afghanistan.

2 months ago · The Investigators

Google’s owners own the war-profiteering lithium-ion battery mines in Afghanistan. Their little car is another way to exploit deadly lithium-ion batteries.

Google’s business model is “spying on citizens.” What better way to spy on people than to lock them in a rolling box where all they can do is stare at a Google screen while sensors record their every private action?

Google’s Self-Driving Car Hit Another Vehicle

A Google self-driving car was pulled over by a curious officer in November.

Google via Google Plus


Since Google’s robot cars have been on the road, they have been involved in 17 different accidents. But in those incidents, Google’s car wasn’t to blame — either another car struck Google’s, or the test driver operating the vehicle manually was at fault.

Until earlier this month. On Feb. 14, one of Google’s self-driving Lexus SUVs struck a municipal bus in Mountain View, according to documents filed with the California DMV.

According to the report, the Google car was waiting at an intersection to turn right when it encountered several sandbags blocking the lane. When the light turned green, the car moved left to avoid the bags and struck a public bus approaching from behind.

Google’s autonomous driving mode was active when the crash occurred (in other incidents, Google’s test drivers had switched to manual mode). The bus was traveling at 15 miles per hour and Google’s car at two miles per hour. According to the report, the test driver “saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google [autonomous vehicle] to continue.” No one was injured.

Google’s self-driving car unit has repeatedly stressed that autonomous vehicles are far safer than human-piloted ones. Winning regulatory approval and consumer acceptance of driverless fleets is a key pillar of the unit’s business strategy.

It’s unclear whether Google will ascribe the bus accident to an error in its driving system or simply to the complexity of traffic. Very few of the thorny insurance and policy questions about how to treat robotic systems have been answered.

We reached out to Google for additional comment. Tomorrow is the first of the month, when Google typically puts out its monthly traffic report detailing each incident involving its cars. Google said there were no accidents registered in December or January.

Update: Google released a snippet of its February self-driving car report a day early to address the bus crash. The company described the incident as something that happens “every day” on the road, but noted that Google “clearly bear[s] some responsibility.”

Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.

This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.