From the "d'uh!" newsdesk, more information this week on how drone cars are not as infallible as Google and everyone else would like you to believe.
Specific to Google, it turns out their cars can be paralysed by real drivers. The problem is that Google's cars drive by the book, which nobody else does. Everyone has their own little quirks, shortcuts and grey areas of the law that they interpret in their own way. For example, Google's cars can have real problems at a four-way stop. By the book, all vehicles should come to a complete stop, then the first vehicle to arrive should proceed (or the vehicle on the right, if two of you arrive at the same time). But in the real world, people inch, roll, drive slowly and generally don't come to a full 100% stop unless there's a police officer watching. This is a problem for Google's cars because they will not proceed through a four-way stop unless the other three cars are 100% stationary. Google reports that one of its cars was stranded at an intersection because of this very issue.
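The stalemate described above can be sketched as a toy decision rule. This is not Google's actual logic, just an illustration under the assumption that the car demands every other vehicle be at a dead stop, while a "fuzzy" variant tolerates the creeping that real drivers do:

```python
# Toy model (not Google's code): why a strict "everyone fully stopped"
# rule can strand a robot car at a four-way stop.

def may_proceed_strict(own_arrival, others):
    """By-the-book rule: go only if we arrived first AND every
    other car is at a dead stop (speed exactly 0)."""
    arrived_first = all(own_arrival < o["arrival"] for o in others)
    all_stopped = all(o["speed"] == 0.0 for o in others)
    return arrived_first and all_stopped

def may_proceed_fuzzy(own_arrival, others, creep_tolerance=0.3):
    """Human-style rule: treat slow creeping (under ~0.3 m/s,
    an assumed threshold) as effectively stopped."""
    arrived_first = all(own_arrival < o["arrival"] for o in others)
    effectively_stopped = all(o["speed"] < creep_tolerance for o in others)
    return arrived_first and effectively_stopped

# Three human drivers inching forward instead of stopping dead.
others = [{"arrival": 1.2, "speed": 0.10},
          {"arrival": 1.5, "speed": 0.20},
          {"arrival": 2.0, "speed": 0.05}]

print(may_proceed_strict(0.0, others))  # False - the car waits forever
print(may_proceed_fuzzy(0.0, others))   # True - the creep is tolerated
```

The strict rule never fires because the condition it waits for (three perfectly stationary humans) simply doesn't occur in the wild, which is exactly the stranding Google reported.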
Similarly, Google's cars are causing accidents precisely because they drive by the book. When they see a pedestrian waiting at a pedestrian crossing, they stop, as everyone is supposed to. The problem is that in America, pedestrians are second-class citizens and almost nobody stops until the pedestrian is actually in the road. Meaning that when Google's cars stop as they're supposed to, people rear-end them. Repeatedly. Google's cars are too law-abiding, and that's causing accidents.
The supporters of drone cars will naturally point out that this wouldn't be a problem if everyone had them, but back here in the real world, 100% coverage for drone cars is decades away. For the time being, they will be on the road with the rest of us, and the manufacturers had better figure out some fuzzy logic pretty quickly.
Not specific to Google, but applicable to any drone car that uses LIDAR to see the world (read: all of them): you can now spoof LIDAR returns with a $60 Raspberry Pi kit. Jonathan Petit, Principal Scientist at Security Innovation, has been able to spoof LIDAR returns for everything from pedestrians to solid walls to other vehicles, meaning he can effectively mount a denial-of-service attack on any self-driving car and render it unable to navigate.
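Why do a few fake echoes amount to denial of service? Here's a deliberately naive sketch (the planner, thresholds and scan format are all assumptions for illustration, not Petit's method or any vendor's code): a planner that trusts every raw LIDAR range will refuse to move the moment injected returns land inside its safety envelope.

```python
# Toy model of the attack class: a planner that trusts raw LIDAR
# ranges can be denied service by injected fake echoes.

SAFETY_DISTANCE_M = 5.0  # assumed safety envelope for this sketch

def can_move(lidar_ranges_m):
    """Naive planner: refuse to move if any return lies inside the
    safety envelope. Every echo is trusted as a real obstacle."""
    return all(r > SAFETY_DISTANCE_M for r in lidar_ranges_m)

genuine_scan = [22.0, 18.5, 40.0, 31.2]  # open road ahead
spoofed_echoes = [2.1, 1.8]              # fake "wall" injected by an attacker

print(can_move(genuine_scan))                   # True - road is clear
print(can_move(genuine_scan + spoofed_echoes))  # False - car is frozen
```

The attacker never touches the car; a phantom obstacle that the sensor can't distinguish from a real one is enough to park it indefinitely.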
Researcher hacks self-driving car sensors.
This is not dissimilar to the tactics the military uses to spoof radar returns on everything from aircraft to warships, except they do it with a combination of slab-sided stealth design and electronic countermeasures. But the principle is the same: make the source LIDAR or RADAR system see something that isn't really there.
You might think this isn't a risk in the real world, but we live in an age where shining laser pointers at commercial airline pilots is a thing, so it seems logical to assume that $60 LIDAR spoofing kits will become a must-have in the toolbox of the same people.