In light of the news this month that London is going to start allowing fully autonomous drone cars to be tested from 2015, I thought it was worth rehashing some of the finer points of why these cars are not "just around the corner".
We'll concentrate on one particular car here - the Google self-driving car - simply because they're further along than anyone else.
The biggest single issue facing drone cars is really simple: they don't have anything approaching real artificial intelligence. They can't deal with unpredictable situations, so they rely on meticulously collected information from (human-driven) scanner cars that analyse the route ahead of time. Think Google Street View, but a thousand times more detailed, mapped with cameras that photograph every sign, and laser scanners (LIDAR) that map every bump and crevice. So every time you see a video of a self-driving Google car, you need to understand that thousands of intricate preparations have been made beforehand, with the car's exact route extensively mapped. Data from multiple passes by a special sensor vehicle must later be pored over, meter by meter, by both computers and humans. And because only a laughably tiny number of roads in the U.S. have been analysed at this level, if you take a Google car off-campus it's essentially a very expensive paperweight.
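To make that dependence concrete, here's a rough sketch in Python. It's entirely my own simplification - the names like `PRIOR_MAP_COVERAGE` and `RouteSegment` are invented for illustration, and this is not anything from Google's actual software - but it captures the gating logic the mapping requirement implies: no prior map coverage, no autonomy.

```python
# Hypothetical sketch (not Google's code): autonomy is gated on prior-map
# coverage. If any part of the route lacks a pre-built, human-verified HD map,
# the car simply cannot drive itself there.

from dataclasses import dataclass

@dataclass(frozen=True)
class RouteSegment:
    road_id: str      # identifier for a stretch of road
    start_m: float    # start offset along the road, in metres
    end_m: float      # end offset along the road, in metres

# Illustrative stand-in for the painstakingly built prior map:
# road_id -> extent (in metres) that has been scanned, annotated and verified.
PRIOR_MAP_COVERAGE = {
    "mountain_view_campus_loop": 3200.0,
    "shoreline_blvd": 1800.0,
}

def can_drive_autonomously(route: list[RouteSegment]) -> bool:
    """Return True only if every segment of the route is inside the prior map."""
    for seg in route:
        mapped_extent = PRIOR_MAP_COVERAGE.get(seg.road_id)
        if mapped_extent is None or seg.end_m > mapped_extent:
            return False  # off the map: a very expensive paperweight
    return True

if __name__ == "__main__":
    on_campus = [RouteSegment("mountain_view_campus_loop", 0.0, 3000.0)]
    off_campus = [RouteSegment("some_random_road_in_ohio", 0.0, 500.0)]
    print(can_drive_autonomously(on_campus))   # True
    print(can_drive_autonomously(off_campus))  # False
```

The point isn't the code, it's the shape of the problem: the intelligence lives in the map-building pipeline, not in the car.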
Tied to the unpredictability part of that are three other factors - weather, construction, and humans. Drone cars are currently so stupid that they can't even drive in rain, let alone ice or snow. Google have so much trouble with rain in particular that they haven't even reached the stage of testing in it yet. The reason is simple: to a human, rain just makes things look wet. To a computer vision system, the same road might as well be the surface of Mars.
Remember how much work I said goes into mapping a road before a Google car can drive on it? Go and park a utility truck in the inside lane and put some cones out. Voilà. One incapacitated self-driving car. Because the truck and cones weren't there when the road was mapped, the car can't handle them. The same goes for potholes and open manhole covers - the car will just drive straight over (or into) them. Some changes can be handled, obviously. Google's cars look for things like stop signs - even in unexpected places - and react accordingly. At least that's the theory. They also look for pedestrians and other traffic all the time, so missing one stop sign shouldn't be a problem, right?
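Here's another toy sketch of why that's hard - again my own simplification with invented names, nothing to do with Google's real pipeline. The car can compare what its sensors see *now* against the static prior map; anything occupied now but free at mapping time (a parked utility truck, a line of cones) is an obstacle the carefully curated map gives it no help with.

```python
# Rough sketch (my own simplification, not Google's pipeline) of the
# map-versus-reality mismatch. Tiny occupancy grids: 1 = occupied, 0 = free.

PRIOR_MAP = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],  # kerb that was there when the road was mapped
]

LIVE_SCAN = [
    [0, 1, 1, 0],  # cones and a truck that were NOT there at mapping time
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]

def unexplained_obstacles(prior, live):
    """Cells occupied in the live scan but free in the prior map."""
    return [
        (row, col)
        for row, prior_row in enumerate(prior)
        for col, mapped in enumerate(prior_row)
        if live[row][col] == 1 and mapped == 0
    ]

print(unexplained_obstacles(PRIOR_MAP, LIVE_SCAN))  # [(0, 1), (0, 2)]
```

Flagging the mismatched cells is the easy part. Deciding what to actually do about them, with no human-curated map data to lean on, is exactly the unpredictability problem from earlier.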
If we welcome pesky humans into the equation, things change radically. As Google says, "Pedestrians are detected simply as moving, column-shaped blurs of pixels, meaning that the car wouldn't be able to spot a police officer at the side of the road frantically waving for traffic to stop". Similarly, it has trouble with pedestrians who don't use crosswalks (i.e. all of them) and people standing in the road where they shouldn't be.
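A toy illustration of what "column-shaped blurs of pixels" means in practice - this is my own assumption about how such a detector might be shaped, not Google's code: if a pedestrian is just anything tall, narrow and moving, then a police officer waving traffic to a stop and a jogger look exactly the same to the car.

```python
# Toy illustration (my assumption, not Google's code): a pedestrian is any
# tall, narrow, moving blob. Gestures, intent and context don't exist at this
# level - an officer waving "stop" and a jogger both reduce to "pedestrian".

from dataclasses import dataclass

@dataclass
class Blob:
    width_px: int
    height_px: int
    speed_px_per_frame: float

def looks_like_pedestrian(blob: Blob) -> bool:
    """Tall, narrow, moving => 'pedestrian'. Nothing more is inferred."""
    aspect_ratio = blob.height_px / max(blob.width_px, 1)
    return aspect_ratio > 2.0 and blob.speed_px_per_frame > 0.5

officer_waving_stop = Blob(width_px=40, height_px=120, speed_px_per_frame=1.0)
jogger = Blob(width_px=35, height_px=110, speed_px_per_frame=4.0)

# Both come back as plain "pedestrian"; the wave carries no meaning to the car.
print(looks_like_pedestrian(officer_waving_stop), looks_like_pedestrian(jogger))
```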
So whilst headline-grabbing mayors and confused mainstream newspaper editors may have everyone believing they'll be able to go out and buy a self-driving drone car this Christmas, it simply isn't the case, and probably won't be for another ten years.