What's keeping driverless cars off the road? Human drivers bending the rules

Traffic customs differ from area to area, and humans adjust relatively quickly to local idiosyncrasies. But autonomous vehicles are not so adaptable.

Photo: An Uber driverless car waits in traffic during a test drive in San Francisco. (Eric Risberg/AP/File)

In just a few years, well-mannered self-driving robotaxis will share the roads with reckless, law-breaking human drivers. The prospect is causing migraines for the people developing the robotaxis – and slowing their development.

A self-driving car would be programmed to drive at the speed limit. Humans routinely exceed it by 10 to 15 miles per hour – just try merging onto the New Jersey Turnpike at the posted limit. Self-driving cars wouldn’t dare cross a double yellow line; humans do it all the time. And then there are those odd local traffic customs to which humans quickly adapt.

In Los Angeles and other places, for instance, there’s the “California Stop,” where drivers roll through stop signs if no traffic is crossing. In Southwestern Pennsylvania, courteous drivers practice the “Pittsburgh Left,” where it’s customary to let one oncoming car turn left in front of them when a traffic light turns green. The same thing happens in Boston. During rush hours near Ann Arbor, Mich., drivers regularly cross a double-yellow line to queue up for a left turn onto a freeway.

“There’s an endless list of these cases where we as humans know the context, we know when to bend the rules and when to break the rules,” said Raj Rajkumar, a computer engineering professor at Carnegie Mellon University who leads the school’s autonomous car research.

Although autonomous cars are likely to carry passengers or cargo in limited areas during the next three to five years, experts say it will take many years before robotaxis can coexist with human-piloted vehicles on most side streets, boulevards and freeways. That’s because programmers have to figure out human behavior and local traffic idiosyncrasies. And teaching a car to use that knowledge will require massive amounts of data and big computing power that is prohibitively expensive at the moment.

“Driverless cars are very rule-based, and they don’t understand social graces,” said Missy Cummings, director of Duke University’s Humans and Autonomy Lab.

Driving customs and road conditions are dramatically different across the globe, with narrow, congested lanes in European cities, and anarchy in Beijing’s giant traffic jams. In India’s capital, New Delhi, luxury cars share poorly marked and congested lanes with bicycles, scooters, trucks, and even an occasional cow or elephant.

Then there is the problem of aggressive humans who make dangerous moves such as cutting cars off on freeways or turning left in front of oncoming traffic. In India, for example, even when lanes are marked, drivers swing from lane to lane without hesitation.

Already there have been isolated cases of human drivers pulling into the path of cars such as Teslas, knowing they will stop because they’re equipped with automatic emergency braking.

“It’s hard to program in human stupidity or someone who really tries to game the technology,” said John Hanson, spokesman for Toyota’s autonomous car unit.

Kathy Winter, vice president of automated driving solutions for Intel, is optimistic that the cars will be able to see and think like humans before 2030.

Cars with sensors for driver-assist systems already are gathering data about road signs, lane lines and human driver behavior. Winter hopes auto and tech companies developing autonomous systems and cars will contribute this information to a giant database.

Artificial intelligence developed by Intel and other companies eventually could access the data and make quick decisions much as humans do, Ms. Winter said.

Programmers are optimistic that someday the cars will be able to handle even Beijing’s traffic. But the cost could be high, and it might be a decade or more before Chinese regulators deem self-driving cars reliable enough for widespread public use, said John Zeng of LMC Automotive Consulting.

Intel’s Winter expects fully autonomous cars to collect, process and analyze four terabytes of data in 1-1/2 hours of driving – roughly the time the average person spends in a car each day. That’s equal to storing over 1.2 million photos or 2,000 hours of movies. Such computing power now costs over $100,000 per vehicle, Mr. Zeng said. But that cost could fall as more cars are built.
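For a rough sense of those numbers, here is a back-of-the-envelope check – a sketch that assumes about 3 megabytes per photo and 2 gigabytes per hour of video; the file sizes are assumptions for illustration, not Intel’s figures:

```python
# Back-of-the-envelope check of the data figures above.
# Assumed file sizes (not from Intel): ~3 MB per photo, ~2 GB per hour of video.
DATA_TB = 4                    # terabytes gathered in ~1.5 hours of driving
PHOTO_MB = 3                   # assumed average photo size
VIDEO_GB_PER_HOUR = 2          # assumed video storage rate

data_mb = DATA_TB * 1_000_000  # 4 TB expressed in megabytes (decimal units)
photos = data_mb / PHOTO_MB    # ~1.3 million photos, consistent with "over 1.2 million"
video_hours = (DATA_TB * 1_000) / VIDEO_GB_PER_HOUR  # ~2,000 hours of video

print(f"≈ {photos:,.0f} photos or {video_hours:,.0f} hours of video")
```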

Someday autonomous cars will have common sense programmed in, so they can cross a double-yellow line when warranted or speed up to find a gap when merging onto a freeway. Carnegie Mellon has taught its cars to handle the “Pittsburgh Left” by waiting a full second or longer for an intersection to clear before proceeding at a green light. Sensors also track crossing traffic and can figure out whether a driver is going to stop for a sign or red light. Eventually there will be vehicle-to-vehicle communication to avoid crashes.
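As a rough illustration of how a rule like that might be encoded – a minimal sketch with assumed function names and thresholds, not Carnegie Mellon’s actual software – a planner could hold at a fresh green until a short clearance delay has passed and sensors report no oncoming car committed to turning:

```python
import time

# Minimal sketch of a "Pittsburgh Left" accommodation rule.
# Illustrative only: the one-second delay and the sensor callback
# (oncoming_vehicle_turning) are assumptions, not CMU's code.

CLEARANCE_DELAY_S = 1.0  # wait at least this long after the light turns green

def proceed_on_green(light_turned_green_at, oncoming_vehicle_turning):
    """Return True when it is reasonable to enter the intersection.

    light_turned_green_at: timestamp (from time.monotonic()) when the signal turned green
    oncoming_vehicle_turning: callable returning True while sensors see an
        oncoming car committed to a left turn across our path
    """
    waited = time.monotonic() - light_turned_green_at
    if waited < CLEARANCE_DELAY_S:
        return False   # give an oncoming driver the customary first left
    if oncoming_vehicle_turning():
        return False   # a car is still crossing; keep yielding
    return True        # intersection clear; proceed
```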

Still, some skeptics say computerized cars will never be able to think exactly like humans.

“You’ll never be able to make up a person’s ability to perceive what’s the right move at the time, I don’t think,” said New Jersey State Police Sgt. Ed Long, who works in the traffic and public safety office.

Allen G. Breed in Raleigh, North Carolina; Joe McDonald in Beijing; Nirmala George in New Delhi; and Michael Liedtke in San Francisco contributed to this report.
