NASA thinks Tesla Autopilot is a bad idea

NASA has been studying the psychological effects of automation for decades, and thus may have something to teach Tesla.

[Photo: Beck Diefenbach/Reuters/File. The Tesla Model S version 7.0 software update containing Autopilot features is demonstrated during a Tesla event in Palo Alto, Calif.]

Since news broke last month that the driver of a Tesla Model S running on the Autopilot driver-assistance system died in a crash, the technology has inspired intense debate.

Regulators are investigating the crash to determine whether the fault lay with the human driver, with Autopilot itself, or with a combination of the two.

Now, one agency with a significant amount of relevant experience has weighed in.

NASA has been studying the psychological effects of automation for decades, and thus may have something to teach Tesla, notes Scientific American.

"News flash: cars in 2017 equal airplanes in 1983," Stephen Casner—a research psychologist at NASA's Human Systems Integration Division—told the magazine.

For the public at large, the name "Autopilot" seems to imply a similarity to the automated systems that help fly planes, although the capabilities of the Tesla system are much more limited.

Tesla Autopilot is more akin to the bundles of driver-assistance features offered by other carmakers than to a true semi-autonomous system.

Even if Autopilot had greater capability, NASA's Casner highlighted a crucial difference between the operation of cars and airplanes that makes its use much riskier.

An autopilot system temporarily takes the human operator out of the loop of control, and the transition back to human control cannot happen instantaneously.

But because airplanes fly several miles up in the sky, pilots typically have a minute or more to transition from autopilot back to manual control.

That's not the case with cars, where drivers may have 1 second or less to react to an emergency situation.

Humans also have trouble paying attention when automated systems are running, NASA has found.

It is difficult for humans to monitor repetitive processes for a long time, a phenomenon known as the "vigilance decrement."

In other words, the more competent an automated system, the more likely the driver is to zone out.

That may be even more likely in automated cars, as the technology is often pitched to consumers—explicitly or implicitly—as a convenience to let them use time normally spent concentrating on the road for other tasks.

Tesla seems determined to use Autopilot as the foundation for fully autonomous cars, but a wide gulf remains today between those two technologies.

Automated systems that operate in predetermined ways under specific circumstances are not the same as autonomous systems that mimic the flexibility and decision-making processes of human operators.

Since launching Autopilot, Tesla has said that the system is in the "public beta" stage, implying that it is still undergoing testing and development.

That message has been reiterated in the weeks following the fatal crash.

But if NASA's experience with automation is any indication, rolling out such technology in cars gradually, feature by feature, may not be advisable.

If human nature can't reliably handle the ambiguity of a partially automated car, it becomes harder to weigh the risks and rewards of the piecemeal approach taken by Tesla and others.
