Facebook's secret experiment on users had a touch of 'Inception'

Facebook secretly tweaked some users' news feeds as part of an experiment on 'emotional contagion.' The results were fascinating, but they raise questions about online ethics.

A man is silhouetted against a video screen with a Facebook logo as he poses with a laptop in this photo illustration taken August 14, 2013. (Dado Ruvic/Reuters/File)

In the popular 2010 sci-fi movie “Inception,” a crew of tech-savvy geniuses sneaks into a person’s subconscious dreamscape, “incepts” an action-inducing emotion, and then watches as that person makes a wide-awake “free choice” based on the power of that surreptitiously implanted suggestion.

Like fictional tales from “Frankenstein” to “The Matrix,” the film tapped the kind of cultural fears often evoked by the thrilling power of human technology, and the creepy unintended consequences it might produce.

So when Facebook revealed last week that it had conducted a secret experiment with its users’ emotions, gauging the “emotional contagion” of its algorithmic, personally tailored news feed, a chorus of critics cried foul, saying the social media behemoth was messing with people’s minds without their knowledge.

For a week in 2012, a trio of scientists was allowed to tinker with the Facebook algorithms of nearly 700,000 users, a new study has revealed, measuring whether a mostly positive or mostly negative news feed could influence the emotional tenor of the users’ own posts.

The data suggested that it did. “We show, via a massive ... experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” wrote the study’s authors, scientists from Cornell University, the University of California at San Francisco, and Facebook’s own Core Data Science Team.

It’s another digital age jolt to the modern psyche, sparking a host of questions about privacy and the ethics of online behavior.

As news of the study broke, the Twitterverse was awash with terms such as “super disturbing,” “creepy,” and “evil” to describe the experiment on unsuspecting users. Many also began to debate the ethics of conducting such an experiment at all, asking: How did research institutions come to approve this secret “manipulation” of user emotions?

But at an even deeper level, ethicists have begun to wonder about the moral landscape of the increasingly ubiquitous online world.

The concept of “emotional contagion,” with its virus-like connotations, has been studied for decades by scientists, who observe how the emotions and behaviors of human beings naturally, and even unconsciously, get in sync with those around them. It’s an important natural process for any social creature, behavioral theorists say.

“What I think is startling about this is that it calls into question our individualist assumptions about ethics,” says Evan Selinger, a fellow at the Institute for Ethics and Emerging Technologies in Hartford, Conn. “So there may be no intention on our part to influence other people, but what this kind of behavioral research has shown is that we’re doing so all the time.”

To be clear, Facebook didn’t put anything extra into users’ news feeds. The researchers simply reduced “negative” posts for one sample of users – omitting between 10 and 90 percent of posts containing keywords they deemed a bummer – while similarly reducing “positive” posts with keywords they deemed more uplifting for another sample.
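For readers curious what that filtering step looks like in practice, here is a minimal sketch in Python. It is an illustration of the mechanism as described above, not Facebook’s actual code: the names (Post, filter_feed, NEGATIVE_WORDS) and the tiny word list are assumptions standing in for the study’s much larger internal lexicon.

```python
import random
from dataclasses import dataclass

# Stand-in word list; the real study used a far larger lexicon of
# negative (and, for the other sample, positive) terms.
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

@dataclass
class Post:
    text: str

def filter_feed(posts: list[Post], omit_rate: float) -> list[Post]:
    """Withhold each 'negative' post with probability omit_rate.

    The study assigned each user an omission rate between 10 and 90
    percent; posts without the targeted keywords pass through untouched.
    """
    kept = []
    for post in posts:
        words = set(post.text.lower().split())
        if words & NEGATIVE_WORDS and random.random() < omit_rate:
            continue  # this emotional post never reaches the user's feed
        kept.append(post)
    return kept

# Example: on average, half the negative posts vanish for this user.
feed = [Post("what an awful day"), Post("lunch was great"), Post("so sad today")]
print([p.text for p in filter_feed(feed, omit_rate=0.5)])
```

The key design point, per the study’s description, is that nothing is injected; the feed is only thinned, so every post a user sees is one a friend genuinely wrote.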

“We provide experimental evidence that emotional contagion occurs without direct interaction between people,” the study concluded, showing that users in each sample began posting more positive or negative posts respectively, based on the content of their news feed.

“So if you start posting in a certain way, does that affect how other people behave?” says Dr. Selinger, a professor of philosophy at the Rochester Institute of Technology in Rochester, N.Y. “That to me is an interesting ethical question, sort of my brother’s keeper sort of thing.”

It also speaks to the emotion- and behavior-shaping power the Facebook news feed may wield: it is viewed by some 130 million Americans who sign on to Facebook each day.

As any retailer knows, the presence of happy, smiling faces is vital in creating the positive vibes that might convince a customer to buy, and then to come back and buy again.

But this becomes a powerful social force in the online era, critics contend. Of course, every social media user agrees to give up a measure of privacy to participate in the digital age. Tech companies’ algorithms already monitor our e-mails, browsing habits, and even physical movements as they construct digital profiles for targeted marketing efforts.

But should there be any moral or ethical constraints on the decisions that fill a news feed?

Earlier this month, before the study came out, Bianca Bosker, senior tech editor at The Huffington Post, asked a similar moral question in her post, “Should Online Ads Really Offer Binge Drinkers A Booze Discount?”

“Will online advertisements adopt a moral code?” she wrote. “As they get more insight into our peccadillos, weak spots, indulgences and addictions, should the Facebooks and Googles of the world limit marketers from pushing products that make us behave badly or cause harm? And who’d decide what 'bad' looks like?”

Given this power to shape emotion and behavior, society has to come to grips with its moral implications, ethicists say.

“Everybody wants to tap into that power, to move collective action in certain ways,” says Selinger. “I think our theories of collective action, and our ethics of collective action, might not have caught up with some of these behavior insights.”
