Is Facebook to blame for making us more polarized? No, we are.
Critics have worried that the algorithm Facebook uses to determine what users see could be creating 'bubbles' that allow us to see only what we agree with. A new study finds that users are driving the trend more than Facebook itself.
A quick Google search for the social-media giant Facebook turns up a range of provocative questions: Is Facebook making us lonely? Is Facebook losing its cool? Is Facebook dying?
Scientists at Facebook have added another: Is Facebook reinforcing ideological bubbles that users build around themselves?
Their short answer is: yes. But the effect is small compared with the contributions users themselves make. Users build those bubbles through their choice of "friends," what those friends share, and how rarely they open links to news or opinion material that runs counter to their own views.
On one level, the results, published Thursday in the online journal Science Express, suggest that for now, social media and their complex, user-focused algorithms aren't to blame for the nation's growing political polarization.
That polarization is a trend many political and information scientists see as a threat to a well-oiled democracy, which relies on people with competing ideologies working together toward shared goals. The study reinforces the observation that people are bringing to the virtual world their real-world tendencies to surround themselves with people who think like they do.
On another level, however, the small internal effect the researchers detected from Facebook's algorithm should raise warning flags, says David Lazer, a political scientist at Northeastern University who focuses in part on the impact of the internet on politics and was not a member of the study team.
"There's nothing in the algorithm that says: Let's polarize America," he says. But "the simple rules that might make content more engaging may also result in this kind of bubble."
He notes that Facebook recently tweaked its algorithm, in part to ensure that users see more material from the people they identify as close friends.
"Close friends are probably more similar to you in many ways than your distant acquaintances. So it's quite plausible that the change will have the unintended consequence" of further narrowing the range of perspectives that enter a user's news feed, he says.
The new study grew out of surprising results in previous work, which looked at how users got their information on Facebook, says Eytan Bakshy, a data scientist at Facebook and the study's lead author. The earlier study found that on average, the less frequently you interact with a Facebook friend, the more likely you are to share items that come from that friend.
"To our surprise we found that the majority of information that you click on and you end up re-sharing comes from weaker ties," people with whom you interact relatively rarely, Dr. Bakshy says. These people "have the potential to be more dissimilar to you."
That raised a question: What does this imply for the notion of social media as an echo chamber in which people surround themselves only with people who think like they do?
Others have tried to tackle that question, with conflicting results – often in no small part because the sample sizes in the study groups were relatively small.
Bakshy and colleagues tapped data and activity for some 10.1 million Facebook users in the United States, using protocols that ensured their anonymity. These people had listed a political affiliation in their profiles. In addition, the team focused on shared content they dubbed hard news or opinion – politics, US news in general, and international news. No cats or children's birthday parties. Ideology of the source was based on the organization tied to a web link, rather than the content of specific articles.
When the researchers parsed the data, they found that on average, 23 percent of a user's friends are people whose politics are "from the other side." Despite the heavy tilt in friends toward "like me," just under 30 percent of the incoming news represented the other side's perspective – so-called cross-cutting material.
Overall, the algorithm organizing what a user is most likely to see reduces cross-cutting content by slightly less than 1 percent, while a user's self-built bubble reduces that content by about 4 percent.
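As a rough back-of-the-envelope sketch, the two effects described above can be stacked. The figures below are the article's approximate percentages, not the study's exact values:

```python
# Illustrative arithmetic only, using the approximate figures reported in the
# article: start from the share of incoming "hard" content that is
# cross-cutting, then apply the two reductions in sequence.

baseline_cross_cutting = 0.30      # just under 30% of incoming news is cross-cutting
algorithm_reduction = 0.01         # Facebook's ranking algorithm: slightly under 1%
self_selection_reduction = 0.04    # the user's own clicking choices: about 4%

after_algorithm = baseline_cross_cutting * (1 - algorithm_reduction)
after_choices = after_algorithm * (1 - self_selection_reduction)

print(f"after algorithm ranking: {after_algorithm:.3f}")
print(f"after user choices:      {after_choices:.3f}")
```

Whatever the exact baseline, the point the sketch makes is the one the researchers report: a user's own choices trim cross-cutting content roughly four times as much as the ranking algorithm does.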
Given the relatively small influence of the algorithm, the results "are not all that different from a lot of what we know about how people are acting across ideological and party lines in the real world," says Patrick Miller, a political scientist at the University of Kansas at Lawrence who also studies the interplay between social media and politics.
In many ways, a "don't shoot me, I'm just the piano player" sensibility about the study is justified, he suggests. A vast amount of social-science research has made it "very clear that when people are building their online social networks, they're building them to reflect their offline social networks."
And offline, people live in partisan bubbles in a country that has become increasingly polarized, he adds.
But that doesn't let Facebook off the hook as the algorithm's designer, others caution.
"Selectivity has always existed. But now we're living in a different world," says Dietram Scheufele, who specializes in science communication at the University of Wisconsin at Madison. Facebook "is enabling levels of selectivity that have never been possible before."
For instance, he says, research has shown that two people with identical friends will get different news feeds based on which pictures they clicked on, which posts from those friends they "liked," or even something as unrelated to friends as the websites they used Facebook to log into.
Although people have always built ideological bubbles, "that doesn't mean we have to make it worse" online, he says.
Yet it's also true that people would be overwhelmed by posts if some sort of sifting weren't done ahead of time, Northeastern's Professor Lazer acknowledges.
Perhaps the study's biggest contribution is to provoke recognition of how much of the information gathered about people is archived and used for everything from organizing and presenting Facebook news items to setting different prices on e-commerce sites based on what is known about the purchaser.
A lot of the algorithms that focus choices based on personal profiles "are done for our convenience, but some of it, frankly, is to exploit us," he says.
"I'm not saying we need to go back to the pre-internet age," he says. But in "Matrix"-like fashion, the line between the real and virtual worlds is blurring, he adds.
"We have to think about what is good and what is bad. This study doesn't answer that question, but it does provoke the question. We're really behind where we should be in terms of debating these things as a society," he says.