Is Facebook reinforcing your political bias?

Facebook disputes a former employee's claim that editors excluded conservative stories from a 'trending' feed, but the site reflects a broader trend of growing political polarization, researchers say.

Beck Diefenbach/Reuters/File
The sun rises behind the entrance sign to Facebook headquarters in Menlo Park, Calif., on May 18, 2012. The social network is facing accusations that it removed stories on conservative issues and figures from its "trending" news feed, a controversy that reflects an increasing trend toward political polarization, some observers say.

At the 2006 White House Correspondents' Dinner, comedian Stephen Colbert famously asserted that “reality has a well-known liberal bias.”

His claim was in jest, but a former Facebook employee’s contention that the site’s “news curators” routinely omitted popular conservative news from its “trending news” feed has reignited a long-running debate about online news, media bias, and what political scientists say is a trend toward increasing political polarization.

With the site increasingly serving as a primary news source for its 1 billion daily users, Facebook could significantly influence what is considered true in a US election year.

In a report published Monday by the tech site Gizmodo, the employee alleged that news stories featured on Facebook were selected from a small pool of trusted news sources, such as The New York Times, the BBC, and the Guardian, to the exclusion of others.

Facebook has denied the report, saying it doesn’t censor particular articles and enforces “rigorous guidelines” to bar reviewers from doing so.

“I don’t know where that’s coming from,” a Facebook spokesperson tells The Christian Science Monitor.

The company has faced questions about its influence on politics in the past – comments by chief executive Mark Zuckerberg aimed at Donald Trump led to speculation that the site would seek to influence the 2016 election, while a tweet from a Facebook board member that appeared to endorse colonialism in India became part of a movement to bar its Free Basics site from the country.

The allegations about the news curators – described by Gizmodo as a “small group of young journalists, primarily educated at Ivy League or private East Coast universities” – could further challenge the site’s longstanding claims of technological neutrality.

"Leaning Left"?

“I was really surprised,” says Jason Gainous, a professor of political science at the University of Louisville. “I hadn’t even thought about that possibility. I know their algorithm filters out based on user preferences but the idea that they’re actually filtering out their trending stories, this is not good news for them.”

If it is occurring, such filtering could alter the views of conservative users, some say.

“People tend to select information matching their political beliefs. If Facebook were systematically favoring one political perspective over another, then it would challenge this trend for those on one side of the political aisle,” writes Natalie Jomini Stroud, an associate professor of communication at the University of Texas at Austin who directs the Engaging News Project, in an e-mail to the Monitor.

The former Facebook news curator’s claim, which was contested by other curators interviewed by Gizmodo and The Guardian, sparked a firestorm of criticism from some conservative news sites. But the growing polarization of news consumption may not require help from social media. Instead, it may be an outgrowth of readers’ own selective habits, experts say.

With trust in government declining since its mid-1960s peak, along with belief in established information sources such as the news media, many Americans have become increasingly polarized in their political views and have self-selected into like-minded communities, says Bill Bishop, a journalist and author of “The Big Sort: Why the Clustering of Like-Minded America is Tearing Us Apart.”

Increasing dominance of online news

That “clustering” tendency may be further enabled by social networking sites, which continue to supplant broadcast news and newspapers as a central destination for news. But there are distinctions in how users seek out news on different platforms.

A study from the Pew Research Center found that more than half of users of both Facebook and Twitter used the platforms as a news source for events beyond their friends and family.

But while Twitter is seen primarily as a tool for keeping up with breaking news and following favorite outlets, reporters, and commentators, Facebook functions more as a forum. Its users were more likely to post and respond to content about government and politics.

Could trending news stories actually impact users’ political views? It’s still hard to tell.

“There is research suggesting that those selecting like-minded partisan media hold more polarized political views. It’s not clear to me whether the ‘Trending’ feature would have the same effect,” writes Stroud, the communication professor in Texas. "What may be more likely is that the ‘Trending’ feature influences what issues people believe are most important,” she says.

Gaming the news feed, or just personal preference?

Accusations of bias could be fueled by the fact that Facebook’s trending feed is shaped by human hands, if only lightly. The feed also differs in some ways from what users see on their personal news feeds, the Facebook spokesperson says.

Trending topics are generated from what users are talking about on the site, then “lightly curated” by Facebook’s review team, the spokesperson tells the Monitor.

“Popular topics are first surfaced by an algorithm, then audited by review team members to confirm that the topics are in fact trending news in the real world and not, for example, similar-sounding topics or misnomers,” writes Tom Stocky, Facebook’s vice president of search, in a post on the site on Monday.
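
Stocky’s description amounts to a two-stage pipeline: software nominates candidates, and people audit them. As a rough, hypothetical sketch only – Facebook has not published its method, and the function name, thresholds, and spike heuristic below are invented for illustration – a detector of that first stage might flag topics whose mention volume jumps against a recent baseline:

from collections import Counter

def surface_trending(current_posts, baseline_posts, min_mentions=50, spike_ratio=3.0):
    """Surface candidate topics whose mention volume has spiked versus a baseline window.

    Hypothetical illustration only; thresholds and data shapes are invented,
    not Facebook's actual implementation.
    """
    current = Counter(t for post in current_posts for t in post["topics"])
    baseline = Counter(t for post in baseline_posts for t in post["topics"])
    candidates = []
    for topic, count in current.items():
        # Smooth the baseline so brand-new topics don't divide by zero.
        base = baseline.get(topic, 0) + 1
        if count >= min_mentions and count / base >= spike_ratio:
            candidates.append((topic, count))
    # Highest-volume spikes first; per Stocky's description, a human review
    # team would audit this list before anything is shown as "trending."
    return sorted(candidates, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    now = [{"topics": ["election"]}] * 60 + [{"topics": ["weather"]}] * 10
    before = [{"topics": ["election"]}] * 5 + [{"topics": ["weather"]}] * 12
    print(surface_trending(now, before))  # -> [('election', 60)]

The point of the sketch is the division of labor the company describes: the algorithm only nominates candidates, and whether a topic actually appears still passes through human review.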

Mr. Stocky also disputes a contention that the news curators artificially “injected” stories into the trending feed, including adding stories about the #BlackLivesMatter civil rights movement when they were not trending.

“Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin and we've designed our tools to make that technically not feasible. At the same time, our reviewers' actions are logged and reviewed, and violating our guidelines is a fireable offense,” he writes.

Instead, Facebook has argued that the types of stories that people see on the site are based mostly on who users’ friends are and what they share, not the site’s algorithm.

Using data from more than 10 million users, researchers from the company found the site’s algorithm reduces so-called cross-cutting material – or content that runs counter to a user’s own political views – by slightly less than 1 percent. A user’s own “filter bubble” of friends, by contrast, reduces such content by about 4 percent.

By design, Facebook encourages users to self-select, with political views playing a key role, says Mr. Bishop.

“They’ve built a site that is profitable because it caters to people’s need to self-express and curate and refine their images and individual brands, and they do that within groups where they feel comfortable because everyone is like them. It’s the site for our time,” he says.

Additionally, some users are making conscious decisions to attempt to influence what types of content will appear in their own news feeds.

Several “folk theories” – including a “Narcissus Theory” holding that users see more from friends who resemble them, and a view that Facebook is all-powerful and unknowable – shaped how some users manipulated the site, says Karrie Karahalios, an associate professor of computer science at the University of Illinois at Urbana-Champaign.

Dr. Karahalios and several colleagues documented these folk theories in a recently published paper, after giving users access to an interface that disclosed “seams” offering hints about how Facebook’s algorithm works.

“We found that it got people thinking a little bit more, and it got them to try things on Facebook that they wouldn’t have thought of before. They had a bit more knowledge, and they had a tool set available to them that they could put into action in their news feed,” she says.

Editor's note: This article originally misstated the title of Jason Gainous at the University of Louisville.
