Google updates algorithm to filter out Holocaust denial and hate sites

Under increasing pressure over the spread of fake news, some social media and web search companies are moving to prioritize information integrity in their results.

[Photo: The Google logo is seen at the Google headquarters in Brussels. Virginia Mayo/AP/File]

Most internet users see Google as a simple portal to all information online. But recently, users searching for terms related to the Holocaust or ethnic minorities noticed a disturbing trend: top results would lead to hate-filled sites.

To correct this problem and stem the flood of misinformation reaching users of its popular search engine, Google has changed its search algorithms to prioritize high-quality information, demoting sites associated with racial hate speech and removing anti-Semitic auto-fill queries.
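Google has not published the details of these changes, but the two mechanisms described above can be illustrated with a minimal sketch: demoting results whose domains appear on a flag list, and screening auto-fill suggestions against a blocklist. The domains, scores, and penalty factor below are hypothetical, not Google's actual system.

```python
# Illustrative sketch only: Google's real ranking and autocomplete systems
# are proprietary. The flag list, scores, and penalty are hypothetical.

FLAGGED_DOMAINS = {"stormfront.org"}      # hypothetical demotion list
BLOCKED_SUGGESTIONS = {"are jews evil"}   # hypothetical autocomplete blocklist

def rerank(results, penalty=0.1):
    """Demote flagged domains by scaling their relevance score down."""
    adjusted = [
        (score * penalty if domain in FLAGGED_DOMAINS else score, domain, url)
        for score, domain, url in results
    ]
    return sorted(adjusted, reverse=True)

def filter_autofill(suggestions):
    """Drop auto-fill suggestions that match the blocklist."""
    return [s for s in suggestions if s.lower() not in BLOCKED_SUGGESTIONS]

results = [
    (0.95, "stormfront.org", "https://stormfront.org/holocaust"),
    (0.90, "ushmm.org", "https://www.ushmm.org/"),
]
print(rerank(results))  # ushmm.org now ranks first
print(filter_autofill(["are jews evil", "are jews a religion"]))
```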

Google has shown reluctance to change its algorithms in the past, preferring to prioritize whatever pages generated the most online sharing and discussion. But instead of providing objective results, Google's algorithms were being manipulated to amplify misinformation and hate speech, reported The Guardian's Carole Cadwalladr in early December. 

The changes to Google come after reports that one of the auto-fill suggestions to complete the search query "are Jews" included "are Jews evil?" Also, the top search for "did the Holocaust happen" linked to a page by Stormfront, an infamous white supremacist group, and searches related to various ethnic minorities would often bring up other sites espousing racist views.

"Judging which pages on the web best answer a query is a challenging problem and we don't always get it right," a Google spokesperson told Fortune. "We recently made improvements to our algorithm that will help surface more high quality, credible content on the web. We'll continue to change our algorithms over time in order to tackle these challenges."

While the Fortune article indicated that the algorithm had kicked in to replace the Stormfront result with a link to the United States Holocaust Memorial Museum, this reporter's search still found the white supremacist group in the top spot, indicating that the changes may not yet be universal.

The apparent increase in hate speech and the glut of fake news brought to national attention during the presidential election, in particular, have caused many to step back and soberly reevaluate the internet's role in shaping perceptions of reality. According to a Pew Research Center poll, four out of ten Americans now get news online, underscoring the influence such sites can wield. 

"Companies that control large segments of the internet, such as Google and Facebook, create 'filter bubbles' because of the algorithms used to present us with data tailored to our habits, beliefs, and identities," Melissa Zimdars, a professor of communications and media at Merrimack College, who has catalogued fake news sources, tells The Christian Science Monitor in an email.

"Our behaviors on the internet create a tremendous amount of data about us, and that data is used to tailor search results and our Facebook feeds based on what these companies perceive we want rather than what we may need," she explains.

Over thousands of interactions, this system encourages more sensational stories and websites to pop up in suggested feeds, regardless of their accuracy or origins. 
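That feedback loop can be sketched as a toy scoring function in which a user's own clicks compound an item's visibility. The weights, signals, and item names below are hypothetical assumptions for illustration, not how Facebook or Google actually rank content.

```python
# Toy model of an engagement-driven feed: items a user clicks get boosted,
# so sensational items that attract clicks keep rising regardless of
# accuracy. All weights and data are hypothetical.

from collections import defaultdict

click_counts = defaultdict(int)

def record_click(item_id):
    click_counts[item_id] += 1

def feed_score(item_id, base_relevance):
    """Relevance boosted by the user's own past clicks (the 'bubble')."""
    return base_relevance * (1 + click_counts[item_id])

# Repeated clicks on a sensational item amplify it far beyond an
# unclicked, accurate one.
for _ in range(5):
    record_click("sensational-story")

print(feed_score("sensational-story", 0.5))  # 3.0
print(feed_score("accurate-report", 0.6))    # 0.6
```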

For some, the idea of filtering out inaccurate top results smacks of censorship. But when thinking about what that means, it's important to remember that "most censorship and filtering – at least, in the US – is usually self-imposed," Nicholas Bowman, a professor of communication studies at West Virginia University, explains in an email to the Monitor. "Movie and TV ratings, for example, are set by industry groups, as was the old Comics Code Authority. Essentially, these forms of entertainment were threatened with government sanction and standards unless they themselves could find a way to self-regulate information, and those industries responded in kind."

"What does potentially become a problem, of course, is when those companies begin deciding what is and isn't appropriate, and those decisions are made arbitrarily – or at least, don't match up with larger public sentiment," he adds. 

Dr. Bowman suggests a system similar to Wikipedia's setup as a possible solution to maintaining informational integrity on the internet: a mixture of crowdsourced information like the popularity-driven system the sites currently employ, coupled with authentication from outside sources.
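A minimal sketch of the hybrid Bowman describes might blend a crowd-driven popularity signal with a bonus for sources authenticated by outside fact-checkers. The vetted list and weights below are illustrative assumptions, not an existing system.

```python
# Sketch of a hybrid ranking: crowd popularity moderated by outside
# authentication, as Bowman suggests. The vetted set and weights are
# hypothetical.

VETTED_SOURCES = {"ushmm.org", "apnews.com"}  # hypothetical verified list

def hybrid_score(domain, popularity, crowd_weight=0.5, vetted_bonus=0.5):
    """Blend crowd popularity with a bonus for authenticated sources."""
    authenticated = 1.0 if domain in VETTED_SOURCES else 0.0
    return crowd_weight * popularity + vetted_bonus * authenticated

# A vetted source outranks a more popular, unvetted one.
print(hybrid_score("ushmm.org", popularity=0.3))       # 0.65
print(hybrid_score("stormfront.org", popularity=0.9))  # 0.45
```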

Dr. Zimdars emphasizes that the solution to online hate requires transparency.

Google has "the power to stymie hate speech circulated by hate groups, but this means that all kinds of alternative ideas could be limited through tweaks to its algorithm," she says. "Overall, we need a lot more transparency about why we're seeing what we're seeing, and perhaps more importantly, more knowledge about what we're not seeing."

As information is shared across the internet, Zimdars says, it often becomes "cleansed," disconnected from its original source and normalized into mainstream conversation. This can be a problem, particularly when the hate at the core of racist or biased messages becomes assimilated into social media platforms through the "impartial" algorithms of Facebook, Twitter, and Google.

"Perhaps we tricked ourselves into thinking that there is no more hate, because social norms tended to govern conversations such that most people didn't share these thoughts face-to-face," says Bowman.

"They exist, and I'd say that they are not so much 'stronger than ever' as they are 'louder than ever.' "
