Is Facebook reinforcing your political bias?
At the 2006 White House Correspondents' Dinner, host Stephen Colbert famously asserted that "reality has a well-known liberal bias."
His claim was in jest, but a former Facebook employee's contention that the site's "news curators" routinely omitted popular conservative news from its "trending news" feed has reignited a long-running debate about online news, media bias, and what political scientists say is a trend toward increasing political polarization.
As an increasingly primary news source for its 1 billion daily users, Facebook could be a significant influence on what is considered true in a US election year.
In a report published by Gizmodo, the employee alleged that news stories featured on Facebook were selected from a small pool of trusted news sources, such as The New York Times, the BBC, or the Guardian.
Facebook has denied the report, saying it doesn't censor particular articles and enforces "rigorous guidelines" to bar reviewers from doing so.
"I don't know where that's coming from," a Facebook spokesperson tells the Monitor.
The company has faced questions about its influence on politics in the past: comments by chief executive Mark Zuckerberg aimed at Donald Trump led to speculation that the site could try to sway the election, while a tweet from a Facebook board member that appeared to endorse colonialism in India became part of a movement to bar its Free Basics service from the country.
The allegations about the news curators, whom Gizmodo described as a "small group of young journalists, primarily educated at Ivy League or private East Coast universities," could further challenge the site's longstanding claims of technological neutrality.
"Leaning Left"?
"I was really surprised," says Jason Gainous, a professor of political science at the University of Louisville. "I hadn't even thought about that possibility. I know their algorithm filters out based on user preferences, but the idea that they're actually filtering out their trending stories, this is not good news for them."
If it is occurring, such filtering could alter the views of conservative users, some say.
"People tend to select information matching their political beliefs. If Facebook were systematically favoring one political perspective over another, then it would challenge this trend for those on one side of the political aisle," writes Natalie Jomini Stroud, an associate professor of communication at the University of Texas at Austin, in an e-mail to the Monitor.
The former Facebook news curator's claim, which was contested by other curators interviewed by Gizmodo, sparked a firestorm of criticism. But the growing polarization of our news consumption may not require help from social media. Instead, it may be an outgrowth of the manner in which we consume our news, experts say.
With declining trust in government and in established information sources, including the news media, many Americans have become increasingly polarized in their political views and have self-selected into like-minded communities, says Bill Bishop, a journalist and author of "The Big Sort: Why the Clustering of Like-Minded America Is Tearing Us Apart."
Increasing dominance of online news
That "clustering" tendency may be further enabled by social networking sites, which聽 as a key central destination for news. But there are some distinctions in how users seek out news online on different platforms.
One survey found that more than half of the users of both Facebook and Twitter used the platforms as a news source for events beyond their friends and family.
But while Twitter is seen primarily as a tool for keeping up with breaking news and following users' favorite outlets, reporters, and commentators, Facebook functions more as a forum: its users were more likely to post and respond to content about government and politics.
Could trending news stories actually impact users' political views? It's still hard to tell.
"There is research suggesting that those selecting like-minded partisan media hold more polarized political views. It's not clear to me whether the 'Trending' feature would have the same effect," writes Stroud, the communication professor in Texas. "What may be more likely is that the 'Trending' feature influences what issues people believe are most important."
Gaming the news feed, or just personal preference?
Accusations of bias could be worsened by the fact that Facebook's news feeds are only lightly tailored. The trending feed also differs in some ways from what users see in their personal news feeds, the Facebook spokesperson says.
Trending topics are generated from what users are talking about on the site, then "lightly curated" by Facebook's review team, the company's spokesperson tells the Monitor.
"Popular topics are first surfaced by an algorithm, then audited by review team members to confirm that the topics are in fact trending news in the real world and not, for example, similar-sounding topics or misnomers," wrote Tom Stocky, Facebook's vice president of search, on Monday.
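Stocky's description amounts to a two-stage pipeline: an algorithm proposes candidate topics, and human reviewers confirm them. A minimal sketch of that idea follows; every function name, data structure, and example topic here is a hypothetical illustration, not Facebook's actual system.

```python
# Hypothetical sketch of a "surface then audit" trending pipeline,
# loosely modeled on Stocky's two-stage description.
from collections import Counter

def surface_candidates(posts, top_n=10):
    """Stage 1 (algorithmic): rank topics by how many posts mention them."""
    counts = Counter(topic for post in posts for topic in post["topics"])
    return [topic for topic, _ in counts.most_common(top_n)]

def audit(candidates, confirmed_events):
    """Stage 2 (human review): keep only topics a reviewer confirms
    correspond to real-world news, dropping misnomers and look-alikes."""
    return [t for t in candidates if t in confirmed_events]

posts = [
    {"topics": ["election"]},
    {"topics": ["election", "debate"]},
    {"topics": ["celebrity-hoax"]},
    {"topics": ["debate"]},
]
trending = audit(surface_candidates(posts), confirmed_events={"election", "debate"})
print(trending)  # ['election', 'debate'] -- 'celebrity-hoax' fails the audit
```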
Mr. Stocky also disputed a contention that the news curators artificially "injected" stories into the trending feed, including adding stories about the #BlackLivesMatter movement when they were not trending.
"Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin, and we've designed our tools to make that technically not feasible. At the same time, our reviewers' actions are logged and reviewed, and violating our guidelines is a fireable offense," he writes.
Instead, Facebook researchers have argued that the stories people see on the site are shaped mostly by who users' friends are and what they share, not by the site's algorithm.
Using data from more than 10 million users, researchers from the company found the site's algorithm reduces so-called cross-cutting material, or content that runs counter to a user's own political views, by slightly less than 1 percent. A user's own "filter bubble" of friends, by contrast, reduces such content by about 4 percent.
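To see what those percentages mean in practice, here is a back-of-the-envelope sketch. The baseline share of cross-cutting stories is a made-up number chosen purely for illustration; only the two reduction figures come from the study as reported.

```python
# Illustration of the study's comparison with a hypothetical baseline.
baseline_share = 0.40        # hypothetical: 40% of available stories are cross-cutting
friend_reduction = 0.04      # ~4% relative reduction from who your friends are
algorithm_reduction = 0.01   # just under 1% relative reduction from the ranking algorithm

after_friends = baseline_share * (1 - friend_reduction)
after_algorithm = after_friends * (1 - algorithm_reduction)

print(f"after friend selection:   {after_friends:.3f}")    # 0.384
print(f"after algorithmic ranking: {after_algorithm:.3f}")  # 0.380
```

Under these assumed numbers, the friend-driven drop (0.400 to 0.384) dwarfs the algorithmic one (0.384 to 0.380), which is the researchers' point.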
By design, Facebook encourages users to cluster into like-minded groups, with political views playing a key role, says Mr. Bishop.
"They've built a site that is profitable because it caters to people's need to self-express and curate and refine their images and individual brands, and they do that within groups where they feel comfortable because everyone is like them. It's the site for our time," he says.
Additionally, some users are consciously trying to influence what types of content will appear in their own news feeds.
Several "folk theories" shaped how some users manipulated the site, says Karrie Karahalios, an associate professor of computer science at the University of Illinois at Urbana-Champaign. These include a "Narcissus Theory," which holds that users will see more from friends similar to them, and a view that Facebook is all-powerful and unknowable.
Dr. Karahalios and several colleagues collected these folk theories in a study that gave users access to an interface disclosing "seams," hints into how Facebook's algorithm works.
"We found that it got people thinking a little bit more, and it got them to try things on Facebook that they wouldn't have thought of before. They had a bit more knowledge, and they had a tool set available to them that let them put action into their news feed," she says.
Editor's note: This article originally misstated the title of Jason Gainous at the University of Louisville.