Can old-fashioned journalism combat fake news?
Google "What is fracking?" and one of the top results will be what-is-fracking.com, a cleanly designed website that explains that extracting natural gas through hydraulic fracturing does not contaminate drinking water, does not pollute the air to a significant degree, and helps raise wages in local communities.
What the site doesn't explain is who published it. The only hint is a copyright notice, in 10.5-point font at the bottom of each page, linking to "api.org."
"I don't care what you think of fracking," says journalist, lawyer, and entrepreneur Steven Brill. But, he says, "you should know that this website, which reads like The Economist, is owned and operated and published by the American Petroleum Institute."
Why We Wrote This
The prevalence of misinformation on the internet is legitimately troubling, but could attempts to remedy the problem fall prey to all-too-human biases?
Whether created by spammers, grifters, conspiracy theorists, or propagandists, sites that conceal or play down their ownership and financing, blend news with advertising, and routinely publish misinformation are widespread on the internet. And it's not always easy to distinguish these sites from the ones operated by those acting in good faith.
"There are so many sites now that it's hard to know which ones are credible and aren't credible," says Lisa Fazio, a psychologist at Vanderbilt University in Nashville who studies how people process information. "It takes a lot of effort and cognitive brainpower to really think through our prior knowledge on a topic, so we tend not to do that."
Fake news is, of course, nothing new, nor is the feeling that misinformation is prevailing. Outright lies and misleading narratives masquerading as facts have persisted since the early days of the printing press. But the ease of publication in the Digital Age has made it all the more difficult for readers to sort reported, objective journalism from the chaff of hoaxers and propagandists.
Recent years have seen attempts to draw political maps of the media landscape, but efforts to alert readers to bias are susceptible to internal biases as well. For instance, one news aggregator that presents news from across the political spectrum labels MSNBC, the television network that in 2003 canceled a talk show for being too liberal, as being on the far left. It places Newsmax, a conservative news site that in 2009 published a column describing how a military coup could be the "last resort to resolve the 'Obama problem,'" in an equivalent position on the right.
Other efforts to map media bias fail to capture the political stances of the publications they rate. For instance, the popular "Media Bias Chart" created by Ad Fontes Media places the liberal-leaning online news magazine Slate to the left of the unabashedly progressive TV and radio program Democracy Now!, a rating that is laughable to anyone familiar with both news outlets.
Ordinary people are, on average, good at identifying media bias, says Gordon Pennycook, a psychologist at the University of Regina in Saskatchewan, Canada. His research, published in the Proceedings of the National Academy of Sciences, found that non-experts across the political spectrum tended to rate mainstream news outlets as more trustworthy than low-quality or hyperpartisan sources.
"But," he says, "they aren't so good at determining the quality of mainstream sources."
Other efforts to rate the credibility of news outlets rely on machine learning. In 2016, Google gave more than $170,000 to three British firms to develop automated fact-checking software.
An old-school approach
To help people distinguish the genuine from the ersatz, Mr. Brill and former Wall Street Journal publisher Gordon Crovitz created NewsGuard, a company that has so far produced "nutrition labels" for 2,200 sites, which Brill says account for more than 96 percent of the online news content that Americans see and share. In January, Microsoft included NewsGuard's technology in its Edge mobile browser (users can turn it on in Settings). Desktop users running Chrome, Firefox, and other browsers can install NewsGuard as a plugin.
NewsGuard's methodology is a decidedly old-school approach to a new problem. Instead of using algorithms or other machine-learning tools, NewsGuard has paid dozens of journalists to dig into each site and to contact news organizations for comment. The nutrition labels, which detail each site's ownership, history, advertising policies, and editorial stance, can run more than a thousand words.
"When we started talking to tech companies about it, they were horrified at how inefficient it is," says Brill. "It's actually highly efficient and is the only way to achieve scale."
Users with the NewsGuard extension will see a badge appear on their browser toolbar and next to some hyperlinks: a green one with a checkmark for sites rated as credible, a red one with an exclamation point for those rated as not, and a yellow Thalia mask for satire sites like The Onion and ClickHole. Click on a badge, and you'll see how NewsGuard rates the site according to nine criteria, including objective measures like whether it clearly labels advertising or provides biographies or contact information for the writers, as well as more subjective ones like "gathers and presents information responsibly."
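The badge, in other words, collapses a multi-criteria checklist into a single color. Below is a minimal sketch of how such a mapping could work; the criteria names, weights, and 60-point threshold are invented for illustration and are not NewsGuard's published rubric.

```python
# Hypothetical sketch of a criteria-to-badge rating, for illustration only.
# The criteria, weights, and threshold are invented; they are not NewsGuard's rubric.

CRITERIA_WEIGHTS = {
    "does_not_repeatedly_publish_false_content": 22,
    "gathers_and_presents_information_responsibly": 18,
    "discloses_ownership_and_financing": 15,
    "clearly_labels_advertising": 15,
    "distinguishes_news_from_opinion": 10,
    "provides_author_bios_or_contact_info": 10,
    "corrects_errors": 10,
}  # weights sum to 100

def badge(site: dict) -> str:
    """Return a badge color from a dict of boolean criteria results."""
    if site.get("is_satire"):
        return "yellow"  # satire sites get their own label
    score = sum(weight for criterion, weight in CRITERIA_WEIGHTS.items() if site.get(criterion))
    return "green" if score >= 60 else "red"

# Example: a site that hides its ownership and blends news with advertising
example = {
    "does_not_repeatedly_publish_false_content": True,
    "gathers_and_presents_information_responsibly": False,
    "discloses_ownership_and_financing": False,
    "clearly_labels_advertising": False,
    "distinguishes_news_from_opinion": False,
    "provides_author_bios_or_contact_info": True,
    "corrects_errors": True,
}
print(badge(example))  # prints "red" (score of 42 falls below the invented threshold)
```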
NewsGuard awards full marks to mainstream news sites like The New York Times, CNN, and The Washington Post. (The Christian Science Monitor also gets top grades.) Far-right sites like Breitbart and InfoWars get failing grades. Not surprisingly, what-is-fracking.com also gets a red badge.
Human-powered, with human biases
NewsGuard's rating system occasionally produces results that have raised eyebrows. Al Jazeera, the Qatari state-funded news outlet credited with helping to spread the 2010-11 Arab Spring protests, gets a failing grade for not disclosing its ownership and for painting Qatar in a favorable light. One 30-year-old webzine generally held in high regard by tech journalists is also tagged as unreliable for blurring the lines between news, opinion, and advertising.
Because it's powered by human beings, NewsGuard can fall prey to the same human biases that afflict news organizations. For instance, NewsGuard's label for The New York Times includes a discussion of the 2003 Jayson Blair scandal and the discredited reporting in 1931 by Stalin apologist Walter Duranty, but it contains no mention of the paper's reporting before the US-led invasion of Iraq, in which the Times, as the paper itself later acknowledged, was insufficiently critical in accepting official claims about weapons of mass destruction.
When asked why the pre-invasion reporting was not mentioned on the label, Brill said, "It should be there; it will end up there."
Adam Johnson, an analyst for the nonprofit media watchdog Fairness and Accuracy in Reporting, says that NewsGuard fails to account for how mainstream news outlets can manufacture false narratives.
"If any other country used a fake-news plugin to flag false information," he continues, "we would call it what it is: censorship."
Brill acknowledges that NewsGuard's approach isn't a panacea. "We are not solving all the problems of the world," he says. "If we existed in the run-up to the Iraq war, you would not have seen a red mark" on the Times's reporting on WMDs.
But, he says, his company offers an improvement over how social networks like Facebook and news aggregators like Google News determine which news sites are credible. Those companies keep their process secret, they say, so that people won't be able to game their system.
"We love it when people game our system," says Brill. "We now have 466 examples of websites that have changed something about what they do in order to get a higher score."
"To me [NewsGuard] sounds very sensible," says Professor Pennycook. But, he says, "the people who are going to go out of their way to install this thing, they're not the people we're worrying about."
NewsGuard's labels may represent a less heavy-handed way of dealing with misinformation than what some Silicon Valley companies have proposed. In 2017, for instance, Eric Schmidt, the executive chairman of Alphabet, Google's parent company, said that it should be possible for Google's algorithms to detect misinformation and de-prioritize it on search-engine results pages, an approach that Mr. Johnson calls "creepy and dystopian."
"I don't really believe in de-prioritizing," says Brill, who says he would be uncomfortable licensing his technology to companies that would hide sites flagged as unreliable. "Our view is that people ought to have the chance to see everything."
Still, NewsGuard's ranking system, if widely adopted, would likely influence whether people choose to read or share certain stories. In a survey commissioned by NewsGuard, more than 60 percent of respondents said they were less likely to share stories from sites that were labeled as unreliable.
It's this binary approach to news that rankles Johnson. "People aren't children," he says. "They should be able to navigate information online without a US corporate, billionaire-funded report card telling them what's real or not."
Robert Matney, the director of communications for New Knowledge, an Austin-based cybersecurity company that the US Senate commissioned to investigate Russia's efforts to influence US politics, notes that the strength of companies like NewsGuard lies not necessarily in their ratings, but in the way they educate the public.
"Encouraging news/media literacy by enabling consumers to learn more about sources is a valuable service," he writes via email.