Apple's Tim Cook joins the fake news war

Apple CEO Tim Cook suggests that tech companies share the responsibility of fighting fake news, but will people trust them? 

Bernd Thissen/dpa/AP
Apple CEO Tim Cook looks on as he visits the production hall of shop fittings company Dula in Germany.

Fake news. Even President Trump and CNN agree it's a problem, although they might not see eye to eye on how to define it, let alone fix it.

Perceived media inaccuracy and media distrust are swirling into a perfect storm that threatens to undermine the public trust and sow seeds of doubt in the very notion of fact itself.

Apple's chief executive officer Tim Cook recently spoke out against fake news, perhaps signaling Apple's intention to enter the fray alongside the likes of Facebook and the governments of Ukraine and Germany. But can a lack of faith in online information be restored with a top-down approach?

If technology and government oversight represent two fronts in the war on fake news, education is rapidly opening up a third front. A push to teach people to separate fake from real, as well as opinion from fact, might sidestep thorny issues of censorship and institutional distrust.

Fake news became news itself in the aftermath of the US election. At times, fake headlines drew more engagement than real ones. Despite one study finding such stories to be less persuasive than a standard TV spot at changing opinions, many worried about their potential impact on the election, since the same study found that the average American saw and remembered 0.92 pro-Trump fake stories, but only 0.23 pro-Clinton ones.

While fake news skewed right during the election, it's a bipartisan problem. Craig Silverman of Buzzfeed News, who has studied media inaccuracy for years, explained to NPR why we're all susceptible:

Fake news appeals to what we feel and what we already believe. It makes us feel good to get information that aligns with what we already believe or what we want to hear.

And on the other side of that is when we're confronted with information that contradicts what we think and what we feel, the reaction isn't to kind of sit back and consider it. The reaction is often to double down on our existing beliefs.

He then goes on to explain the role social media, especially Facebook, plays in amplifying those stories. Behind the scenes, algorithms are always working to better understand what content we want to engage with, and Facebook delivers us its best guesses. The more we click on fake news stories that play on our emotions and partisanship, the more Facebook will show us.

That vicious cycle is why many first looked to technology companies for a solution. Facebook, for example, has already fired its opening salvo with a feature that lets community members flag suspect stories for third-party fact checking via websites like Snopes.

Some wonder if a Facebook-endorsed fake news label might become a badge of honor for political fringe sites, but at the very least it disqualifies those stories from receiving ad revenue, a common motivation for spammers who get paid per click.

In Tim Cook's comments, he agreed that companies like Facebook and Apple have an important role to play, but he also highlighted a key challenge in implementing such features.

“All of us technology companies need to create some tools that help diminish the volume of fake news. We must try to squeeze this without stepping on freedom of speech and of the press, but we must also help the reader.”

Any group interested in stopping fake news quickly finds itself in the sticky position of declaring itself an arbiter of truth.

Free speech activists are already sounding the alarm that European governments may be going too far, with discussion of fines or even prison terms for spreading fake news in Italy and Germany. Even if regulators aim these tools only at fraudulent websites now, future lawmakers may have other ideas. Imagine if Trump could shut down CNN with an executive order.

In addition to the top-down approaches embraced by Facebook and European lawmakers, a new bottom-up educational strategy is gaining popularity, and was recently tested on the target of Russia's first major misinformation campaign: Ukraine.

In response to an influx of Russian "news" surrounding the annexation of Crimea, the Ukrainian government started with outright censorship of Russian sources. Then last year, Canada partnered with nonprofit IREX to organize workshops that taught more than 15,000 Ukrainians how to spot fake news and do their own fact checking.

Preliminary results spark hope. An education start-up aimed at improving literacy through news has taken on a secondary goal of teaching readers to think critically, by asking questions about where articles get their facts, or what their bias is.

Research suggests that these kinds of awareness-raising questions may be effective, finding that a disclaimer reminding readers an issue is controversial can help people to read more critically, and better resist misinformation.

Entrenched partisanship makes it difficult to sort news organizations into black-and-white categories such as “fake” or “trustworthy,” but the notion of teaching people to make that distinction for themselves could be easier to swallow for those on both the left and the right.

Cook agrees that this third front is essential.

“It has to be ingrained in the schools, it has to be ingrained in the public. There has to be a massive campaign. We have to think through every demographic... It's almost as if a new course is required for the modern kid, for the digital kid.”
