The Filter Bubble: What the Internet is Hiding from You
Google knows that my wife and I are expecting our first child. Our recent search history goes something like this: “things you need to buy for your first baby”; “why do my fingers get fat when I’m pregnant?”; and “is it worth buying a diaper bin?”
I noticed that in my in-box I was getting lots of baby-related emails; the ads showing up in my Gmail account mostly pertained to infants. No matter how many different ways we searched, all avenues seemed to lead back to the same products. It was as if the Web knew what we wanted.
This is what Eli Pariser, in a fascinating new book about the increasingly personalized Internet, calls The Filter Bubble. Search engines weight our search results to our own preferences. (My search results won’t look like yours.) Sites will filter our news (without asking us) to bring us what they think we want.
Pariser, a former executive director of the advocacy group MoveOn, pulls back the curtain on the dark arts of search and Internet advertising. There is “behavioral retargeting,” which means that you might check out a pair of shoes in an online store and leave without making a purchase, only then to find their ads following you around the Internet. Or advertising based on your “persuasion profile,” which isn’t just concerned with the types of products you like but “which kinds of arguments might cause you to choose one over another.”
With 36 percent of Americans under 30 getting their news through social-networking sites, personalization also affects the news we consume. Ever wonder why you don’t see updates from some Facebook friends in your News Feed? It’s due to an algorithm, partly based on the amount of time you spend interacting with that person.
The consequence of this social engineering, Pariser argues, is that we interact more with people who think like we do. Rather than fulfilling the early Internet dreams of diversity and freedom of choice, we are living in an echo chamber. As a result, there’s less room for “the chance encounters that bring insight and learning.” Where once we had human news editors who would temper the Britney coverage with a foreign war or two, now algorithms select our news for us based on what we click on and what we share.
The idea that the Web is an echo chamber is almost as old as the Web itself. But there still isn’t much empirical evidence to suggest that the Internet is narrowing our collective horizons. A new Pew report, “Social Networking Sites and Our Lives,” found that there is no relationship “between the use of social networking services and the diversity of people’s overall social networks.” Nor were Internet users less likely to consider both sides of an issue.
While not exactly a techno-pessimist, Pariser falls into the techno-pessimist’s trap of the Imagined Analogue Past. It is a rose-colored world that always forms the backdrop to books about the effects of the Internet: a world without digital distractions, with enlightening serendipitous encounters, where civic-minded news producers made sure we saw reports about famine in distant lands. In the Imagined Analogue Past we all had meaningful offline friendships, devoid of any superficiality.
But of course we never really lived like that. If our worlds are echo chambers now, what were they before, when every day we read the same newspaper, with its inherent biases in politics and scope? If the Internet is an echo chamber, what about the churches or progressive book clubs we attend? If you do live in an Internet echo chamber then that’s probably of your own making; in the pre-digital world you would have lived in one too. And anyone who has never experienced serendipity on the Internet has never been on YouTube.
Where Pariser’s book is most effective is in deconstructing the myth of “disintermediation” – the idea, popular among techno-utopians, that the Internet would “flatten society, unseat the elites, and usher in a kind of global utopia,” where we would no longer need gatekeepers such as newspapers, cable television, or even politicians. Pariser eloquently makes the case that we might have gotten rid of a few gatekeepers, but we’ve just replaced them with new ones (namely Facebook and Google).
“The Filter Bubble” is less clear, though, about what we should do about it. One of Pariser’s proposals – and it’s a good one – is for tech companies to make their filtering practices less opaque and be more up front about the way in which they are collecting and using our information. The author goes further, suggesting “filtering systems to expose people to topics outside their normal experience.” But is such social and civic engineering really the job of businesses like Google or Facebook? Paternalism aside, there is an irony in engineering more randomness. Google’s “I’m Feeling Lucky,” after all, isn’t based on luck.
Pariser writes beautifully about the new digital world in which we find ourselves but, ultimately, he doesn’t show us a future that seems to be any bleaker than the past.
Luke Allnutt is a Monitor contributor.