Biblio File

Are You Getting the Whole Story? Beware the Filter Bubble

In the 1990s, libraries put significant effort into building web guides: curated and vetted lists of web resources. Ten years later, web guides went out of fashion and patrons were referred to Google, whose algorithms had grown so sophisticated that vetting the web seemed unnecessary. Fifteen years later, those trusted algorithms have learned so much about our personal preferences that they may be getting in the way of our ability to see differing views on critical problems in society. In this particularly polarizing election cycle, at a time when social networks mingle so closely with news reporting, we need to be mindful of what Eli Pariser termed “the filter bubble.”

The Filter Bubble: What the Internet Is Hiding from You by Eli Pariser

In a recent article in The Guardian, Katharine Viner describes an alarming consequence social media has had on reporting.

"Algorithms such as the one that powers Facebook’s news feed are designed to give us more of what they think we want – which means that the version of the world we encounter every day in our own personal stream has been invisibly curated to reinforce our pre-existing beliefs. When Eli Pariser, the co-founder of Upworthy, coined the term “filter bubble” in 2011, he was talking about how the personalised web – and in particular Google’s personalised search function, which means that no two people’s Google searches are the same – means that we are less likely to be exposed to information that challenges us or broadens our worldview, and less likely to encounter facts that disprove false information that others have shared.

"Pariser’s plea, at the time, was that those running social media platforms should ensure that “their algorithms prioritise countervailing views and news that’s important, not just the stuff that’s most popular or most self-validating”. But in less than five years, thanks to the incredible power of a few social platforms, the filter bubble that Pariser described has become much more extreme." (Viner, Katharine. “How Technology Disrupted the Truth.” The Guardian. July 12, 2016.)

Eytan Bakshy and Solomon Messing, two members of Facebook’s data science team, take on part of the “filter bubble” theory in a 2015 article for Science magazine. Here is what they had to say about social media and exposure to differing viewpoints:

"Although these technologies have the potential to expose individuals to more diverse viewpoints, they also have the potential to limit exposure to attitude-challenging information, which is associated with the adoption of more extreme attitudes over time and misperception of facts about current events. This changing environment has led to speculation around the creation of “echo chambers” (in which individuals are exposed only to information from like-minded individuals) and “filter bubbles” (in which content is selected by algorithms according to a viewer’s previous behaviors), which are devoid of attitude-challenging content.

"Within the population under study here, individual choices more than algorithms limit exposure to attitude-challenging content in the context of Facebook. Despite the differences in what individuals consume across ideological lines, our work suggests that individuals are exposed to more cross-cutting discourse in social media than they would be under the digital reality envisioned by some. Rather than people browsing only ideologically aligned news sources or opting out of hard news altogether, our work shows that social media expose individuals to at least some ideologically crosscutting viewpoints." (Bakshy, Eytan, and Messing, Solomon. “Exposure to Ideologically Diverse News and Opinions on Facebook.” Science. August 13, 2015.)

The Wall Street Journal also experimented with Facebook's filter: "Blue Feed, Red Feed: See Liberal Facebook and Conservative Facebook, Side by Side."

Filter bubbles and echo chambers have not escaped the imagination of fiction writers. Here are a few cautionary tales about search engines and social media platforms becoming omniscient:

The Circle by Dave Eggers

A chillingly plausible vision of a near future in which social media and self-quantification go monopolistic.

Infomocracy by Malka Older

It's an election year in a society where national boundaries have been broken down and replaced by a global network of microdemocracies, each consisting of 100,000 like-minded individuals. Control of each “region” is contested by various political parties and brokered by Information, an all-powerful search engine.

I Am No One by Patrick Flanery

A thriller for the digital age that raises trenchant questions about privacy and identity.

Have trouble reading standard print? Many of these titles are available in formats for patrons with print disabilities.

Staff picks are chosen by NYPL staff members and are not intended to be comprehensive lists. We’d love to hear your ideas too, so leave a comment and tell us what you’d recommend. And check out our Staff Picks browse tool for more recommendations!