Pugwash: Filters isolate Internet users
At a recent Pugwash meeting, the discussion group examined the social implications of filter bubbles, a phenomenon that isolates Internet users from outside viewpoints.
Information bubbles have always existed. For example, our communities isolate us from outside ideas to a certain extent. Many hoped the Internet would bring us into a larger, more diverse community, exposing us to a larger universe of ideas. Paradoxically, in many ways it has had the opposite effect.
While the Internet allows access to new ideas, it also gives us greater access to others who share our own beliefs. People continue to isolate themselves within small groups, but now these groups can be more homogeneous than ever. Twenty years ago, a community might have had only two or three newspapers to choose from; with each paper serving a diverse readership, everyone read roughly the same news. Now that it is possible to choose among hundreds of electronic subscriptions, the news that two different people read may have little in common.
A completely new factor keeping users isolated from differing viewpoints is the personalized content that platforms such as Google, Facebook, and Yahoo News produce. These platforms customize what we see based on our location, Internet browser, computer type, frequency of travel, who our friends are, our search history, and what links we have clicked on in the past. All this information goes into determining our search results, whose updates will show up in our Facebook feed, and what articles will be in our Yahoo News headlines. As Eric Schmidt, executive chairman of Google, commented, “[Soon] it will be very hard for people to watch or consume something that has not in some sense been tailored for them.”
This type of personalization is convenient, but it creates filter bubbles in which users are isolated from content they disagree with or would find uncomfortable. What is particularly worrying about filter bubbles is that few users are aware that they are in one. Without filter bubbles, users would at least be aware of outside information, even if they chose not to examine it. With personalized results, however, users may honestly believe that no debate exists over controversial topics, or that raging debate exists over settled ones. There are important discussions to be had about how filter bubbles might affect civic discourse, people's susceptibility to propaganda, and whether they might deepen divides within our society.
Personalized content gives us what we want to see, but is that content what is best for us? Filter bubbles also raise the question of whether the providers of our Internet content have a civic responsibility to show us only what we want, or to leave open windows onto the wider world. One of our members suggested that search engines add adjustable settings controlling how much uncomfortable content is filtered out and how many alternative points of view are shown.
If you would like to escape your filter bubble, you can try the popular alternative search engine DuckDuckGo, which does not individually customize results. You might also subscribe to a large mainstream newspaper in lieu of personalized headlines such as those from Yahoo News. Perhaps most importantly, make an effort to click on articles about uncomfortable topics or ones that challenge your point of view. After all, filter bubbles are only a streamlined version of the filtering we have always done ourselves.