Living in the Filter Bubble
Shortly after the 2020 election results started to come in, I began to see chatter about a new social media site called Parler geared toward conservatives who felt censored by Facebook and Twitter and the mainstream social media world.
Parler bills itself as “the world’s town square” where you can “speak freely and express yourself openly, without fear of being ‘deplatformed’ for your views. Engage with real people, not bots. Parler is people and privacy-focused, and gives you the tools you need to curate your Parler experience.” Although Parler was founded in 2018, according to NPR, I had not heard of it until recently.
NPR reports that Parler places few restrictions on what users can post (although it does prohibit criminal activity, terrorism, child pornography, copyright violations, fraud and spam), which has led some users to move from Facebook or Twitter to Parler after they felt they had been “censored” for posting something the companies labeled “misinformation.” The platform has apparently reached more than 10 million members.
While the major critique I’ve seen has focused on whether or not the platform will encourage or facilitate the spread of misinformation, my perusal of their homepage raises another significant concern for me – filter bubbles. If you’re not familiar with the idea of filter bubbles, Eli Pariser has given a great TED Talk on the topic. Basically, “filter bubble” refers to the idea that as the web and other tech services (Google, Netflix, Facebook, Amazon) increase customization, we should be concerned about the narrowing of our worldview through the filtering of information. He argues that this filtering can create our own “bubble” of confirmation bias that only shows us things that confirm and align with what we already believe, removing information that doesn’t conform with our existing beliefs and interests. Worse, he argues, we often don’t even know it’s happening.
Parler’s homepage touts its personalization. “Customize your experience,” it says, by curating and moderating your own feed, using the robust set of tools we put at your disposal. “Moderate your world” by customizing “what you see and who you interact with” or “even particular keywords you’d rather not see in comments.” (emphasis added) “We give you the tools you need to do your own ‘shadow-banning’.”
On the one hand, is this that different from selectively choosing who you friend or follow on other social media? Or blocking users that offend you or that you don’t like hearing from? We can certainly do our fair share of filtering out what we don’t want to see on other social media platforms.
However, I would argue that moderating your own keywords is cause for significant concern. If I don’t believe in climate change, I can remove any discussion of it from my “parleys.” I can remove discussion of racism or diversity and inclusion, if I want. If I don’t want to see any criticism of the Affordable Care Act, I can filter out “Obamacare.” I could filter out perspectives I don’t want to see using buzzwords like “snowflake,” or by filtering out references to specific people with perspectives I don’t like. If I don’t like politics, I can filter out all the election news by filtering out “election,” “Trump,” and “Biden.”
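Mechanically, this kind of self-moderation is trivially simple, which is part of what makes it so sweeping. A minimal sketch of keyword-based feed filtering might look like the following (the function and the sample feed are purely illustrative assumptions, not Parler’s actual implementation):

```python
# Hypothetical sketch of keyword-based feed filtering.
# The function name, post format, and sample feed are illustrative
# assumptions, not any platform's real API.

def filter_feed(posts, blocked_keywords):
    """Return only the posts that contain none of the blocked keywords."""
    blocked = [kw.lower() for kw in blocked_keywords]
    return [
        post for post in posts
        if not any(kw in post.lower() for kw in blocked)
    ]

feed = [
    "New study on climate change impacts",
    "Local election results are in",
    "Recipe: best apple pie ever",
]

# Blocking "climate" and "election" silently drops two of the three posts;
# the reader never knows what was removed.
filtered = filter_feed(feed, ["climate", "election"])
print(filtered)
```

The point of the sketch is that a few blocked words can quietly eliminate entire topics from view, with no indication that anything is missing.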
I was curious, so I tried to find out whether Facebook offers something similar. For what it is worth, from what I can see in Facebook’s Help Center, you can filter specific words from content posted on a page you administer, but the filtering options for your news feed only allow sorting by most recent or top stories, along with following or unfollowing specific people or groups. I did not quickly find anything that would let me filter particular words or concepts out of my news feed or my personal profile – only out of pages I manage – though there appear to be browser extensions for this, and Instagram does allow filtering based on keywords. So this is an option on at least some more “mainstream” social media, but it seems to be less prominently advertised and encouraged.
It’s hard enough to resist confirmation bias in our “real lives” without all the tech. Add in a world where Google filters your results to show you what it thinks you want. Then add a social media site that lets you explicitly remove any subjects you don’t want to see.
Suddenly we really are living in a bubble, partially of our own making and partially made by tech companies, that allows us to not only ignore what is going on in the world but to actively filter out the things that don’t fit our existing worldview. We can’t change our minds and evolve our perspective if we never see anything that offers new information.
I find this especially interesting given Parler’s emphasis on protecting and promoting free speech. This particular brand of free speech seems to mean you can say whatever you want, but don’t have to listen to anything that doesn’t fit what you want to hear. Again, this is to some extent true on any social media site, and on the internet or even the world at large for that matter, but this seems to be taking it to a new extreme I personally haven’t seen elsewhere yet. One of the primary arguments for free speech is that if we freely and completely air ideas, the best ideas will win. This can’t happen in a world where I only hear what I want and you only hear what you want.
As a society we have to resist the urge to close ourselves off to other viewpoints. One of the beautiful things about the internet is, or should be, the fact that it creates unprecedented access to information. Let’s not hamper that access by filtering it down to what we already know and the arguments we’ve already seen and believe.
We should consider whether tech companies are taking personalization too far, but we should also resist the urge to embrace that customization to an extent that might inhibit personal growth and evolving perspectives. And we need to continue to make sure the next generation understands the importance of looking at multiple perspectives, especially if it is going to be harder to do so organically as we absorb information.
Lisa Hoover is a public services librarian at Clarkson University and an adjunct professor in criminal justice at SUNY Canton. In addition to her MLS, Lisa holds a JD and an MA in political science. She began her career as an editor and then manager for a local news organization, adjunct teaching in her “spare time.” She teaches courses in criminal procedure, criminal law and constitutional law. She is passionate about First Amendment issues. She recently began her career as a librarian, starting at Clarkson University in June 2017, teaching information literacy sessions and offering reference services. Lisa and her husband Lee live in Norwood, New York, with their cats Hercules and Pandora, and pug-mix Alexstrasza (Alex).
Great point. I was directed to a Netflix documentary called The Social Dilemma. It focuses on the exact thing you are talking about in the context of social media platforms. It provides a look at how their designs are meant to focus and perpetuate your own ideas and in turn reward your behaviors.
Here is the link. https://www.thesocialdilemma.com/
Thank you! I will add this to my watch list for winter break.