Escape Your Echo Chamber
Australian Defence Force
When I was younger, I was very black and white in my thinking. Many would argue that I still am! Although I am finding the beauty in shades of grey (don’t say it!). Yet despite trying to grow my empathy and to appreciate different ways of thinking for the constructive challenge and opportunity for growth they offer – I still fall into the trap of comfort. I don’t always want to deal with people whose views I find offensive, idiotic, or more often than not – both. So despite being aware of the echo chamber phenomenon, and its dangers, I wilfully stay within my chamber, listening only to voices I already agree with, already support, and that think very similarly to me. Is that so wrong?
Annoyingly – the answer is “it’s complicated!” Just like friendship groups, it’s nice to relax, recharge, and experience life with people we enjoy. And just like with people, it’s not particularly healthy to surround yourself with toxicity that makes you feel bad about yourself as a person, be that on social media or in the physical world. However, it’s equally not good to allow big data algorithms to shelter you, nudge you toward extreme views, and alter your communication so that you are permanently defensive and polarised instead of open to genuine discourse and growth.
But that’s exactly what an echo chamber does. You’ve probably heard the term before, and if you’re not me, you might not have heard of its less well-known relative, the “epistemic bubble” (Nguyen, 2019). An epistemic bubble is where ‘insiders’ aren’t exposed to opinions from the opposite side of an argument. Echo chambers are more extreme: the insider actively distrusts the opinions of the opposite side. Neither is great for education, growth, and innovation. But the echo chamber is easily worse! Nguyen (2018), who has done extensive research on the subject, likens echo chambers to a cult, with similar dangers and trappings. Differing opinions are not only unheard, but actively discredited. I’m sure you can all think of some comment feeds filled with personal attacks and insults used to dismiss a challenging point of view.
So what are some of the dangers of echo chambers and how quickly can you start to be radicalised with very little conscious awareness?
A study by Little & Richards (2021) tested social media algorithms’ ability to encourage and reinforce multiple forms of extremist views. In one experiment on TikTok, the test began with transphobia. Through engaging only with transphobic content, the platform began suggesting other forms of extremist and far-right videos, including white supremacy and endorsements of violence against members of the Lesbian, Gay, Bisexual, Transgender and Queer (LGBTQ) community. How long did this take? A matter of approximately 400 videos! That sounds like a lot of videos – surely it would take a while to get through, giving parents enough time to save their child from these violent circles. Except that with clips averaging 20 seconds long, it would feasibly take only about two hours to be sucked into this rabbit hole of escalating hatred and violence.
I always find it dangerous when people say there’s no editorial viewpoint here or we’re not taking an editorial stand, because every list has some kind of viewpoint about what matters and what doesn’t matter (Pariser, 2018).
This conditioning doesn’t even take into consideration how much data your social media already has from previous interactions, your shared web-based usage, and broader social networks. You’re already in at least an epistemic bubble – and depending on your mindset, you may be more entrenched than you think! I know on some of my platforms, I’m so deep into pictures of cottages, tea and biscuits, and beautiful books that I forget the platform has other functions! And sure, curating beautiful image spaces doesn’t sound dangerous, but let’s move that casual forgetfulness to Twitter, Facebook, or the news sources you use. Upon reflection, it turns out my circles there are pretty homogenous too. What about yours?
Sure, we’re not all going down the pathway of white supremacist Twitter feeds. But that does not mean we are not polarised as a nation or a globe! That we are not dismissing the genuine concerns or criticisms of ‘the other side’ just because we’re so trained not to hear it! That this kind of behaviour, right now, is leading to increasingly dangerous and violent division within our society (Does the United States Capitol riot on 6 January 2021 ring any bells?).
So – if I’ve managed to concern you enough, how do we stop it? First, I would highly recommend The Social Dilemma documentary as a means of informing yourself (Orlowski, 2020). Understanding the ways in which social media affects you (and how it uses your data) is the first step to being able to combat the risks of being captured in an echo chamber.
But for the quick version: what can you do? Well, first, be conscious of echo chambers and of confirmation bias in general. Do you agree with this because you already supported the idea? Does that make it truly valid? Challenge your instinctual reaction. Further to this, you need to question why you’re muting or unfollowing individuals. Are they spouting hatred and calls to violence or harmful behaviour? Okay, that’s not good. But is it just because they said one or two things that don’t align 100% with your political views?
Next, start seeking out neutral or opposing views in news feeds or YouTube channels, and actually listen to the alternative opinions of commenters instead of instantly dismissing them. Perhaps you can follow a broader range of content creators. I like to follow as many authors as I can. I often don’t interact with their content (I like their books more than their tweets), but I like to think it helps broaden my Twitter algorithm a smidge!
This isn’t easy, and it’s not always necessary for every platform. But being conscious of social media’s manipulation of you is definitely important!