Social Media & Cognitive Bias
by Morgane Borzee
Welcome to the post-truth era, an era where the proliferation of fake news and conspiracy theories is matched only by the disappearance of objective standards for truth. How do we make sense of things in such a polarized world? Is it becoming harder and harder to have healthy debates about beliefs and events? A post-truth era describes circumstances in which “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. In this article, we are going to explore how social media, cognitive bias and news media consumption combine into an information trap, and how to plan strategies against it. Let’s begin with some psychology principles.
Making sense of the world has always been a rather complex task. That is why, in our attempts to make decisions and judgements, we often follow unconscious rules of thumb that are hardwired into our brains. We do this as a way to simplify the vast amounts of information we process every day. In psychology, these mental shortcuts are called cognitive biases, and they can lead to systematic errors in decision making or, potentially, extreme forms of judgement. Although we are all subject to these biases, there are external factors, such as the design of the social media algorithms that determine what we see on those platforms, which reinforce these errors in thinking, leading us to inaccurate judgements and misinterpretations of reality that eventually affect our day-to-day decisions and behaviors.
At a time when we are battling the spread of fake news and conspiracy theories, and beginning to question the manipulative capabilities of digital advertising, it is useful to understand how cognitive bias works and to reflect on what our own biases might be. We should also be aware of how social media and the digital marketing industry feed off these limitations in our thinking to get us to act in ways that may not be in our best interest.
Bandwagon effect
Have you ever come across an interesting post on social media and been about to like it, only to notice that nobody before you had liked it? What if 1k people had liked it before you? How does this affect your likelihood of engaging with the post? Even if it only makes you doubt for a minute, that minute is your brain going through a cognitive bias called the bandwagon effect. Here is another example: maybe at some point you were walking down the street when you saw people congregating. Curious about what was happening, you walked towards the group and asked somebody in the crowd, “What is happening?” “I have no idea,” they respond. We can see that same idea often in the stock market or with digital currency: everybody is buying bitcoin, should I?
As an evolutionary heuristic, we tend to want to be on the winning team. To economize cognitive resources, it sometimes makes sense to rely on other people’s knowledge instead of gathering information ourselves. This psychological phenomenon is also called herd mentality, and although it can be useful for quick thinking in specific situations, especially in the face of real danger, it can also lead to miscalculation.
Now let’s go back to social media and its algorithms. One popular feature is trending topics, which prioritizes content by its popularity rather than its quality. Though potentially dangerous at any time, the knock-on effects of trending misinformation can be particularly harmful in the context of a pandemic. Add to this the presence of millions of bots whose purpose is to spread unverified information, often with a political agenda, making us think that thousands of people agree with these ideas or that a real grassroots movement is happening, and the manipulation becomes clearer.
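To make the popularity-versus-quality trade-off concrete, here is a minimal sketch of a “trending” ranker that sorts purely by engagement, in the spirit of the feature described above. The Post fields, the sample numbers and the idea of a separate quality score are illustrative assumptions, not any real platform’s algorithm.

```python
# Toy illustration only: a "trending" list ranked by raw engagement,
# ignoring any notion of quality or veracity.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    shares: int           # raw engagement, easily inflated by bots
    quality_score: float  # hypothetical 0-1 veracity score, ignored below

def trending(posts, top_n=3):
    # Popularity is the only signal: a rumor shared by thousands of bots
    # outranks a carefully sourced article with modest engagement.
    return sorted(posts, key=lambda p: p.shares, reverse=True)[:top_n]

posts = [
    Post("Unverified miracle-cure rumor", shares=50_000, quality_score=0.1),
    Post("Peer-reviewed study summary", shares=1_200, quality_score=0.9),
    Post("Fact-checked explainer", shares=900, quality_score=0.95),
]

for p in trending(posts):
    print(p.shares, p.text)  # the rumor comes out on top
```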
As an analogy, imagine that information about nutrition and how food affects the body was not widely available, and food brands encouraged you to consume only the things you already liked. Over time, the informational equivalent of this diet, amplified by bots built to magnify disinformation, would be detrimental to the health of the entire nation.
Strategies to deal with the bandwagon effect
Cognitive biases are not intrinsically bad; they exist for a reason, and the bandwagon effect exists as a self-preservation mechanism. In our early days as Homo sapiens, if your tribe suddenly started running you would probably want to run as well: maybe they saw something dangerous, and it is better to be safe than sorry. However, we are not facing the same dangers as we did thousands of years ago, so in order to think more rationally, especially if the stakes are high, some of these strategies may help:
- Make your reasoning explicit: Start by asking “Why am I deciding/believing this? What are the specific facts and grounds making me take this particular decision/position?” It can also help to write the reasons down and then see if they hold together. Slowing down the process will help you think in a more analytical and less passionate way.
- Identify whether the idea has specific promoters: Who is the real author behind the piece of information I am considering? Do they have a particular goal in mind? Are they benefiting in some way from me taking this decision/position?
- Hold yourself accountable: The fact that many people do something does not automatically make it right, nor does it lessen the consequences. Thinking about the potential impact of your actions and comparing it to the consequences of other possible alternatives will make your reasoning less prone to error.
Confirmation Bias
Maybe you are an Apple or a Windows fan, or you like some other operating system. It is highly likely that you have been exposed to lots of information confirming all the benefits and perks, and explaining all the disadvantages, of either of those brands. Which are the ones you remember, the ones that have stuck in your brain? If you are, for example, an Apple fan, it is more likely that you will remember all the advantages of Apple and have more trouble remembering the disadvantages; the same goes the other way around. This is because our brain is hardwired to search for, interpret and recall information that supports our current beliefs.
Imagine that you attend a dance class for the first time, but deep down you believe that you have no rhythm and are a bad dancer. After a few minutes you see a couple of students laughing, and the immediate thought is “they must be laughing at me because I’m a terrible dancer”, although if you think about it rationally they haven’t even really looked at you, and all the other students seem rather nice. With this particular bias, it is our desire to be right and our fear of being wrong that meddle with our rational thinking, even when being right may make you miserable.
If this happens in real life, consider how social media algorithms work. One of their main promises is high personalization: your social feed is unique to you and your personal preferences. This doesn’t seem so bad, right? After all, you will only be exposed to information that the algorithm thinks you will like. However, as you may realize, this poses a significant problem for critical thinking, especially if the main way we get news and information is through Twitter, Facebook, YouTube and other highly customized content sources.
If we are naturally biased to recall information that confirms our current beliefs, and on top of that our main source of information is already providing us with content that aligns with our way of thinking, it is highly unlikely that those beliefs will ever be challenged. This effect is known as an “echo chamber”: an environment where alternative perspectives do not exist. Luckily for us, once we become aware of how confirmation bias works, it becomes easier to follow some strategies to avoid the tunnel-vision trap.
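As a rough illustration of why this happens, here is a toy sketch contrasting a feed ranked by similarity to what you already liked with one ranked purely by recency. The posts, topics and like counts are invented for the example; real recommender systems are far more complex, but the narrowing effect is the same: the personalized ordering keeps surfacing the side you already agree with.

```python
# Illustrative sketch (invented data, not any platform's real code):
# personalization concentrates the feed on past preferences,
# while recency ordering leaves both perspectives visible.
from collections import Counter
from datetime import datetime

posts = [
    {"topic": "team_A", "time": datetime(2021, 3, 1, 9)},
    {"topic": "team_B", "time": datetime(2021, 3, 1, 10)},
    {"topic": "team_A", "time": datetime(2021, 3, 1, 11)},
    {"topic": "team_B", "time": datetime(2021, 3, 1, 12)},
]

user_likes = Counter({"team_A": 10, "team_B": 1})  # your past engagement

def personalized_feed(posts, likes):
    # Rank by how much each topic matches what you already liked:
    # "team_A" content floats to the top, reinforcing existing beliefs.
    return sorted(posts, key=lambda p: likes[p["topic"]], reverse=True)

def recency_feed(posts):
    # Rank purely by time: both perspectives appear as they were posted.
    return sorted(posts, key=lambda p: p["time"], reverse=True)

print([p["topic"] for p in personalized_feed(posts, user_likes)])
print([p["topic"] for p in recency_feed(posts)])
```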
Strategies to deal with confirmation bias
In the age of post-truth and fake news, one of the main aspects to focus on is how and where we consume news media. Because social media and other personalized platforms are algorithmically prone to creating these echo chambers, one of the main actions against it is to use different sources so you can compare different perspectives. Other strategies may include:
- Putting some distance between your beliefs and your identity: When we are too entangled with our own beliefs, any different perspective may seem like an attack on us and put us in a defensive mode, preventing us from thinking more rationally. One way of doing this is by thinking in probabilities, which invites us to keep an open mind and allows others to express their opinion without fear of judgement.
- Actively seeking opposing data and following reliable sources on all sides: Following sound sources with different perspectives, and doing so with an open mind, will bring diversity to the information you consume, helping you think more critically. It can also be wise to like all sorts of posts; that way the algorithm will get confused and start bringing more assorted content.
- Favor recency over personalization for your feeds: Although the option is often buried deep in the settings, many social media platforms let you change your feed so that the most recent content appears first instead of the personalized selection. These platforms will probably warn you against switching away from their powerful algorithms, but you may notice more variety and randomness, which actually helps prevent the echo-chamber effect.
Although these strategies can help keep an open mind, news consumption done mainly via social media is still highly likely to be distorted. In a study conducted by the Pew Research Center in 2020, about 1 in 5 US adults said they get their political news primarily from social media, and this group also showed less political knowledge than adults who consume news through other media such as news websites, TV, print and radio. That is why the best strategy is to read from a balance of sources and mediums and to use tools that help with fact checking and comparison. Some of these tools are:
- AllSides: A news aggregator that facilitates comparison by sharing articles from the left, center, and right side by side. Categories are determined by users and therefore reflect public perception biases as opposed to content-level bias.
- FactCheck.org: A nonpartisan, nonprofit “consumer advocate” that monitors the factual accuracy of what is said publicly by major U.S. political players. A project of the Annenberg Public Policy Center of the University of Pennsylvania.
- Is This True? A fake news database created and maintained by Politico.
Now it’s up to you: coming back to the nutrition analogy, and now that you know these different strategies, what would be your plan for a healthier and more diversified news diet?