Social Media & Cognitive Bias

by Morgane Borzee

Welcome to the post-truth era, an era in which the proliferation of fake news and conspiracy theories is matched only by the disappearance of objective standards for truth. How do we make sense of things in such a polarized world? Is it becoming more and more difficult to have healthy debates over beliefs and events? A post-truth era describes circumstances in which “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. In this article, we will explore how social media, cognitive bias and news media consumption combine into an information trap, and how to plan strategies against it. Let’s begin with some psychology principles.

Making sense of the world has always been a complex task. That is why, in our attempts to make decisions and judgements, we often follow unconscious rules of thumb that are hardwired into our brains, a way to simplify the vast amounts of information we process every day. In psychology, these mental shortcuts are called cognitive biases, and they can lead to systematic errors in decision making or, potentially, extreme forms of judgement. Although we are all subject to these biases, external factors such as the design of social media algorithms, which determine what we see on those platforms, reinforce these errors in thinking, leading us to inaccurate judgements and misinterpretations of reality that eventually affect our day-to-day decisions and behaviors.

At a time when we are battling the spread of fake news and conspiracy theories, and beginning to question the manipulative capabilities of digital advertising, it is useful to understand how cognitive biases function and to reflect on what our own biases might be. We should also be aware of how social media and the digital marketing industry feed off these limitations in our thinking to get us to act in ways that may not be in our best interest.

Bandwagon effect

Have you ever come across an interesting post on social media and been about to like it, just before noticing that nobody else had liked it yet? What if 1,000 people had liked it before you? How does this affect your likelihood of engaging with the post? Even if it only makes you hesitate for a moment, that moment is your brain going through a cognitive bias called the bandwagon effect. Consider another example: at some point you may have been walking down the street when you saw people congregating. Curious about what was happening, you walked towards the group and asked somebody in the crowd, “What is happening?” “I have no idea,” they respond. We see the same pattern in the stock market and with digital currency: everybody is buying bitcoin, so should I?

As an evolutionary heuristic, we tend to want to be on the winning team. To economize cognitive resources, it sometimes makes sense to rely on other people’s knowledge instead of gathering information ourselves. This psychological phenomenon is also called herd mentality, and although it can be useful for quick thinking in specific situations, especially in the face of real danger, it can also lead to miscalculation.

Now let’s return to social media and its algorithms. One popular feature is trending topics, which prioritizes content not by its quality but by its popularity. Though potentially dangerous at any time, the knock-on effects of trending misinformation can be particularly harmful in the context of a pandemic. Add to this the presence of millions of bots whose purpose is to spread unverified information, often with a political agenda, making us think that thousands of people agree with these ideas or that a real grassroots movement is happening, and the manipulation becomes clearer.
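The core problem is easy to illustrate. The following toy sketch is not any platform’s actual algorithm; it simply shows what happens when posts are ranked purely by engagement signals, with nothing in the score measuring accuracy:

```python
# Toy illustration of popularity-based ranking (NOT a real platform's
# algorithm): posts are ordered purely by engagement, so a viral piece
# of misinformation can outrank accurate reporting.

posts = [
    {"title": "Careful fact-checked report", "likes": 120, "shares": 15},
    {"title": "Sensational unverified rumor", "likes": 9800, "shares": 2300},
]

def engagement_score(post):
    # Popularity signals only -- nothing here rewards accuracy or quality.
    return post["likes"] + 3 * post["shares"]

trending = sorted(posts, key=engagement_score, reverse=True)
print([p["title"] for p in trending])
```

Bots exploit exactly this: by inflating likes and shares, they manufacture the popularity signal the ranking depends on.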

As an analogy, imagine that information about nutrition and how food affects the body was not widely available, and food brands encouraged you to consume only the things you already liked. Over time, amplified by bots whose purpose is to magnify disinformation, this would be detrimental to the health of the entire nation.

Strategies to deal with the bandwagon effect

Cognitive biases are not intrinsically bad; they exist for a reason, and the bandwagon effect exists as a self-preservation mechanism. In our early days as Homo sapiens, if your tribe suddenly started running, you would probably want to run as well: maybe they saw something dangerous, and it is better to be safe than sorry. However, we no longer face the same dangers as thousands of years ago, so in order to think more rationally, especially when the stakes are high, some strategies may help: pause before joining in, ask what evidence the crowd actually has rather than assuming its size guarantees its accuracy, and check independent sources before deciding.

Confirmation Bias

Maybe you are an Apple or a Windows fan, or you prefer some other operating system. It is highly likely that you have been exposed to lots of information confirming the benefits and perks, and explaining the disadvantages, of each of those brands. Which are the ones you remember, the ones that have stuck in your brain? If you are, for example, an Apple fan, you are more likely to remember all the advantages of Apple and to have more trouble recalling its disadvantages; the same goes the other way around. This is because our brains are hardwired to search for, interpret and recall information that supports our current beliefs.

Imagine that you attend a dance class for the first time, but deep down you believe that you have no rhythm and are a bad dancer. After a few minutes you see a couple of students laughing, and the immediate thought is “they must be laughing at me because I’m a terrible dancer”, even though, thinking rationally, they haven’t really looked at you and all the other students seem rather nice. With this particular bias, it is our desire to be right and our fear of being wrong that meddle with our rational thinking, even when being right may make us miserable.

If this happens in real life, now consider how social media algorithms work. One of their main promises is high personalization: your social feed is unique to you and your personal preferences. This doesn’t seem so bad, right? After all, you will only be exposed to information that the algorithm thinks you will like. As you may realize, however, this poses a significant problem for critical thinking, especially if the main way we get news and information is through Twitter, Facebook, YouTube and other highly customized content sources.

If we are naturally biased to recall information that confirms our current beliefs, and our main source of information is already providing us with content that aligns with our way of thinking, it is highly unlikely that those beliefs will ever be challenged. This effect is known as an “echo chamber”: an environment where alternative perspectives do not exist. Luckily for us, once we become aware of how confirmation bias works, it becomes easier to follow some strategies to avoid the tunnel-vision trap.
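The mechanism behind an echo chamber can be sketched in a few lines. This is an illustrative model, not a real recommender system: the feed simply drops anything outside the topics the user has already engaged with, so the alternative perspective never arrives.

```python
# Toy sketch of personalization producing an "echo chamber" (an
# illustrative model, not any platform's real recommender).

user_history = ["topic_a", "topic_a", "topic_a"]  # what the user liked before

candidate_items = [
    {"id": 1, "topic": "topic_a"},
    {"id": 2, "topic": "topic_a"},
    {"id": 3, "topic": "topic_b"},  # the alternative perspective
]

def personalized_feed(history, items):
    liked_topics = set(history)
    # Items outside the user's past interests are silently filtered out.
    return [item for item in items if item["topic"] in liked_topics]

feed = personalized_feed(user_history, candidate_items)
print([item["id"] for item in feed])  # item 3 never reaches the user
```

Combined with confirmation bias, the filtering compounds: the algorithm narrows what we see, and our memory narrows what we keep.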

Strategies to deal with confirmation bias 

In the age of post-truth and fake news, one of the main aspects to focus on is how and where we consume news media. Because social media and other personalized platforms are algorithmically prone to creating echo chambers, one of the main actions against them is to compare different perspectives across different sources. Other strategies include deliberately reading outlets you disagree with, checking who wrote and funded a story, and asking yourself what evidence would change your mind.

Although these strategies can help keep an open mind, news consumption done mainly via social media is still highly likely to be distorted. In a study conducted by the Pew Research Center in 2020, about 1 in 5 US adults said they get their political news primarily from social media; this group also reported less political knowledge than adults who consume news through other media such as news websites, TV, print and radio. That is why the best strategy is to read from a balance of sources and mediums and to use tools that help with fact checking and comparison, such as the fact-checking sites Snopes, PolitiFact and FactCheck.org, or media-bias comparison services like AllSides.

Now it’s up to you. Coming back to the nutrition analogy, and now that you know these strategies, what would be your plan for a healthier and more diversified news consumption?

1. Dark Connections

The internet is made up of interconnected pieces of data about its users. Most websites have trackers installed in them, largely belonging to Google or Facebook, that keep tabs on the people using them. This data is often neither protected nor encrypted, and can be fully accessible to anyone with the means to reach it. Though these companies store our data and use it to sell their products to us, they take no responsibility for it. This entire system is rarely made explicit; it is shrouded in the background of the services it powers. This section aims to connect the dots that exist in the dark underbelly of the internet, the ones we have a vague idea about but that are not necessarily clear.
Making these connections can make the online experience feel scary and unsafe; in truth, it already is. Although governments and large corporations are often seen as the problem, they are far less interested in you or me than someone who knows us personally and has an agenda that involves us. This section shines a light on the dark patterns that enable your data to be collected and potentially mobilized against your interests.

2. Digital Forensics

In order to combat the practice of dark data, one can exploit the loopholes in its architecture. But to do this, we first need to comprehend the full extent of the information that is collected about us. It is now possible to demand the data that companies hold about us, though this option is not obvious to most people. Resources like APIs, Google Takeout and OSINT tools allow us to conduct small-scale investigations into where our data lives and what data exists about us. This section is a collection of attempts by the authors to gain access to and interpret their own data that exists online.
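Once you have an export in hand, even a few lines of code can start to make it legible. The sketch below is a minimal example of interpreting your own exported data; the records are synthetic, merely shaped like entries a Google Takeout “My Activity” export might contain (the field names are assumptions for illustration, not a documented schema):

```python
# A minimal sketch of interpreting exported activity data. The records
# below are synthetic, shaped like a Takeout-style "My Activity" export;
# the field names ("header", "title", "time") are assumptions, not a
# documented schema.
import json
from collections import Counter

sample_export = json.loads("""
[
  {"header": "Search",  "title": "Searched for running shoes", "time": "2021-03-01T10:00:00Z"},
  {"header": "YouTube", "title": "Watched a cooking video",    "time": "2021-03-01T11:30:00Z"},
  {"header": "Search",  "title": "Searched for local news",    "time": "2021-03-02T09:15:00Z"}
]
""")

# Count how often each service logged an action about you.
activity_by_service = Counter(entry["header"] for entry in sample_export)
print(dict(activity_by_service))
```

Even this trivial tally, run over a real export with thousands of entries, makes visible how continuously one is being observed.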
However, awareness of the data does not guarantee control over it. Google may give us a copy of the data it holds about us on its servers through its Google Takeout service, but this does not mean that we now own this data. Google can still use it however it likes; nothing has been deleted from its databases. We are given only an illusion of control, and this is intentional. Digital forensics can only grant us a window into this massive machine, whose machinations may remain unclear. This section explores these windows and what they teach us, both about ourselves and about the technology we use.

3. Data Futures

What is the future of dark data? People are increasingly aware that information about them is collected online, and governments are making efforts to regulate Big Tech and protect the privacy of citizens. How can we imagine better ways to exist within the system? How can we protect ourselves from its repercussions? This section speculates on how dark data is changing as a practice. It discusses ways in which people can take action and re-examine their browsing habits, and considers how technology can be used to propose solutions to the problems it has created.
It is important to consider that the practice of data collection and exploitation is ongoing. There is no easy way out of these cycles. However, we would like to believe that sparking deliberate thought and action to help you orient yourself in this Wild West landscape can make the process of coming to terms with dark data easier.

4. About

This digital edition was compiled from scholarship, research, and creative practice in spring 2021 to fulfill the requirements for PSAM 5752 Dark Data, a course at Parsons School of Design.

Editors

  • Sarah Nichols
  • Apurv Rayate

Art Directors

  • Nishra Ranpura
  • Pavithra Chandrasekhar

Technology Directors

  • Ege Uz
  • Olivier Brückner

Faculty

  • David Carroll
  • Melanie Crean

Contributors

This site needs no privacy policy: we did not install any tracking code, and the site does not store any cookies.