
Seeing is not believing: Social Media and the Echo Chamber Effect

Hypothesis testing sounds very scientific, but don't we all engage in it? Very often we form an opinion and throw it out into the world to test it. If I am exposed to an object 'X' and, let's say, I like it, I seek more information about it. That is perfectly fine until I stop looking for information and start looking only for evidence in support of X. It would be a serious error to assume that my hypothesis 'X is likeable' is confirmed by the first bit of supporting evidence I come across.

This example may look harmless, since you are probably imagining X to be a tasty dish or some other everyday object. But assume for a moment that X is a political candidate, and that the result of your hypothesis testing will decide the political landscape of your country for the next few years. Does it still appear harmless?

This is what psychologists call confirmation bias. People tend to seek information that they consider supportive of favored hypotheses or existing beliefs, and then interpret new information in ways that are partial to those hypotheses or beliefs. The danger, however, goes beyond the individual case described above. In the past few years, social media platforms have become infamous as echo chambers: the moment you reveal your personal preferences, the application runs an algorithm that brings you closer to others with similar preferences.
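To make that last point concrete, here is a minimal sketch of one way such a recommendation step could work. It assumes a toy model in which every user is a small vector of topic preferences, and the names and numbers are invented; it is not the actual code of any platform.

    import math

    def cosine_similarity(a, b):
        # Similarity of two preference vectors: 1.0 means identical direction, 0.0 means unrelated.
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    # Hypothetical users described by how much they engage with three topics.
    users = {
        "you":   [0.9, 0.1, 0.8],
        "alice": [0.8, 0.2, 0.9],
        "bob":   [0.1, 0.9, 0.2],
        "carol": [0.7, 0.3, 0.6],
    }

    # The "feed" simply surfaces the people most similar to you first.
    ranking = sorted(
        (name for name in users if name != "you"),
        key=lambda name: cosine_similarity(users["you"], users[name]),
        reverse=True,
    )
    print(ranking)  # ['alice', 'carol', 'bob']

The more of your preferences a system like this knows, the more precisely it can surround you with people whose preference vectors look like yours.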

For the past decade and a half, the internet has been marketed with one tagline: bringing people closer. And it looks like it has done its job pretty well. The fine print is that the people being brought closer to you are the ones who confirm your opinions. This plays out every time a politically aware individual shares a biased meme or post. The early theorists of confirmation bias can rest peacefully in their graves: their idea has never been better demonstrated.

But why does this happen? Theorists once believed that humans are rational decision makers: that we weigh each alternative as carefully as possible and act as maximizers. In reality, many unconscious biases operate on the decision-making process, shaping how we attend to and perceive the information behind our choices. Rather than engaging in a 'cool consideration' of the evidence, we often rely on 'hot cognition' driven by these biases.

To see confirmation bias at work, think back to the first time you made a Facebook account. You added people you already knew, probably people who held opinions similar to yours. Now, as alarming as it might sound, Facebook reinforces each of your beliefs, thoughts and opinions by steering you towards groups and pages that align exactly with them. Day after day, you are exposed to a growing quantity of confirming information. Because that information is so easy to recall, you come to judge it as more likely to be true; this is the availability heuristic, first described by Kahneman and Tversky. For instance, after 'liking' a page about a new kind of diet, we may overestimate the probability of it succeeding, because people tend to share their successes (but not their failures), along with other supportive evidence for why the diet is the best option available.
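A rough way to see the effect is a toy simulation of a success-only feed. The 30% true success rate and the posting behaviour below are invented purely for illustration, not taken from any real data.

    import random

    random.seed(42)
    TRUE_SUCCESS_RATE = 0.30
    population = [random.random() < TRUE_SUCCESS_RATE for _ in range(10_000)]

    # People who succeeded post about it; most who failed stay quiet (say only 5% post).
    posts = [outcome for outcome in population
             if outcome or random.random() < 0.05]

    feed_estimate = sum(posts) / len(posts)
    true_rate = sum(population) / len(population)
    print(f"true success rate:    {true_rate:.2f}")    # ~0.30
    print(f"rate implied by feed: {feed_estimate:.2f}")  # ~0.90

Judging the diet from the feed alone roughly triples the apparent success rate, even though no individual post is false.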

The initial exposure to a favored idea also anchors you to a biased position, so that when you are later presented with contradictory information, you may not adjust your opinion enough; the initial exposure implicitly acts as a reference point. This is the anchoring and adjustment heuristic. It similarly leads us to overestimate the likelihood of conjunctive events, such as everything that has to go right for our country to keep doing well economically. A frightening first encounter with a risk can likewise anchor us to a distorted estimate of it, for example to the belief that the risks of vaccination far exceed its benefits. Once we take this to be the truth, we may join Facebook groups such as 'Moms against vaccination', full of like-minded people, which further confirms our existing beliefs.
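To see why conjunctive events are so easy to overestimate, here is a small worked example with made-up numbers: even when each link in the chain looks very likely on its own, the whole chain is far less likely than our anchored intuition suggests.

    # Hypothetical: a forecast that "the economy will do well" quietly requires
    # several conditions to hold at once.
    conditions = {
        "exports stay strong":    0.90,
        "inflation stays low":    0.90,
        "no major policy shock":  0.90,
        "global demand holds up": 0.90,
    }

    joint = 1.0
    for probability in conditions.values():  # assumes independence, purely for illustration
        joint *= probability

    print("each condition on its own: 90% likely")
    print(f"all four holding together: {joint:.0%}")  # about 66%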

So what can we do as consumers of information? At the very least, we can test our beliefs properly: rather than looking only for confirming evidence, we can open our eyes to contradicting evidence. It is wise to remember that the information we are aware of may be a tiny speck in the whole spectrum. So even if our echo chambers feel comfortable, we need to stay hungry for information that does not align with our beliefs, to be sure that we are not being fitted with blinders by these social networking sites. And to start with, do not let this article simply confirm your opinion!
