
Availability Cascade

We live in a complicated world that constantly throws complex issues at us. Because it is impossible for one person to be an expert in every field, we have to employ different strategies and tactics to navigate through these issues.

A fascinating phenomenon that arises when we try to make sense of an issue is the availability cascade. This is a self-reinforcing cycle in which an idea essentially “infects” a group of people, displacing individual thought and opinion and overwhelming critical thinking.

The way this happens was described and modelled by Timur Kuran and Cass Sunstein.

First, a new idea that seems to offer a simple and elegant solution to, or explanation of, the issue at hand starts to gain traction. People readily adopt and embrace the idea because it sounds plausible and is easy to process.

Second, people who adopt the idea spread it themselves, making it more available across the social network. Nowadays, we see this in both the news media and social media.

Finally, as availability increases, more and more people are pulled in and the idea seems ever more credible, because “everybody” seems to believe it. People do less research and form fewer thoughts or opinions of their own about the matter, because going along with the group consensus is more appealing and more socially acceptable.
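
To make the feedback loop concrete, here is a minimal, hypothetical simulation sketch in Python. Everything in it (the population size, the idea’s intrinsic plausibility, the weight given to how often it is repeated) is an illustrative assumption rather than part of Kuran and Sunstein’s actual model; the point is only to show how a weakly plausible idea can snowball once its availability feeds back into its perceived credibility.

```python
import random

# Toy model of a self-reinforcing availability cascade.
# Assumptions (illustrative only): everyone can see everyone else,
# an idea's perceived credibility rises with how often it is repeated,
# and people stop judging the idea purely on its merits once it feels "common".

random.seed(42)

POPULATION = 1000
INTRINSIC_PLAUSIBILITY = 0.05   # chance of adopting from the idea's merits alone
AVAILABILITY_WEIGHT = 0.9       # how strongly "everybody seems to think it" sways us
ROUNDS = 10

adopted = [False] * POPULATION
adopted[0] = True               # one person starts spreading the idea

for round_number in range(1, ROUNDS + 1):
    availability = sum(adopted) / POPULATION   # how often the idea is encountered
    for person in range(POPULATION):
        if not adopted[person]:
            # Perceived credibility = a little independent judgement
            # plus a large dose of "it must be true, everyone says so".
            perceived = INTRINSIC_PLAUSIBILITY + AVAILABILITY_WEIGHT * availability
            if random.random() < perceived:
                adopted[person] = True
    print(f"Round {round_number}: {sum(adopted)} of {POPULATION} believe the idea")
```

With these made-up numbers, adoption starts slowly and then snowballs towards the whole population within a few rounds; tweaking the weights changes the speed, but the self-reinforcing shape of the curve is the point.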

As a platform, the availability cascade can be very effective at raising awareness and banding people together to fight for a common cause, as in the early days of the AIDS epidemic.
However, it is also fraught with problems.

The availability cascade is mediated by a heuristic, which is essentially a mental shortcut. Heuristics are extremely useful in that they reduce our cognitive load and automate many of our decisions. However, because they rely on simple rules of thumb, they are less effective in new or unfamiliar situations.

We are less likely to think critically when using heuristics, which makes us more vulnerable to manipulation. In this situation, people reason “this information is everywhere, therefore it must be important” and default to believing it (even if only to appear “current” and to fit in).

Because the availability cascade overwhelms critical thinking, it can be extremely dangerous when misinformation spreads this way; or worse, disinformation, where people maliciously spread false information for their own gain.

A classic example is the anti-vaccination movement, which was spawned by a discredited, falsified article claiming that the MMR vaccine increased rates of autism, despite mountains of evidence pointing to the effectiveness and safety of immunisation. Vaccination rates subsequently dropped, and we now see outbreaks of illnesses such as measles, resulting in countless deaths and injuries that could easily have been prevented.

Information can be just as contagious and dangerous as an actual infection. Knowing that these cognitive biases and phenomena exist helps protect us from falling victim to them.


Confirmation Bias

We hate to be wrong. When our beliefs, ideas, and knowledge are challenged, we have a strong tendency to become aggressively defensive, going as far as attacking the other person personally. It is extremely difficult to change someone’s opinion because of this strong bias towards our own thoughts: confirmation bias.

The problem with confirmation bias is that it creates a vicious cycle, making our thinking more and more rigid. Not only do we refuse to change our position when challenged by someone else, but we also actively seek out proof that we are right.

When we read or hear a piece of news or a fact, our brain tends to automatically colour it according to our own beliefs. If it aligns with our beliefs, we take it as concrete proof that we are right. If it goes against our views, we work hard to find flaws in it, such as claiming that the writer is biased, or we blatantly ignore it while demanding better evidence.

Social psychologist Jonathan Haidt eloquently distils this phenomenon into two questions.
When we like the proposition or fact, we ask: “Can I believe this?”. If there is even a single plausible reason, we give ourselves permission to believe it, as it reinforces our views.
However, when we don’t like it, we ask: “Must I believe this?”. Even a single, minor flaw is enough for us to discredit the new information.

This bias makes it very difficult for our brains to consider alternative points of view. Furthermore, we now live in the Information Age, where abundant information is freely available, meaning that we can easily find numerous opinions that align with ours even when the majority consensus is against us. We choose to discuss ideas deeply only with people who think like us, while fighting tooth and nail against everyone else.

How do we overcome this incredible barrier? Like most cognitive biases, we cannot simply switch it off.

Perhaps the first step is acknowledging that we are very flawed beings who are prone to being wrong.

Then, we can notice whether we are asking “can I” or “must I”. If we catch ourselves asking “must I believe it?”, we should become critical of our own thinking and consider how we would respond if we asked “can I believe it?” instead.

At the same time, try to notice when other people are showing confirmation bias. Then, realise that this is exactly how ignorant and obtuse you sound when voicing your own confirmation bias.

Finally, remember that it is okay to be wrong. If we never made any mistakes, then we would never grow. How boring would that world be?