Posted in Psychology & Medicine

Availability Cascade

We live in a complicated world that constantly throws complex issues at us. Because it is impossible for one person to be an expert in every field, we have to employ different strategies and tactics to navigate through these issues.

A fascinating way that our brain tries to solve a current issue is the availability cascade. This is a self-reinforcing cycle, where an idea essentially “infects” a group of people, displacing individual thought and opinion and overwhelming critical thinking.

The way this happens was described and modelled by Timur Kuran and Cass Sunstein.

First, a new idea that seems to offer a simple, elegant solution to or explanation of a current issue starts to gain traction. People readily adopt and embrace the idea because it sounds plausible and is easy to process.

Secondly, people who adopt the idea spread it themselves, making it more available in their social networks. Nowadays we see this particularly in both news media and social media.

Lastly, as availability increases, more and more people are pulled in and the idea seems more credible, because “everybody” seems to believe it. People do less research and form fewer individual thoughts or opinions on the matter because the group consensus is more appealing or acceptable.
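The self-reinforcing loop described in these three steps can be captured in a toy simulation. This is only an illustrative sketch, not the formal Kuran–Sunstein model: the population size, susceptibility factor, and adoption rule below are all invented for demonstration. The key assumption is that each person's chance of adopting the idea rises with how "available" it already is, i.e. the current fraction of adopters.

```python
import random

def simulate_cascade(n_people=1000, seed_adopters=10, rounds=20,
                     susceptibility=1.5, rng_seed=42):
    """Toy availability-cascade model (illustrative only).

    Each round, every non-adopter adopts the idea with probability
    proportional to its current availability (the fraction of people
    who already hold it), scaled by a made-up susceptibility factor.
    Returns the adopter count after each round.
    """
    rng = random.Random(rng_seed)
    adopters = seed_adopters
    history = [adopters]
    for _ in range(rounds):
        availability = adopters / n_people           # how often the idea is encountered
        p_adopt = min(1.0, susceptibility * availability)
        newly_convinced = sum(
            1 for _ in range(n_people - adopters) if rng.random() < p_adopt
        )
        adopters += newly_convinced                  # adoption feeds back into availability
        history.append(adopters)
    return history

history = simulate_cascade()
print(history)
```

With any susceptibility above 1, the handful of seed adopters snowballs: each new adopter makes the idea more available, which recruits more adopters, until nearly the whole population holds it regardless of whether the idea is true.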

The availability cascade can be very effective at raising awareness of issues and banding people together to fight a common cause, as happened in the early years of the AIDS epidemic.
However, it is fraught with problems.

The availability cascade is mediated by a heuristic, which is essentially a mental shortcut. Heuristics are extremely useful in that they reduce our cognitive load and automate many of our decisions. However, because they are based on fixed rules of thumb, they are less effective in new or unfamiliar situations.

We are less likely to think critically when using heuristics, meaning that we are more vulnerable to being manipulated. In this situation, people think “this is widely available information, therefore it must be important” and default to believing it (even if it is just to appear “current” and to fit in).

Because critical thinking is overwhelmed by the availability cascade, it can be extremely dangerous when misinformation spreads this way – or worse, disinformation, where people maliciously spread false information for their own gain.

A classic example is the anti-vaccination movement, which spawned from a discredited, fraudulent article claiming that the MMR vaccine increased rates of autism, despite mountains of evidence for the effectiveness and safety of immunisation. Vaccination rates subsequently dropped, and we now see outbreaks of diseases such as measles, resulting in countless deaths and injuries that could easily have been prevented.

Information can be just as contagious and dangerous as an actual infection. Knowing about the existence of these cognitive biases and phenomena helps protect us from falling victim to them.
