Posted in Psychology & Medicine

Maslow’s Hierarchy Of Needs

Abraham Maslow was a Jewish-American psychologist who tried to answer a question that plagues every person at some stage: what is the meaning of life? To answer it, he published a 1943 paper called A Theory of Human Motivation, where he introduced the now well-known Hierarchy of Needs. The basic premise of Maslow’s theory is as follows.

We have different needs in life. Maslow’s Hierarchy of Needs categorises these needs, then places them in a pyramid-shaped model in order of priority. Maslow believed that some needs are more fundamental than others. For example, you can’t worry about being single if you are starving to death. Therefore, to be motivated to work on one category, you must first satisfy the needs of the category below it. Maslow organised the categories in the following order.

Starting from the bottom of the pyramid, we have physiological needs. This is self-explanatory, as you need to be biologically alive to even worry about the other needs. This includes food, water, warmth and rest.

The next level addresses safety. If you do not feel safe, you would be too preoccupied by the sense of danger to consider higher needs. Therefore, you need physical shelter, resources and a general sense of security, whether it be personal, financial, health-related or emotional security.

Safety and physiological needs are considered “basic needs”. The next two are considered “psychological or spiritual needs”.

Social belonging refers to the human need for connection. Loneliness and disconnect can be crippling to the point that you cannot enjoy the other aspects of your life, even if you have your basic needs met. This includes romantic and intimate relationships, family and friends, and communities.

Once we fulfil our need for external connections, we can start looking within ourselves, addressing our need for self-esteem and self-respect. We cannot lead fulfilling lives if we doubt and are unkind to ourselves.

Lastly, we have the apex of the pyramid that Maslow thought all people should ultimately aspire to: self-actualisation. Essentially, this means being the best version of yourself that you can be, unlocking your full potential and making the most out of your life.

The interesting part of this last step is that you define what the best version of yourself is. Perhaps you wish to be a great parent or a teacher. Perhaps you want to be a high-achieving professional or to create something others can enjoy. Perhaps you wish to be content and happy.

The Hierarchy of Needs suggests that to even think about achieving self-actualisation, we must fulfil the more basic needs first. This means that in some cases, what gets in the way of our self-actualisation may not be us, but our environment. For example, child abuse and domestic violence greatly affect a person’s sense of safety and cause significant trauma. Being socially isolated or having low self-esteem are further barriers to letting you be you. So how do we escape this trap?

First, evaluate whether you truly lack the basic needs. We often misjudge what we actually need in life, choosing to focus on things that won’t bring us joy, such as gaining more material wealth than needed, or social attention. In retrospect, we may find that we already have everything we need to ascend to the next level.

Second, if something is in your control, take action to remove the obstacle. This might involve changing your perspective, modifying how you do things or communicating with another person why things are not working. If you are in a toxic relationship or a job that you loathe, you may have to leave them to let yourself progress. We have much greater power over our lives than we think, but our fears, doubts and social pressures convince us otherwise.

Third, remember that Maslow’s Hierarchy of Needs is not the absolute truth. Countless studies have shown that Maslow’s suggested order of priorities does not always apply in the real world, with many people opting to prioritise higher needs above basic needs, such as willingly staying hungry in order to pursue creative outlets, or giving up a secure, stable life in the pursuit of love. It may be difficult, but we can sometimes transcend the challenges of our environment through determination.

Maslow’s Hierarchy of Needs has been controversial in the field of psychology ever since its publication, but it is a good reminder that to achieve a happy, fulfilling life, we need to take stock of what we truly need in life and balance those needs against each other.

Holistic Medicine

At face value, medicine appears to work on a relatively simple model. You gather information through history taking, clinical examination and investigations such as lab tests and imaging. Then, you narrow down the differential diagnosis to the single most likely diagnosis. Lastly, you treat the diagnosis as per the recommended treatment guidelines.

But if you ask anyone who works in healthcare, they will tell you that this is not the whole truth. There are so many other factors and variables that play into the management of a patient that the model above does not address.

For example, you may diagnose a skin infection and prescribe antibiotics, but the person may not have enough disposable income to pay for the medications. You may come up with a plan for the patient to come in to clinic in a week’s time for a review, but they may not have transport or someone to look after their children so that they can come in. You may diagnose that there is nothing medically wrong with the patient, but they may still be worried that they have a serious condition that killed their father.

In medicine, you do not treat the disease; you treat the patient. It is easy to get so focussed on the clinical picture that the overall context is lost. This leads to incomplete care, which causes a variety of issues ranging from patient dissatisfaction to recurrent presentations.

Although it may seem difficult and time-consuming to pay attention to these extra details, it almost always pays off in one way or another. Addressing a patient’s troubled social situation may reduce the number of times they present to hospital, saving significant costs. The doctor taking the time to reassure the patient that their symptom is not concerning for a significant illness may let the patient sleep comfortably at night. Talking through the patient or their family’s concerns and questions might make the worst day of their lives slightly more tolerable.

This approach is useful outside of the hospital too. When you face a problem, regardless of the type, instead of trying to come up with a quick fix to patch it up, try to consider the context of the problem. You may discover that there is a deeper, more fundamental cause of the problem that needs fixing.

Frisson

Have you ever listened to a song or watched a scene in a movie where you suddenly feel a chill run through your body, giving you goosebumps? This is a well-recognised phenomenon called frisson (“shiver” in French). Frisson is colloquially known as “the chills”, thrills, goosebumps, or “skin orgasm”.

Frisson is described as a rapid, intense wave of pleasure, accompanied by tingling and chills spreading through your skin. It is typically triggered by an unexpected, sudden change in the dynamics of a musical piece. This may include a change in loudness, pitch or melody, unexpected harmonies, or an appoggiatura in the melody, where an accentuated note that does not fit the chord creates a clash. If a person is emotionally connected to the piece, such as having a fond memory associated with it, the intensity of frisson is heightened.

Scientifically speaking, frisson is the combination of the reward centre in your brain releasing dopamine, plus the activation of your autonomic nervous system. This results in pupil dilation, piloerection (goosebumps) and increased electrical conductance of your skin, similar to when you have an adrenaline rush.

It is likely the result of your brain being confused by an unexpected change from the predicted progression of the music, causing a strange blend between the pleasure of surprise and fear of the uncertain.

Not everyone experiences frisson. Studies show that around 55–85% of the population have felt frisson before. One study showed that those with the personality trait “openness to experience” have a higher chance of feeling frisson. These people tend to have more intense emotions and active imaginations, and are intellectually curious. One possible explanation is that you need to be in tune with your emotions and the present moment to appreciate the subtle but sudden dynamic changes that trigger frisson.

The potential joy of feeling frisson is yet another benefit of being mindful of your emotions and the present.

(Here’s a video of something that gives me frisson every time I watch it.)

Cobra Effect

While colonising India, the British government became concerned about venomous cobras causing a public safety issue in Delhi. To remedy the situation, they decided to use the populace as cheap labour by offering a bounty to anyone who brought in a dead cobra. They thought this would be a cost-effective way of reducing the cobra population.

The strategy was initially a success, with a huge number of cobras being killed for the reward. But then, something unexpected happened. People soon caught on that it did not matter where the cobras came from, as long as they were dead. They abused this loophole by breeding cobras and then killing them for even more reward. The British government eventually found out about this enterprise and decided to scrap the program.

With no reason to keep so many cobras, the breeders decided to release them. Ultimately, Delhi’s cobra population ended up larger than when the program was initiated.

This is the cobra effect. Sometimes, an idea may seem novel and efficient, but human psychology can easily turn it on its head and make a problem worse than before.

A similar, but much more macabre, phenomenon happened in Edinburgh, Scotland, in 1828. At the time, anatomy was a hot new field of research, so human cadavers were in great demand by universities, doctors and scholars. Due to a Scottish law stating that cadavers could only come from deceased prisoners, orphans and suicide victims, supply was very limited. Following the economic laws of supply and demand, the price of a human cadaver rose higher and higher. “Body snatching” became a popular crime, where people exhumed corpses from graveyards and sold them for a profit.

Two men by the names of William Burke and William Hare took things one step further. The two ran a lodging house, where a tenant passed away suddenly, while owing rent. To cover the owed amount, they stole the body before the burial and went to Edinburgh University, where they sold the body to an anatomist named Robert Knox. On hearing that bodies were in great demand and that they would be paid handsomely for any more cadavers, they hatched a sinister plan.

They realised that since their “clients” did not care where the bodies came from, they could easily source them through murder. Over the course of a year, they murdered at least 16 people at their lodge and sold the corpses to Robert Knox for dissection. Their method of choice was to wrestle the victim down and sit on their chest to asphyxiate them (now called “burking”), as strangling, choking or using a sharp instrument would reduce the corpse’s value due to the damage.

The pair were eventually caught and sentenced to death. Hare was later released, but Burke was hanged and, ironically, his skeleton was preserved and exhibited at the Anatomical Museum of the Edinburgh Medical School.

Analgesic Ladder

Quite possibly the most common condition that a physician needs to treat is pain. As the body’s main way of communicating that something is wrong, pain can take many forms. The best way to make pain go away is to treat the underlying cause, but the cause is often unclear and we need to manage the symptoms first.

Just as there are many kinds of pain, there are numerous different types of analgesics, or painkillers. Doctors and nurses take into account various factors to decide which analgesia to use, how much to give and how often to give it. For example, opioids (e.g. morphine) are among the most effective pain relievers, but they come with many adverse effects such as vomiting, constipation, drowsiness, slowed breathing and potentially death. To guide this decision-making, the World Health Organisation created the concept of the “Analgesic Ladder”, establishing some simple rules for appropriate analgesia administration.

The ladder has been adapted to accommodate for new research and advancing pain-relief methods, but the general principle remains the same.

First, simple non-opioid medications should be given orally and regularly. Almost always, the first-line analgesia is paracetamol (acetaminophen in the USA). It is an effective pain reliever, especially when taken regularly four times a day, while being extremely safe as long as the maximum dosage (4 grams/day) is not exceeded. As effective as it is, people often neglect to take it regularly as directed, or take it too late once the pain has progressed to a severe level, hence the common misconception that it is weak.

The next step among non-opioid medications is the non-steroidal anti-inflammatories (NSAIDs), such as ibuprofen or diclofenac. These medications work particularly well for musculoskeletal pain, muscle aches from viral illnesses and simple headaches. However, they are prone to causing stomach upsets, ulcers and kidney dysfunction, and can exacerbate asthma in some patients. They should be taken in conjunction with paracetamol, as the two have a synergistic effect. Because of their gastrointestinal side effects, NSAIDs are best taken after meals.

When paracetamol and NSAIDs are ineffective at easing the pain, a weak oral opioid such as codeine or tramadol is added in. These medications are powerful, but often have undesirable side effects such as nausea, vomiting, constipation, confusion and agitation.

As we step up the ladder, we introduce stronger opioids. These range from oral options such as Sevredol (morphine) and oxycodone to intravenous options such as IV morphine and fentanyl. As effective as these medicines are, they must be used with caution given significant adverse effects such as opioid narcosis, where a patient can stop breathing or enter a coma.
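Purely as an illustration, the stepwise escalation described above can be sketched in a few lines of code. The `LADDER` list and `step_up` function here are my own simplified stand-ins based on the rungs mentioned in this post, not clinical guidance or a real prescribing tool:

```python
# Illustrative sketch of the analgesic ladder's "step up" logic.
# Drug examples are simplified from the post; not clinical guidance.

LADDER = [
    ["paracetamol"],                                            # rung 1: simple non-opioid
    ["paracetamol", "NSAID (e.g. ibuprofen)"],                  # rung 1b: add an NSAID
    ["paracetamol", "NSAID", "weak opioid (e.g. codeine)"],     # rung 2: add a weak opioid
    ["paracetamol", "NSAID", "strong opioid (e.g. morphine)"],  # rung 3: add a strong opioid
]

def step_up(current_rung: int) -> int:
    """Move one rung up the ladder if pain remains uncontrolled."""
    return min(current_rung + 1, len(LADDER) - 1)

# Start at simple analgesia and escalate while pain persists.
rung = 0
while rung < len(LADDER) - 1:
    rung = step_up(rung)

print(LADDER[rung])  # the final rung: regular simple analgesia plus a strong opioid
```

The key design point the sketch captures is that stepping up never abandons the lower rungs: paracetamol and NSAIDs stay on board, and opioids are layered on top.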

Other than opioids, there are various other pain relief options that may be explored as adjuncts. Neuropathic pain from nerve damage is notorious for being opioid-resistant, so medications such as gabapentin or tricyclic antidepressants (traditionally used for depression) may be used. Ketamine is sometimes used as it has analgesic properties. A PCA (patient-controlled analgesia) pump with morphine or fentanyl may help optimise the timing of doses. Long-acting opioids such as methadone may be considered. Lastly, nerve blocks with local anaesthetic, such as epidurals, are often used in conjunction to reduce the need for opioids.

Pain is an extremely useful evolutionary tool as it allows us to avoid harm, but it can create just as many problems. The analgesic ladder helps health professionals better manage pain so that patients do not have to suffer as much while they are being investigated and treated.

Confirmation Bias

We hate to be wrong. When our beliefs, ideas and knowledge are challenged, we have a strong tendency to become aggressively defensive, going as far as attacking the other person personally. It is extremely difficult to change someone’s opinion because of this strong bias towards our own thoughts. This is confirmation bias.

The problem with confirmation bias is that it creates a vicious cycle, causing us to become more and more rigid in our thinking. Not only do we refuse to change our position when challenged by someone else, we actively seek out proof that we are right.

When we read or hear a piece of news or a fact, our brain has a tendency to automatically colour it according to our own beliefs. If it aligns with our beliefs, we take it as concrete proof that we are right. If it goes against our views, we work hard to prove that the article is flawed, such as claiming that the writer is biased, or we blatantly ignore it while demanding better evidence.

Social psychologist Jonathan Haidt eloquently describes this phenomenon with two questions.
When we like the proposition or fact, we ask: “Can I believe this?”. If there is even a single plausible reason, we give ourselves permission to believe it, as it reinforces our views.
However, when we don’t like it, we ask: “Must I believe this?”. Even a single, minor flaw is enough for us to discredit the new information.

This bias makes it difficult for our brains to consider alternative points of view. Furthermore, we now live in the Information Era, where abundant information is freely available, meaning we can easily find numerous opinions that align with ours, even when the majority consensus is against us. We choose to discuss ideas deeply only with people who think like us, while fighting tooth and nail against everyone else.

How do we overcome this incredible barrier? Like most cognitive biases, we cannot simply switch it off.

Perhaps the first step is acknowledging that we are very flawed beings that are prone to being wrong.

Then, we can catch ourselves asking “can I” versus “must I”. If we catch ourselves saying “must I believe it?”, then we should become critical of our own thinking and ask ourselves how we would respond if we instead asked the question “can I believe it?”.

At the same time, try to notice when other people are showing confirmation bias. Then, realise that is exactly how ignorant and obtuse you sound when voicing your own confirmation bias.

Finally, remember that it is okay to be wrong. If we never made any mistakes, then we would never grow. How boring would that world be?

Set In Stone

The Pont des Arts bridge in Paris is famous for being the site of “love locks”. Since 2008, tourists in love have been attaching padlocks inscribed with their names on the railings of the bridge. Millions of such locks have since been placed on the bridge, promising eternal love between the couple. Within 6 years, the total weight of the locks was already starting to cause structural damage to the bridge, with sections collapsing into the Seine River. In 2015, the locks were removed to conserve the historical site, but love locks continue to plague various historical sites and tall places around the globe.

People love to leave a mark. Whether it be a “Steve was here” on a wall or an “Alice + Bob” surrounded by a heart on a tree, graffiti has existed since ancient Greece. But why? What is the psychology behind couples wanting to immortalise their love in a lock, or people carving their names into wood or stone?

Perhaps it is because we know how fragile everything in life is. Life is full of uncertainties. We may die at any given moment. What we think of as true, eternal love may shatter as a result of our impulses or fade away with time. Even our identities and sense of self are unstable, for we do not really know who we are. 

This uncertainty scares us. We feel insecure that the things we love and that make us happy can disappear. So to soothe ourselves, we obsess over the idea of permanence. Because our love, our lives and our identities are intangible, we write our names into something tangible and (perceived to be) permanent.

But nothing is permanent. Bridges fall and walls crumble. A metal lock will do nothing to eternalise your love other than making you feel slightly secure for a moment. Instead, we should embrace the concept of impermanence.

By accepting that nothing is permanent, we can be more grateful for the transient moments of happiness and beauty in life, enjoying the present rather than trying to preserve the future.

Proust Effect

In his novel In Search of Lost Time, French writer Marcel Proust explored the power of smell in invoking memories. He tells a story of how a tea-soaked madeleine would trigger memories from his childhood. Proust called these involuntary memories, because they are not recalled on purpose, but automatically triggered by a sensory stimulus such as smell.

Our brain processes memory in a strange, abstract way. Because it doesn’t record memories like a photograph or video, memories become unreliable the older they are. We have very limited memories of our childhood, unless they are paired with specific emotions or memorable events.

Smell triggers involuntary memories because the part of the brain that senses smell, the olfactory bulb, lies right next to the hippocampus and amygdala. These sections of the brain handle memory and emotion respectively, so there is a theory that we form memories linked to different smells, especially emotional ones. There is also some research suggesting a phenomenon called the reminiscence bump, where we tend to recall more triggered memories from adolescence and early adulthood. This may be because these are the years when we form our self-identity.

This may be why smells of certain dishes or baking may act as powerful mediums to recall treasured childhood memories, such as the love we received from our parents. Even as adults, we all have specific dishes that we crave to comfort us when we are feeling stressed or lonely. More often than not, these dishes will have a story behind them, whether you remember it consciously or not. When we smell the dish being prepared, we become drowned in nostalgia. The emotions of happiness, safety and love linked to these memories distract us from the pains of life for just long enough that we can have the strength to make it through another day.

Proust talked about a tea-soaked madeleine being his key to his memories. What food is the proverbial madeleine to you?

The Importance Of Television

Fire is considered one of the most important discoveries in the history of our species. Since the dawn of time, it has provided us with warmth, light, cooked food and the power to invent even more things.

We can see how important fire was to our ancestors from how integral it was within a house. In prehistoric times, there would always be a fire at the centre of a cave or hut, where the family could gather around for warmth and light. Here, they would warm themselves on a cold winter’s day and cook the meat they had hunted during the day to tenderise it.

Unlike the old days, we no longer have open fires in the house. Instead, fire has been split into three different forms.

  • Instead of huddling around an open fire for warmth, we have boilers and hot water cylinders to warm our houses.
  • Instead of cooking our food over a campfire, we have gas or electric stoves and ovens.
  • Instead of the flickering flames providing us with light and distraction, we have television and computers.

Of course, we still have fireplaces, barbeques and candles, but the modern person tends to rely more on modernised versions of fire.

An interesting takeaway from this theory is that television is the modern form of the psychological comfort that fire once provided. In prehistoric times, people would struggle to stay alive, running from predators and hunting to feed the family. Gazing mindlessly at the fire at the end of a hard day’s work would have been a way to destress and unwind.

Nowadays, most of us are lucky enough to not have to fear death on a day-to-day basis, but we still suffer constant stress from the busy modern life. Perhaps sitting in front of a television or computer to procrastinate for half an hour is not the worst thing in the world.

That said, everything should be done in moderation. It is good to relax for a set amount of time, but if you spend every evening after work staring at a screen without an original thought, your mind will dull and atrophy.

So, it is good to balance out the mindless entertainment such as comedy or reality shows with films that provoke thoughts and emotions, documentaries that provide you with knowledge, and shows that stimulate your creativity.

Most importantly, what you think and feel and learn after watching these should act as fodder for conversations that help deepen your connection with other people.

Voodoo Death

We inherently fear death. Much of what we do biologically is a struggle against death. We eat and drink to sustain ourselves. We feel pain to avoid things that may eventually kill us. Even moments before our death, our brain will flash our life before our eyes to grasp at any past experience that may help us survive.

Because of this, we are also inherently neurotic. Some fear flying because they can imagine the plane crashing and burning, even while knowing that flying is safer than a car ride. Childhood traumas where we thought we might die cause long-lasting damage to how we behave and think as adults.

The most interesting example of how the fear of death can affect us is the phenomenon of voodoo death.

American physiologist Walter Cannon published a paper in 1942 studying cases of “voodoo death” – where healthy people (usually from tribal societies) suddenly passed away after being cursed. Voodoo death starts when a person is cursed or condemned to die by a medicine man, such as a witch doctor or shaman. The victim and those around them must believe that the curse will actually kill them (due to their culture or tradition). The victim’s family may even prepare a funeral. The victim loses all hope that they can survive the curse. They then die, even though their body shows no signs of physical ailment.

For example, the Australian Aborigines are known to have practised “bone pointing”, where a witch doctor would point a cursed bone at an enemy, causing the victim to immediately convulse and die. A Nairobi woman passed away within 24 hours of finding out that the fruit she had eaten was sacred and that she had committed a great sin. A Maori man, who was told he should never eat wild game meat, died a day after learning that he had accidentally eaten it – even though he had eaten it two years earlier.

Voodoo death is not limited to pre-modern societies. In the 1990s, there was a documented case of a patient diagnosed with terminal metastatic oesophageal cancer. After saying his goodbyes to his family, as was his last wish, he swiftly passed away. On autopsy, it was discovered that the cancer had not actually spread that much and was not the cause of death.

There are many theories as to what may cause voodoo death. The traditional thought was that intense fear and stress stimulates the release of catecholamines such as adrenaline, inducing a massive fight-or-flight response, as seen in broken heart syndrome. The surge of adrenaline causes the heart to beat too fast and too strongly, until it gives out and causes cardiac arrest.

However, more recent studies showed that animals that die from stress exhibit signs of the opposite happening – that is, overactivity of the parasympathetic nervous system (responsible for the common type of fainting spell called vasovagal syncope). Because the parasympathetic nervous system has the opposite effect to the sympathetic (fight-or-flight) system, it can slow the heart to the point of stopping.

This parasympathetic overactivity may be triggered by a sense of absolute hopelessness, essentially causing the body to “give up” on life. On a related note, the hopeless victim will likely not be eating or drinking much while under extreme emotional duress, so dehydration and catatonia may play a role as well.

Voodoo death is an excellent example of how much power the mind has over the body. Ironically, the fear of death itself can cause death.