Posted in Psychology & Medicine

Egg Of Columbus

After returning to Spain following his discovery of the New World, Christopher Columbus was dining with some nobles. One noble approached him and said:

“Even if you had not discovered the West Indies, another fine Spaniard would have gone to discover it anyway.”

Columbus did not respond and merely smiled. He then asked for an egg, placed it on the table and said:

“I bet that no one can make this egg stand by itself.”

All the nobles tried and failed; the egg kept toppling over. Columbus then took the egg and tapped it on the table so that one end was cracked and flattened. The egg now stood on its flattened base.
Although the nobles complained that they too had known the solution, the message was loud and clear: once the feat is done, everyone knows how to do it.

This is known in psychology as the historian’s fallacy – a logical fallacy that can be summarised in the words: “I told you so”. Essentially, people assume that those in the past had the same information we have now, or that they themselves would never have made the same mistake in that situation. It is another example of cognitive dissonance: the brain finds a conflict between a problem and the information that could have prevented it (information the other person did not have at the time), so it immediately convinces itself that it would have made the right decision, because it already knows the answer. This makes us almost incapable of putting ourselves in other people’s shoes. We label those people as idiots, because they apparently had the same information (they did not) and still could not make the right decision.

People never realise that it is only with the foreknowledge we have now that the Americans could have known about Japan’s plan to attack Pearl Harbour, or that Germany’s invasion of Russia was doomed. Although we say that “those who cannot remember the past are condemned to repeat it”, we have a tendency to think that people in the past were stupid and that we would never make the same mistakes.

Hindsight is 20/20.

Posted in Science & Nature

Murphy’s Law

In 1947, an aerospace engineer named Edward A. Murphy Jr was involved in high-speed rocket sled experiments led by the US Air Force. The aim of the experiments was to research the effect of sudden deceleration on the human body, so as to improve the safety of jet fighter pilots. To study this, a flight surgeon named Dr John Stapp devised a “sled” attached to a rocket that ran along a long track. The rocket would propel the sled to massive speed and brakes would then induce a sudden deceleration. However, the team found that the machines used to measure the G-force (the force of deceleration relative to the force of gravity) were unreliable. Murphy proposed using electronic strain gauges attached to the harness of the test subject to measure the G-force – something he had learned while working with centrifuges.

The idea was great but there was one problem: the gear kept failing, showing no reading whatsoever. Murphy soon found that the sensors had been attached correctly but wired backwards. This simple mistake frustrated Murphy, who blamed the incompetence of his assistant, stating that “if that guy has any way of making a mistake, he will.” This became the famous Murphy’s law, now simplified to “anything that can go wrong will go wrong”.

Murphy’s law went on to play a fundamental role in defensive design, where the worst-case scenario is always assumed and prepared for. Thanks to this approach, the rocket sled experiments were successful, and in 1954 Dr Stapp became the fastest man in the world – travelling at a speed of 1011 km per hour and decelerating at a force of 46G (it had been hypothesised that a human being could not survive past 18G). Not only did he survive (albeit with broken limbs, broken ribs, hernias, a detached retina and temporary blindness), Dr Stapp went on to build bigger rockets to further test the limits of the human body.
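
For a rough sense of the physics, here is a minimal sketch in Python of the deceleration involved. The 1.4-second stopping time is an assumption (a commonly cited figure for Stapp’s run, not stated in this post); the 46G was the instantaneous peak, which far exceeds the average because the braking was not uniform.

```python
# A rough sanity check of the G-forces involved (illustrative only).
G = 9.81  # standard gravity in m/s^2

def average_g(speed_kmh: float, stop_time_s: float) -> float:
    """Average deceleration, in multiples of g, braking from speed_kmh to rest."""
    speed_ms = speed_kmh / 3.6             # convert km/h to m/s
    deceleration = speed_ms / stop_time_s  # average deceleration in m/s^2
    return deceleration / G

# 1011 km/h with an assumed 1.4 s stop gives ~20.5G on average;
# the instantaneous peak (~46G) was far higher.
print(f"average: {average_g(1011, 1.4):.1f}G")
```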

Interestingly, there is another side to Murphy’s law involving psychology. People suffer from a fallacy called the appeal to probability, where they believe that because there is a possibility that something can happen, it will happen. The brain is surprisingly inefficient at dealing with probabilities: it tends to ignore how minuscule a probability actually is and instead focuses on the absolute fact that there “is” a probability. This is the best explanation for why people are compelled to buy lottery tickets and why every student believes they will grow up to be rich and successful.
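
The gap between “possible” and “probable” is easy to demonstrate with arithmetic. The sketch below, in Python, uses hypothetical round numbers – not the odds or prices of any real lottery: even with an enormous jackpot, the expected value of a ticket is negative.

```python
# Expected value of a lottery ticket (all figures hypothetical).
ticket_price = 2.00        # dollars
jackpot = 10_000_000.00    # dollars
p_win = 1 / 14_000_000     # probability of winning the jackpot

# On average you lose money, even though a win "is" possible.
expected_value = p_win * jackpot - ticket_price
print(f"expected value per ticket: ${expected_value:.2f}")  # about -$1.29
```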

Posted in Psychology & Medicine

Werther Effect

Suicide is the third leading cause of death among young people in the developed world. Every year, approximately one million people take their own lives – far more than the number who die from liver disease, Parkinson’s disease or even homicide. Despite being one of the most preventable causes of death, suicide still plagues society.
Among the many factors contributing to suicide attempts (mental disorder being the major one), one of the more interesting ones is mass media. The effect of mass media on suicide rates can be traced back as far as 1774.

In 1774, Goethe wrote a novel called The Sorrows of Young Werther, in which the hero shoots himself after an ill-fated love affair. Shortly after publication, there were many reports of young men using the same method as Werther to commit suicide. There were even reports of people dressing up like Werther (yellow pants and blue jacket) or leaving the book, open to the passage describing his death, beside their bodies. After this, the book was temporarily banned to stop the “epidemic”. Since then, the phenomenon of copycat suicides has been called the Werther effect.

The human brain is primed to dwell on the information it receives. This applies to suicide as well: people with mental disorders such as depression and bipolar disorder are more prone to suicidal thoughts after hearing stories about it. The effect is amplified by the media’s tendency to glorify or beautify such deaths (as the subject is often a celebrity or a fictional character), causing some people to subconsciously believe that suicide is acceptable. In essence, the Werther effect is a form of peer pressure in which cognitive dissonance leads people to act irrationally because others in society appear to be doing the same thing.

The Werther effect is surprisingly effective in predicting an increase in suicide attempts after the publication of news regarding a suicide. On April 8, 1986, a Japanese singer called Yukko Okada, only 18 at the time, committed suicide by jumping from the seventh floor of her recording studio. Given her popularity, the media were all over the story like hungry wolves, reporting the tragic death in every form possible. Within two weeks, 33 young people (including one nine-year-old) had killed themselves – 21 by jumping from buildings. This episode was dubbed the Yukko Syndrome and is one of the most famous cases of the Werther effect in modern society.

As in the original case of the Werther effect, the suicide can be fictional and still cause an increase in suicide rates. A German television show called Death of a Student depicted the railway suicide of a young man at the start of every episode. After it began airing, railway suicides by teenage males increased by 175% in Germany. Curiously, there was no increase or decrease in suicide rates via other methods, suggesting that the Werther effect not only affects the choice of method, but also induces suicidal thoughts in people who had not planned on killing themselves.

In 1987, a campaign in Vienna to inform reporters about the Werther effect and the media’s role in suicides led to a dramatic drop in the reporting of suicides. This was followed by an 80% drop in subway suicides and non-fatal attempts, along with a decrease in the total number of suicides.

The Werther effect is a fine example of how words can kill.

Posted in Psychology & Medicine

Connoisseur

On May 24, 1976, a British wine merchant called Steven Spurrier organised a wine competition to determine the top wines from different areas of France and California. The panel of French judges were all wine connoisseurs who would blind-taste the wines to give an objective rating. The event, later called the Judgement of Paris, was a turning point in wine history and also illustrates a fascinating point about the arts.

Every judge (including Spurrier himself) predicted that the French wines would trump the Californian wines in every field. For how could Californian wine – with a history of only a century or so – beat top-quality, traditional wine from France, famous for its wine since the 6th century BC? Even after the tastings, the judges were confident that the wines they had given the top ratings were indubitably French. Unfortunately, they were wrong.

Californian wines were rated best in both the red and white categories, critically damaging the reputation of French wine and the validity of wine tasting (even after several complaints, adjustments and re-tests, the Californian wines still came out on top).

People believed that French wine would be of better quality because of the stereotype that French wine is the best. The experiment showed that there is no real basis for this stereotype. The real reason people pay more for wine from French vineyards, then, is not that it tastes better, but that they want to appear classy and well-cultured. The same may apply to the price of wine in general – people buy the more expensive wine believing that it must be better than the one that is $5 cheaper.

Another experiment highlights how perceptions of class can affect the choice of wine. It has been shown that people buy more expensive wine in supermarkets when classical music is playing than with any other genre. The classical music lends an air of high class, leading people to choose their wine accordingly.

The same phenomenon is found in art. There have been numerous cases where art critics acclaimed a piece of abstract art, believing the artist to be the next Jackson Pollock, until they found out it had been painted by a two-year-old child or an elephant.

In short, high class is a completely subjective term with absolutely no practical value – other than giving the person a false, pompous feeling of superiority. What matters in art is not whether it is “good”, but whether you enjoy it.

Posted in Psychology & Medicine

Monkeynomics

Money is without a doubt a human invention. There are no recorded cases of any animal using an inanimate object to standardise the value of items and establish a non-bartering economy. Since childhood we learn of the value of money and how it can be used to purchase goods and services. In fact, money can be considered one of the fundamental pillars of human society that makes the world go round.

However, scientists have discovered that a currency system may not be such a uniquely human invention after all. In an experiment, a group of capuchin monkeys were given silver disks and shown how they could use the disks as payment for treats. Within a few months, the monkeys had realised that the disks held value and began acting just like humans with money.

For example, they did not simply respond in the standard manner of operant conditioning (i.e. performing an action results in a reward), but responded accurately to market forces. If the “price” rose, their demand for treats would fall (i.e. they bought less) and vice versa – following the law of demand that modern economics is based on, as sketched below. They even learnt to save the “money” to afford treats.
In a similar experiment with chimpanzees, it was found that chimps were even quicker at learning the concept of money, and even learnt how to use smaller denominations of currency.
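
As a rough illustration of the law of demand the monkeys were following, here is a minimal sketch in Python with a made-up demand schedule – the budget and prices are hypothetical, not data from the actual experiment.

```python
# Law of demand: quantity demanded falls as price rises.
# The numbers below are invented for illustration.

def treats_demanded(price_in_coins: int, budget: int = 12) -> int:
    """Hypothetical demand schedule for a monkey with a fixed coin budget."""
    return budget // price_in_coins

for price in (1, 2, 3):
    print(f"price {price} coin(s) -> buys {treats_demanded(price)} treats")
# price 1 coin(s) -> buys 12 treats
# price 2 coin(s) -> buys 6 treats
# price 3 coin(s) -> buys 4 treats
```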

Things started to get interesting when a certain monkey sneaked into the chamber storing the “coins” and threw them into the communal cage, quickly escaping before the researchers came back. This was the first recorded case of a monkey bank robbery.
While this was happening, it was also observed that one male monkey gave a female monkey a coin. The researchers wondered if this was an act of altruism or wooing, but soon discovered that the female would receive the coin, have sex with the male, then later use the coin to buy food. The introduction of money had immediately led to the invention of prostitution.

Posted in Psychology & Medicine

Korsakoff’s Syndrome

It is a well-known fact that excessive drinking leads to a so-called “blackout”. This form of memory loss is common and is not in itself a major illness. However, there is another disease associated with excessive drinking called Korsakoff’s syndrome. Strictly speaking, it is caused not by alcohol but by a thiamine (vitamin B1) deficiency, and it is commonly found in alcoholics and malnourished patients (it has also been reported after mercury poisoning and centipede bites in Japan).

The six characteristic symptoms of this syndrome are: anterograde (cannot form new memories) and retrograde (cannot remember old memories) amnesia, confabulation, lack of detail in conversation, lack of insight and apathy.

Korsakoff’s syndrome patients show a very peculiar behaviour. As stated before, they suffer from both anterograde and retrograde amnesia, so not only can they not remember the past, they cannot form new memories either. The brain therefore uses information from its surroundings to try to recreate the lost memories, and the result is confabulation. Confabulation is essentially what happens when the brain fills in the blanks in memories with false information. It is seen in everyday life in healthy people too, but in Korsakoff’s patients the effect is significantly more profound. For example, if you ask a patient what she did yesterday, she may look at your horse-print tie and claim she went horse-riding. If you ask the same question an hour later, without your tie and instead holding a book with a photo of a Ferris wheel on the cover, she will state that she was at the amusement park. As one of the leading causes of amnesia and confabulation, Korsakoff’s syndrome should be suspected in any alcoholic or severely underweight patient who keeps changing their story.

As previously explained, the disease is caused by thiamine deficiency – therefore, the treatment is to administer thiamine. If the syndrome has persisted for a long time, however, the brain injury may be permanent. Treating the underlying alcoholism or malnutrition is also important.

Thiamine deficiency can also cause a closely related disease called Wernicke’s encephalopathy; together the two are known as Wernicke-Korsakoff syndrome. In addition to the above symptoms, the patient may experience confusion, tremors, nystagmus, paralysis of the eye muscles and ataxia, and the disease can progress to coma and eventually death. All because of a deficiency of a single vitamin.

Who said nutrition is not important?

(NB: Dory from Finding Nemo is one of the most accurate portrayals of amnesia in films)

Posted in Psychology & Medicine

Zombie

Clairvius Narcisse died in Haiti on May 2, 1962. In 1980, he returned to his hometown. Alive.
How did a man who was dead and buried come back to life?

According to Clairvius, he had been cursed by a bokor (sorcerer) to become a zombie, and returned home only after the curse was undone. The sorcerer had enslaved him on a sugar plantation for 16 years, where many others worked as “zombie slaves” until they revolted, killed the sorcerer and ran away.
Harvard ethnobotanist Wade Davis investigated this case extensively. According to his research, most “zombies” were placed in suspended animation to fake death and were then (often after being buried) kept in a drug-induced psychosis by the sorcerer. Many Haitians believe in the ancient African religion of voodoo, in which one legend says that when a sorcerer curses a person, they are revived after death to become the sorcerer’s slave. Thus, Haitians strongly believe in the legend of zombies. In reality, the sorcerer was using drugs to zombify people, and Davis used his expert knowledge of botany to deduce what the chemicals were.

The so-called zombie powder was a combination of tetrodotoxin (TTX, blowfish poison) and datura (from the poisonous plant Datura stramonium). The TTX simulates death through its paralytic effect, while datura is a powerful hallucinogen that causes the person to confuse reality and fantasy (dissociation). Datura can also cause memory loss, which allows the sorcerer to manipulate the victim easily. By maintaining the datura dose over time, the sorcerer could enslave someone for a long period. However, the zombification is not perfect mind control and is closer to a strong hallucination or hypnosis (as seen in the above-mentioned revolt).

As it involves the handling of poisons, only an experienced sorcerer could mix the right doses while avoiding the lethal dose. Although science has advanced greatly, there are still many things we can learn from magic and sorcery – for magic and sorcery are simply undiscovered science.

Posted in Psychology & Medicine

Pygmalion Effect

In 1968, Robert Rosenthal, a social psychology professor at Harvard University, and Lenore Jacobson, a primary school principal with 20 years’ experience, gave an intelligence test to students at a primary school in San Francisco, then randomly chose 20% of the students in each class. They gave the list of those students’ names to the teachers and convinced them that these were “students with a high likelihood of improving their intelligence and career success”. Eight months later, they administered the same intelligence test and found that the students on the list had improved significantly more than the other students on average. Not only that, but those students pulled up the scores of the whole school. The key factor was the expectations and encouragement of the teachers. The study showed that the expectations a teacher places on their students have a real effect on improving their grades.

The Pygmalion effect can be summarised as the phenomenon whereby a person’s performance improves because of the expectations and interest of another person.
The eponymous story comes from Greek mythology and concerns a sculptor named Pygmalion. After seeing so many women behave immorally and vulgarly, he could no longer see beauty in any woman, so he instead sculpted the most beautiful woman he could out of ivory. After finishing his sculpture, he gazed upon its face and instantly fell head over heels for it. Every day Pygmalion would caress, stroke and truly love “her”. But being a statue, she could not return his love, and he grew sadder and sadder. He went to the Temple of Aphrodite and begged the goddess to help him achieve his true love. Upon returning home, he kissed and touched the sculpture as on any other day. And lo and behold, every part of the sculpture his hands touched turned from hard ivory to soft, fair skin, until the sculpture had become a gorgeous woman. Thus, thanks to Aphrodite’s grace, the two lived happily ever after in love.

The Pygmalion effect is extremely useful in everyday life. When parents and teachers believe that a child has talent, they spend more effort growing that talent and the child ends up more successful. The simple act of showing interest in the child promotes optimism, and the child works harder to meet those expectations. The child receives no extra compliments or rewards, but their performance improves regardless, thanks to their parents’ and teachers’ belief in them.
Similarly, when a boss shows passion towards and has great expectations of an employee, the employee’s performance will improve. The Pygmalion effect is particularly powerful in relationships: if two people love each other and are good to each other, their love will naturally deepen and they will become happier.

Unfortunately, people have a tendency to underestimate the power of love and are unable to utilise this great effect. Therefore, children and employees are often plagued by the golem effect (the phenomenon of low expectations causing a fall in efficiency) instead.

Posted in Philosophy

Prisoner’s Dilemma

The prisoner’s dilemma is a famous example of how game theory works. It predicts the behaviour of two people who must choose between cooperating with each other and betraying each other. The story goes as follows:

Two accomplices in crime are arrested by the police and interrogated in separate rooms. As the police have insufficient evidence, they offer each prisoner a deal: confess that the two of them committed the crime, or deny it. The deal is:

  • If you confess and your partner denies taking part in the crime, you go free and your partner will serve ten years (and vice versa).
  • If you both confess you will go to prison for four years each.
  • If you both deny taking part in the crime, you both go to prison for two years.

Assuming the prisoners act rationally (i.e. in their own best interest, minimising their jail time), each prisoner will obviously choose to confess, as it is the better choice no matter what the partner does (0 years instead of 2 if the partner denies; 4 years instead of 10 if the partner confesses). However, because both prisoners reason this way, the result is almost always that both confess and serve four years each. Because human beings are unable to trust one another enough, they end up with a collectively irrational outcome (benefit not maximised).
If the two had trusted each other and both denied the crime, each would have served half the time. But people always assume (often correctly) that the other person will betray them for selfish gain, and this win-win result is unattainable.
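
The logic can be made explicit in a few lines of code. Here is a minimal sketch in Python encoding the jail terms from the deal above; it shows that confessing is the dominant strategy – the best response no matter what the partner chooses.

```python
# Years I serve, indexed by (my choice, partner's choice); lower is better.
PAYOFFS = {
    ("confess", "confess"): 4,
    ("confess", "deny"): 0,
    ("deny", "confess"): 10,
    ("deny", "deny"): 2,
}

# Whatever the partner does, confessing gives the shorter sentence.
for partner in ("confess", "deny"):
    best = min(("confess", "deny"), key=lambda me: PAYOFFS[(me, partner)])
    print(f"if partner plays {partner!r}, my best response is {best!r}")

# Both prisoners reason this way, so both confess and serve 4 years each,
# even though mutual denial (2 years each) would be better for both.
```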

But what if the other prisoner were you? Let us assume the prisoner’s dilemma is played by you and an exact copy of you – a copy that thinks like you, acts like you and is identical to you in every single way. Can you trust yourself? Do you trust yourself enough to deny the crime, when it is entirely possible that your copy will rat you out and walk free while you suffer for ten years? How do you know that your copy loves you more than itself?

Your greatest enemy is you.

Posted in Psychology & Medicine

Cuteness

The word cute is used in many different contexts: a girl saying a guy is “cute” could mean that she finds him attractive, while a guy saying a girl is “cute” may imply that he finds her lovable but not attractive. But essentially, cuteness can be described using the concept of neoteny.

What is neoteny? It is a concept in evolution whereby adults of a species retain the traits of youth; the end result is a mature organism that appears immature. A good example is the axolotl, which keeps its juvenile aquatic form (gills and all) into adulthood, instead of metamorphosing into a land-dwelling adult like other salamanders.
It has even been hypothesised that human beings originated as neotenised chimpanzees, as a baby chimp bears a striking resemblance to a human.

Cuteness and neoteny have an extremely intricate relationship. It is common knowledge that one of the most powerful attractors of care from adults is cuteness. Almost every infant organism has a “cute” appearance that makes people instantly feel warm and fuzzy. Ergo, being cute (i.e. neotenous) is a survival advantage, as the young are cared for longer until they mature. This simple concept has led to the lengthening of childhood in humans, as children require a long time under the care of adults while they absorb knowledge and learn how to function in society. It also solved the problem of babies being born with immature brains (a more mature head would be too large to fit through the birth canal) while still having a chance at survival.

Studies have shown that people with cuter faces are seen in a more positive light, are more likely to be hired and are less likely to provoke aggression in violent people (the human brain appears wired to inhibit aggression when faced with cuteness, presumably to reduce child abuse and improve survival). In short, cuteness invokes maternal or paternal love and triggers a sudden urge to protect the cute thing.

This leads to another advantage of cuteness: attractiveness. Although beauty and cuteness are almost diametrically opposed, many men (and women) find “cuteness” appealing in the opposite sex. This is likely related to the brain confusing parental love with romantic love. A youthful look is also associated with fertility, which greatly influences a man’s subconscious choice of partner. However, it is also true that because of this effect, a man may see a cute girl only as a “little sister” figure he needs to protect, rather than as a potential love interest.

So what makes a person cute? As stated above, it is the traits of neoteny – in other words:

  • large eyes
  • small nose 
  • small jaw and teeth 
  • flattened and rounded face 
  • large brain/forehead (causing the eyes/nose/mouth to be lower on the face)
  • hairless face and body 
  • limbs shorter than torso length 
  • legs longer than arms
  • upright posture

These characteristics are commonly used in animation and cartoons to boost the audience’s affection towards a character. This is especially the case in Japan, where a cultural obsession with cuteness is clearly evident.