
Agonal Breathing

When a person is on the verge of death, they may show a very strange pattern of breathing. They will begin gasping for breath, taking deep, laboured breaths, making strange noises and possibly having muscle jerks (which may look like a seizure). The breathing makes it look as if the person is taking a deep breath and sighing, with irregular gasps every now and then. This is called agonal breathing, and it is most likely caused by an oxygen-starved brain sending weak signals in an attempt to kick-start the respiratory drive and take in more oxygen.

Agonal breathing is not uncommon in cases of cardiac arrest. It is important to note that agonal breathing is not an efficient form of breathing, so the victim cannot be said to be “breathing” when this occurs. Because it looks like the patient is taking deep breaths, bystanders may be fooled into thinking that they have been resuscitated and have begun breathing again. This is not the case – the patient is still clinically dead. Ergo, one should not stop CPR even if the patient begins taking deep, sighing breaths. That said, the presence of agonal breaths usually indicates a better outcome for the patient.

(Link to video examples of what agonal breathing looks like: http://emsbasics.com/2011/04/21/what-it-looks-like-agonal-respirations/)


Placebo Effect

A strange phenomenon found in medicine is the placebo effect, where a patient’s symptoms improve after they are given a completely inert substance (like a sugar pill) under the guise of a medication. The placebo effect is not limited to pills; it extends to any procedure that is intended for a therapeutic purpose but has no actual therapeutic value. It is believed that the placebo effect is a strong component of many forms of alternative medicine, such as homeopathy and faith healing. The placebo effect has been shown to improve or even resolve the symptoms of conditions such as allergies, asthma, headaches and abdominal pain, and even of severe illnesses such as heart attacks and cancer. Placebos are particularly effective for psychological symptoms.

There has been much research to determine how the placebo effect works. The leading theories so far are that placebos act to relieve anxiety and condition the patient into a more positive mindset, reducing stress and boosting the body’s natural healing process. This would also explain why placebos are effective in pain relief as perceived pain is amplified by negative emotions. Cognitive dissonance may also play a role, where the patient’s mind believes that since it is receiving treatment, it must be getting “better”, producing a beneficial psychosomatic reaction. Essentially, fooling the mind to believe and expect that it will get better makes the patient actually feel better.

Research into the placebo effect has also revealed some bizarre characteristics of the effect. For example, it has been found that the placebo effect is stronger if there are more pills, or if the pill is larger, branded or generally fancier-looking. Even colour plays a role, with blue pills working better as depressants (“downers”) and red pills working better as stimulants (“uppers”). Telling a patient that a placebo will have a certain effect boosts that effect. Human factors such as the doctor’s credibility and confidence, or the patient’s expectations and culture, are known to drastically change the efficacy of a placebo. Weirder still, studies have shown that telling a patient that they are being prescribed a placebo does not affect its efficacy, as long as they are told that “it could help them”.

The placebo effect is a great example of how much influence our mind, beliefs and expectations have on our health and our lives. The more positive thoughts and beliefs we have, the healthier we become. The more negative we are, the less effective treatments become. In fact, the same pill that gives people the placebo effect can be used to increase pain and symptoms if it is described in a certain way. This is known as the nocebo effect – the opposite of the placebo effect.



Empty Nest Syndrome

When children grow up and learn to become independent, parents must let go of them and allow them to fly free. However, it is an inevitable part of the human condition that parents are saddened by this change. For many parents, their children moving out can lead to depression and a loss of purpose. This phenomenon has been named empty nest syndrome, as it occurs when the children leave the metaphorical nest that is home.

As obvious as it sounds, empty nest syndrome can have a serious effect on a parent’s well-being. Common symptoms include depression, loss of purpose, anxiety, stress and a feeling of rejection. The suffering parent obsesses over whether they brought up their child in a way that prepared them for the big world. At the same time, they feel that they are losing their identity as a “parent” – something they may have defined themselves by while bringing up the child. They may also feel rejected, believing that the child “does not need them anymore”. It has been observed that mothers are more likely to suffer empty nest syndrome (occasionally, menopause may be a confounding factor). Other contributing factors include finding change difficult, having an unstable relationship with one’s spouse, or an unhealthy obsession with the children or with the idea of being a parent.

Empty nest syndrome is a natural part of parenthood, but it is important to know how to prevent it from becoming too severe. The best way to cope with this syndrome is to keep in touch with the children and accept that they are young adults who are moving on with their life. Not only that, but the parents must also recognise that a new era has begun for them as well. This is important as failure to do so will lead to the identity crisis mentioned above. A good way to remedy this is through discovering hobbies and interests while maintaining healthy social networks with other people. Essentially, the parents have to “begin a new life”, just like their children. It is also worth noting that it helps if the children recognise this as well and try to keep in touch with their parents to make sure they are coping well without them.


Urine

Despite its off-putting nature (especially the smell), urine is one of the most important types of “samples” used in medicine for diagnostic purposes. Like blood, urine can tell a lot about a person’s health and whether they have a certain disease or not.

One of the earliest recorded uses of urine as a medical test was for the detection of diabetes mellitus. People noticed that the urine of a diabetic would often smell quite sweet, and also taste sweet (it is uncertain how they came to test urine this way). This is because a diabetic has too much glucose (sugar) in their blood, causing it to spill over into the urine as the kidneys become saturated. In fact, the words diabetes mellitus mean “passing through” (referring to the symptom of frequent urination) and “honey-sweet”. A completely unrelated disease called diabetes insipidus also causes frequent urination, but the urine does not taste sweet, hence “insipidus” (tasteless). This type of etymology is also seen in countries like Korea, China and Japan, where the word 당뇨(糖尿) literally means “sugar urine”. Although we no longer taste urine, it is still used to gauge how much damage diabetes has done to the kidneys, by measuring the amount of protein in the urine.

There are many other tests one can do with urine to check for certain diseases. The chemical composition of urine tells us about the hydration status of a person, while giving away clues to diseases that cause electrolyte imbalance. It also gives some indication of how well the kidneys can do their job of concentrating urine. Certain markers such as white blood cells and bacteria in the urine can indicate a urinary tract infection. Certain antigens in the urine can point towards a particular type of bacteria as the cause of a patient’s pneumonia, while the hormone βhCG in the urine shows that a woman is pregnant. Looking for proteins or sediments in the urine can be diagnostic of certain kidney diseases such as glomerulonephritis. Even rare diseases such as phaeochromocytomas can be diagnosed from the level of catecholamines in the urine (though this is beyond the scope of this post).

A more interesting part of urinalysis is looking at the colour of the urine. Urine is usually yellow, ranging in darkness depending on how concentrated it is. But when there are other things in the urine, the colour changes. Reddish urine suggests blood (which, contrary to what TV shows suggest, is not an indicator of kidney failure), and can be caused by trauma, UTIs, kidney stones or some other disease. Brown urine could be due to muscle breakdown somewhere in the body. Urine can appear very dark if the person has an illness called obstructive jaundice. Eating beetroot can turn your urine bright red, while medications can change your urine colour to anything from red to orange to green. Murky or cloudy urine (with an offensive smell) may suggest a UTI.

Perhaps the most interesting urine colour known in medicine is purple. This unique colour is produced in a rare genetic disease called acute intermittent porphyria. If urine is collected from a patient suffering an attack of AIP (which causes crippling abdominal pain) and then left in the sun or under a UV light, it will turn purple due to porphyrin precursors in the urine. Because of this, urine collected to test for AIP is wrapped in tinfoil before being sent to the lab (where the chemicals are measured) to limit light exposure.

(Also read the article on how different colours of skin can be of diagnostic importance: http://jinavie.tumblr.com/post/32313894252/skin-colour)


Phineas Gage

On September 13, 1848, a 25-year-old foreman named Phineas P. Gage was working on a railroad with his work team. In an unfortunate turn of events, as he was using a tamping iron (a large iron rod with a pointed end, measuring 3 feet 7 inches in length and 1.25 inches in diameter) to pack gunpowder into a hole, the powder detonated. The forceful explosion drove the metal pole skyward through Gage’s left cheek, ripped through his brain and exited through the top of his skull, landing dozens of metres away. His workmates, who presumed he had been killed on the spot, rushed to his assistance and were surprised to find that he was still alive.


In fact, Phineas Gage spoke within a few minutes of the incident, walked without assistance and returned to his lodging in town without much difficulty – albeit with two gaping holes in his head, oozing blood and brain everywhere. He was immediately seen by a physician, who marvelled at his survival; it is reported that Gage was well enough to greet the doctor with “Here is business enough for you”. Another physician, Dr John Harlow, took over the case, tended to the wound, fixed up the hole and recorded that Gage had no immediate neurological, cognitive or life-threatening symptoms.

By November, he was stable and strong enough to return to his home, along with the rod that nearly killed him. His family and friends welcomed him back and did not notice anything other than the scar left by the rod and the fact that his left eye was closed. But this was when things started to get interesting.

Over the following few months, Gage’s friends found him “no longer Gage”, stating that he was behaving very differently from the man he was before the accident. Dr Harlow wrote that the balance between his “intellectual faculties and animal propensities” had seemingly been destroyed. Gage became more aggressive, inattentive, unable to keep a job, verbally abusive and sexually disinhibited. He would frequently swear using the most offensive profanities and was as sexually suggestive as a March hare. How did the iron rod cause such a dramatic change in Gage’s personality?


Phineas Gage would go on to become one of the most famous case histories in modern medicine. His case was the first to suggest a link between the brain and personality. Neurologists noted that the trauma and subsequent infection destroyed much of Gage’s left frontal lobe – the part of the brain that we now attribute to a person’s logical thinking, personality and executive functions. It is, in essence, the “seat of the mind”. Ergo, the loss of one of his frontal lobes left Gage’s control of bodily functions, movement and other important brain functions such as memory undisturbed, while his “higher thinking” was destroyed (he was, in effect, lobotomised). This explains Dr Harlow’s observation of his “animal propensities”.

This case sparked a great discussion and gave rise to the idea that different parts of the brain govern different aspects of the mind. We are now able to localise almost exactly where the language areas are, which part controls movement and how a certain piece of the brain converts short-term memory into long-term memory.



Bad Hair Day

Have you ever had a day (or in some people’s cases, their whole life) where you cannot help but think that everyone is judging you because your hair just does not look right? Almost everyone has at least one “bad hair day”, when they feel self-conscious about their appearance and how others in society perceive them. Depending on the person’s general confidence level and self-esteem, the effect of a bad hair day can range from being harmless to completely ruining someone’s mood.

Professor Marianne LaFrance, a psychologist at Yale University, decided to study how physical appearance affects people’s feelings. She separated 120 volunteers into three groups. Group 1 was asked to recall a bad hair day, group 2 was told nothing (the control group) and group 3 was asked to recall a day on which they had difficulty opening a package (a bad experience unrelated to appearance). She then measured the change in mood among the participants to see how the memory of a bad hair day could affect mood and self-esteem. Unsurprisingly, the results showed that those who recalled a bad hair day suffered a much greater drop in self-esteem and mood. Group 1 felt less smart and confident than the other groups and felt “embarrassed” in general.

The reason for the drop in self-esteem is that we are socially conditioned to feel that we are judged on our appearance. We have an inherent belief that an untidy appearance will lead others to judge us as disorganised, unprofessional and untrustworthy. This applies to anything that might potentially affect our image, such as an embarrassing moment or an unsightly accident. We become fixated on this idea and shine a “social spotlight” on ourselves, thinking that any embarrassing moment will be instantly judged by those around us. In psychology, this is known as the spotlight effect, and it can be quite powerful.

But here is the kicker: nobody cares. We have a psychological tendency to overreact to situations where a spotlight might be turned on us, when in truth, others do not notice nearly as much as we think they do. In many experiments (mostly involving university students), surveys have shown that fellow students barely paid attention to, or had little recollection of, another student’s embarrassing moments or dishevelled appearance. Although it may have been the most embarrassing moment of the person’s life, to other people it is at best a comedic happening that quickly fades from memory.

So the next time you feel that others are judging you and you feel the blinding spotlight on you, just remember: the greatest, and only important, judge of your character is yourself.


Skin Colour

The world is full of people of all creeds and races, and it is common knowledge that people from different races can have different skin colours. But beyond the range of normal skin colours, there are certain skin colours that can occur with specific medical conditions.

The most common reason for a change in skin colour is a suntan, which damages the skin and causes it to darken (hyperpigmentation). However, some diseases are also known to cause hyperpigmentation, such as Addison’s disease or haemochromatosis.

The converse is lightening of the skin (hypopigmentation), which can happen with diseases such as leprosy, vitiligo or albinism. Alternatively, people can look pale when they are anaemic or extremely frightened, as fear triggers a sympathetic nervous response that shuts down blood circulation to the face and extremities.

It is common to see red skin with flushing, sunburns, skin infections or numerous dermatological conditions such as rashes. Occasionally, these rashes may be associated with serious diseases such as lupus or Crohn’s disease.

Cyanosis (from the Greek for “blueness”) causes the skin to turn bluish-purple and is due to a lack of oxygen in the blood. This can be caused by any number of conditions that cause hypoxia. For example, babies can be born with a heart defect that causes mixing of oxygenated and deoxygenated blood, leading to something called “blue baby syndrome”.

Liver dysfunction can present as jaundice, which is yellowing of skin and the white of the eyes due to a build-up of bilirubin.

Some stranger skin changes can be caused by certain chemicals. Carrots contain beta-carotene (which gives carrots their orange colour) and excess consumption can cause carotenosis (or carotenodermia), a yellowing of the skin. Eating too many tomatoes causes a similar condition called lycopenodermia, which presents as reddened skin (lycopene gives tomatoes their red colour). A combination of the two produces a distinctively orange colour. Both conditions are harmless and disappear after reducing the amount of carrots and tomatoes eaten.

Even stranger still is a condition called argyria, which can be caused by exposure to silver, whether through medications (especially alternative medicine), mining or contamination of the water supply. Silver causes the skin to turn a deep blue colour and the pigmentation is irreversible. Similarly, copper can turn skin green and gold can turn skin grey.


Child Prodigy

At the age of 6, Wolfgang Amadeus Mozart toured Europe to astound audiences with his mastery of the violin, organ and keyboard. At the age of 11, Judit Polgár defeated a Grandmaster in chess, later becoming a Grandmaster herself at the age of 15. By the time he finished elementary school, Saul Kripke had taught himself ancient Hebrew, finished the works of Shakespeare and mastered the works of Descartes and complex mathematical problems.

Each of these people is considered a child prodigy – a person who develops and shows extreme talent in a skill at a level far beyond the norm for their age. The term wunderkind (German for “wonder child”) is also used. For reasons that remain unexplained, these people are far beyond the average child of their age in terms of intelligence or a certain talent.

Prodigies are actually a subset of a condition known as precocity, where a young child shows unusually early development or maturity, especially in mental aptitude. For example, a German child called Christian Friedrich Heinecken is known to have talked within a few hours after his birth, learnt the key events of the first five books of the Torah within a year, mastered the Bible at age 2 and had a working knowledge of universal history and geography, Latin and French at age 3. Unfortunately, he was struck ill at the age of 4, and shortly after predicting his death, passed away. Heinecken’s case is an extreme example of precocity, but nonetheless most precocious children show at least an outstandingly advanced level of mental maturity compared to other children. Along with prodigies, savants and children with extraordinarily high IQ (over 160) are also considered precocious.

Although precocious children enjoy their extreme talent (for which they usually have a deep passion) and may even become famous for it, like Mozart, they are almost always at risk of certain problems. One common issue is that they tend to be placed on pedestals as people constantly praise their ability. This can quickly evolve into narcissism, setting up a major expectation that the child battles with throughout his or her life. Children with advanced intellect are often unable to fit into society as they are far more intelligent than their peers. Not only do other children shy away from them, but the prodigies themselves often feel bored and unstimulated by other children and choose to alienate themselves. Furthermore, although they may have the intelligence and maturity to comprehend philosophical concepts, they still have the emotions of a child, meaning they are tormented by the dissonance between their rational minds and their emotions. All of these factors combined lead to a greatly increased risk of depression in precocious children.

Essentially, the main conundrum for child prodigies is trying to balance their amazing talent with a happy life in a “normal” society. This can be achieved by parents keeping things real and not placing excessive expectations on the child, and by giving the child an outlet for their genius. For example, chess has been a classic way of keeping children with high intellect engaged. Having this kind of outlet allows the child to still engage with other members of his or her society (other children), while honing their great skills for an even brighter future. The child must stay engaged and passionately practise and advance their skill so that they do not stay in a perpetual rut all their life.

With great power comes great responsibility.


Tetrachromacy

They say that human imagination is infinite and limitless. But consider this: can you imagine a colour outside of the visible spectrum? Most likely, you are incapable of thinking of a new colour that cannot be mapped on a standard colour chart. Interestingly, a small proportion of people can see and understand colours beyond the range that the majority of us can see.

The physiology of vision is rather complex, but essentially boils down to the retina (the inside lining of the eyeball) acting as a film for the image that you see. Cells known as photoreceptors convert the visual image into electrical signals that are transmitted to the occipital lobe of the brain via the optic nerve. There are two types of photoreceptors: rod cells, which sense movement, and cone cells, which sense colour and provide sharp images (visual acuity). Human beings typically see colour by combining three primary colours: red, green and blue (known as the RGB system). There is a type of cone cell for each primary colour. The brain processes the signals sent by each type of cone cell and figures out what “colour” you are seeing. Therefore, you can only perceive colours made from a combination of red, green and blue. It is easy to visualise this by playing with colour palettes in computer programs such as Photoshop.

In recent years, it has been speculated that a certain percentage of women have an extra type of cone cell that senses a different wavelength of light. Ergo, they can theoretically sense a greater range of colours than someone with three types of cone cells. This condition is called tetrachromacy (“four colours”). Tetrachromacy is the opposite of colour blindness, which is caused by a deficiency or fault in one or two types of cone cells. To these people, the average person (a trichromat) would appear “colour blind”.

According to one estimate, as many as 12% of women are tetrachromats. Although there are many theoretical barriers to true tetrachromacy, there have been several documented cases of women who perceive colour in much more depth.

The ability to see an extra primary colour does far more than modestly widen the person’s colour range. An average person can see about 1 million different hues (shades of colours), while a true tetrachromat can see about 100 million hues – a hundred-fold increase in the range of colours they can see. One can only wonder what kind of amazing sights a tetrachromat sees when she gazes upon a field of flowers or even a rainbow. Unfortunately, even if a tetrachromat tried to explain the colours she saw to us, we would not be able to grasp them, as our minds would be incapable of visualising those colours – much like how describing the colour red to a blind person is impossible.
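
To see why the jump is hundred-fold rather than modest, note that distinguishable hues multiply across the colour channels rather than add. Assuming, purely for illustration, that each cone type can distinguish roughly 100 gradations of intensity (a commonly quoted ballpark, not an exact figure), three independent channels give about 100 × 100 × 100 ≈ 1 million combinations, while four give about 100 million. A minimal Python sketch of this back-of-the-envelope arithmetic:

```python
# Rough combinatorial estimate of distinguishable hues.
# ASSUMPTION (illustration only): each cone type distinguishes ~100 intensity
# levels; the real figure varies and the channels are not fully independent.
LEVELS_PER_CONE = 100

def distinguishable_hues(cone_types, levels=LEVELS_PER_CONE):
    """Hues distinguishable if each cone channel has `levels` gradations."""
    return levels ** cone_types

print(distinguishable_hues(3))  # 1,000,000   - a trichromat (typical human)
print(distinguishable_hues(4))  # 100,000,000 - a tetrachromat
```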


Berry Aneurysm

Stroke is a disease often associated with the elderly, but this is not necessarily true. As much as 5% of the population carries a ticking time bomb in their brain, known as a berry aneurysm. An aneurysm is a weakening of an arterial wall, causing a localised ballooning of the vessel. A berry aneurysm is a common type of aneurysm where the ballooning resembles a berry. What is most troubling is that these aneurysms are usually congenital (meaning you are born with them) and can present very early, with one study suggesting that 1.3% of the population aged 20 to 39 has a berry aneurysm. If a berry aneurysm were to burst, no matter how young and fit you are, you would bleed into the area around your brain (a subarachnoid haemorrhage), suddenly develop a severe, crippling headache (a “thunderclap headache”), become confused, show signs of stroke such as speech or movement problems, or simply drop dead.


Fortunately, only 10% of people carrying a berry aneurysm suffer a ruptured aneurysm and subsequent brain bleed. The other 90% will carry on living their lives, without ever knowing that they had a time bomb in their brain.

Certain factors, such as high blood pressure (which can be caused by a stressful lifestyle or smoking), increase the risk of the aneurysm bursting. But in some cases, as explained above, even a healthy teenager could suddenly drop to the ground with a massive brain haemorrhage.

Berry aneurysms are only one of many ways death could strike unnoticed, no matter how young you may be. You could live a long and healthy life and die peacefully in your sleep when you are 90 years old, or you may have a stroke and drop dead in a few minutes’ time. For all you know, a bus might run you over tomorrow, with no warning whatsoever. Ergo, youth is not an excuse to waste the day you are given. You do not have to achieve something great, or be productive, but at least spend your day knowing that you are doing everything in your power to make yourself happy, without harming your health, your future or other people.

Carpe diem. Seize the day.