Are you seeing what is there?

Before we start, have a look at this square.

[Image: a coloured square]

What colour is it? We’ll come back to this later.

In the late 1950s, a psychologist named Milton Rokeach was inspired by an article about two women, each delusionally believing she was the Virgin Mary, who were housed together. He decided to replicate this and arranged for three men who believed they were Christ to be housed together at Ypsilanti State Hospital in Michigan, USA, in what became known as the Ypsilanti experiment. But far from the stormy encounters he expected, these men got along fine, each coming up with complex and often absurd explanations for the beliefs of the other two (one patient, Joseph Cassel, accurately observed that the other two ‘were insane and belonged in a mental institution’) whilst confidently maintaining their own delusions. Rokeach detailed the story in his 1964 book The Three Christs of Ypsilanti.

This classic psychology study is bizarre but anecdotal. Clearly, these men were delusional, and when confronted with evidence they refused to accept the obvious reality but instead twisted everything around to keep the delusion intact at all costs. The question is, though, how much do we do the same every day? Few of us claim to be the Messiah, but when it comes to protecting our beliefs, decisions and first impressions, psychological research is pretty clear: none of us is much better than the men in Ypsilanti hospital.

Grounded as the scientific and medical worlds are in rationality and Bayesian reasoning, we human actors within them are still human and carry these behaviours along with us. Alas, those of us educated in these fields seem very capable of rationalising cognitive biases, genuinely believing them to be the result of a logical thought process. This often manifests as confirmation bias, with potentially serious implications for the patient. When taking medical histories, doctors often ask questions that elicit information confirming their early judgments. Have you ever listened to a patient describe their first few symptoms, made a diagnosis in your mind, and decided on a treatment before they had finished? Medical staff have often been found to stop asking symptomatic or historical questions once they reach an early conclusion, thus failing to unearth key information. This form of confirmation reinforcement is known as anchoring bias: once the human mind has made a decision, we are very reluctant to reject it for another (Wallsten, 1981; Larue, 1995).

So how can we overcome this?

Well, it’s not easy, and there is no perfect solution. But Klein’s extensive review provides a few pointers:

1) Be aware of base rates.

2) Consider whether information is truly relevant, rather than just salient.

3) Seek reasons why your decisions may be wrong and entertain alternative hypotheses.

4) Ask questions that would disprove, rather than confirm, your current hypothesis.

5) Remember that you are wrong more often than you think.

(Klein, 2005)
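The first of those pointers, base rates, is worth a quick worked example. The figures below are invented purely for illustration (they are not from Klein's paper), but the arithmetic is just Bayes' theorem: when a condition is rare, even a very accurate test produces mostly false positives.

```python
# Hypothetical illustration of "be aware of base rates": the numbers here
# (1-in-1000 prevalence, 99% sensitivity, 95% specificity) are invented for
# the example, not taken from Klein (2005).

def posterior(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

p = posterior(prevalence=0.001, sensitivity=0.99, specificity=0.95)
print(round(p, 3))  # 0.019 -- under 2%, despite a "99% sensitive" test
```

In other words, a clinician who anchors on a positive result without asking how common the condition is in the first place will wildly overestimate how likely the diagnosis is.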

Across many areas of research into overcoming biases and avoiding simple errors, the act of writing presenting factors down, or working through checklists, repeatedly emerges as a significant factor. This is common practice in fields such as aviation, the military and nuclear reactor control, and adopting a similar practice in surgery has halved the error rate (Cohen, 2003; Muller & Patel, 2013).

But, of course, you listen to your patients and see the world as it is. You are not insane and, unlike the men in Ypsilanti hospital, what you witness is objective and real. Well, remember that at the start I asked you to look at this?

[Image: a coloured square]

And I asked you to keep in mind what colour it was? Obviously, it’s yellow.

Except, of course, it isn’t. Not if you are reading this post on a computer screen, phone, tablet or television. In fact, you’ve never seen yellow light on any of these devices. Screens do not emit yellow light; they are made up of red, green and blue elements that emit light at those wavelengths in varying relative amounts, referred to as RGB. The light hits specific proteins in the cone cells of your retina, themselves responding only to certain wavelengths, sending a signal to your visual cortex, which decides that a certain pattern of RGB light equals yellow and tells your conscious mind that’s what colour it is. A more detailed explanation of this phenomenon can be seen here or with white light here.
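As a rough sketch of that point (a simplification, not a full colour-science model): in the common 8-bit RGB encoding, on-screen ‘yellow’ is simply the red and green channels at full strength with blue off. The display emits no light at all in the roughly 570–590 nm band we would call spectral yellow; the yellow exists only in your visual system's interpretation.

```python
# Simplified sketch: a screen pixel has only red, green and blue channels.
# On-screen "yellow" is full red + full green with blue off -- the display
# never emits light in the ~570-590 nm band of spectral yellow.

def screen_yellow():
    """Return the 8-bit RGB triplet a display typically uses for 'yellow'."""
    red, green, blue = 255, 255, 0
    return (red, green, blue)

def emitted_primaries(rgb):
    """Name the primaries the screen actually emits (the nonzero channels)."""
    names = ("red", "green", "blue")
    return [name for name, level in zip(names, rgb) if level > 0]

pixel = screen_yellow()
print(pixel)                     # (255, 255, 0)
print(emitted_primaries(pixel))  # ['red', 'green'] -- no yellow light at all
```

The brain receives simultaneous red and green cone signals and constructs ‘yellow’ from them, which is exactly the kind of confident, invisible inference the rest of this post is about.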

If your brain is lying to you constantly about something as simple as what colour you are looking at, how can you be sure that your first impression about that patient you saw, or that theory you believe in, is correct? The answer is that you can’t. Every one of us is in thrall to classical confirmation, recall and anchoring biases.

And if you’re still thinking this doesn’t describe you, then scroll up to those squares. They’re still yellow, aren’t they?

 

Wallsten TS. Physician and medical student bias in evaluating diagnostic information. Med Decis Making 1981;1:145-64.

Klein JG. Five pitfalls in decisions about diagnosis and prescribing. BMJ 2005;330:781-783.

Larue F, Colleau SM, Fontaine A, Brasseur L. Oncologists and primary care physicians' attitudes toward pain control and morphine prescribing in France. Cancer 1995;76:2375-2382.

Cohen BJ. Theory and Practice of Psychiatry. Oxford University Press, 1st Ed, 2003. ISBN-13: 978-0195149388.

Muller S, Patel HRH. Safe surgery checklists: lessons learned from the aviation industry. Journal of Surgical Simulation 2013;1:1-4.

BBC News, 14 January 2009. http://news.bbc.co.uk/1/hi/health/7825780.stm