How much do you think you know?

One morning in Pittsburgh, USA, in 1995, a man named McArthur Wheeler decided to rob some banks.  He was confident that he would escape detection because he had a secret ingredient: lemon juice!  He knew that lemon juice could be used to write invisible letters, which only become visible when the paper is exposed to a heat source, and so reasoned that by coating his face in lemon juice he would be invisible to the security cameras.  He even tested his theory with a Polaroid instant camera beforehand.  The camera returned a blank image, likely due to a malfunction, and so, with this proof, he went out and robbed two local banks using no other disguise whatsoever.  The security footage was run on the news that night and he was, unsurprisingly, identified and arrested almost immediately.  When arrested he maintained that they could not have recognised him from the footage, so perfect was his plan.

This story is related in the 1999 paper “Unskilled and unaware of it” by Cornell University psychologists Justin Kruger and David Dunning (linked below).  They examined many cases of overconfidence in individuals who knew little or nothing about a topic, and the overconfidence they described has since become known as the “Dunning-Kruger effect”.  We are all familiar with someone who, ignorant or unskilled on a topic, refuses to listen to evidence or expert advice whilst providing those around them with the full benefit of their views.  It’s certainly easy to laugh at the story of the bank robber, but every one of us falls into this same cognitive trap at some point.

The formal term may be relatively recent, but the phenomenon has been recognised for a long time, with references going back to Confucius and Socrates, and it has not escaped more modern thinkers, who have described it thus:

Ignorance more frequently begets confidence than does knowledge.
— Charles Darwin, The Descent of Man

The trouble with the world is that the stupid are cocksure and the intelligent are full of doubt.
— Bertrand Russell, “The Triumph of Stupidity”

It isn’t hard to think of examples, from the journalist with no scientific training insisting there is no climate change to the loud man in the pub explaining to everyone, whether they wish to hear it or not, what can easily be done to fix the economy (a good list of examples can be found here).  The reasons behind this behaviour are complex, but most likely centre around confirmation bias.  When we see news stories, listen to anecdotes or read medical and scientific reports, our brains casually and subtly focus on the ones that support our longstanding personal beliefs, steadily building up a collection of ‘evidence’ in our own minds that these beliefs are true and giving us false confidence in them.  When mixed with the difficulty of assessing our own abilities this becomes even more extreme, as confirmation bias combines with self-esteem to lead us to believe the things about ourselves we want to be true.  This is why almost everyone thinks they have a higher IQ, are a better driver and are more charitable than those around them (a phenomenon known as Illusory Superiority).  Ironically, this bias applies even to cognitive biases themselves: most of us think we are less likely to be affected by them than others are, an effect known as the ‘bias blind spot’.

There is also a second, smaller part of the Dunning-Kruger effect, whereby people who are expert at something underestimate their own level of knowledge and overestimate the expertise of others.

[Figure 1: The stages of learning]

It is this line of logic that led to the stages of learning, shown in figure 1, being developed in the 1970s, where the first stage is Unconscious Incompetence.  This model is sometimes termed the hierarchy of competence (which Wikipedia draws as a pyramid, shown in figure 2 below).  Essentially, when someone doesn’t know anything about a subject, they lack the necessary information to understand just how complex and difficult it is, and the pitfalls one may fall into when addressing it.  The second stage, termed Conscious Incompetence, is telling.  Those honest with themselves will recall the moment, early on in learning a new skill, when they realised just how difficult the task would be, and how little they knew when they began.

[Figure 2: The hierarchy of competence, drawn as a pyramid]

However, surely this cannot be a problem for well-trained professionals plying their trade every day?  Alas, being a highly trained medical or scientific specialist does not exempt one from these biases, and in fact the extra confidence we have in our abilities can often blind us further to our mistakes.  Studies of medical students’ self-assessment of their interviewing techniques show that almost all enormously overrate their abilities.  When teaching students how to diagnose, interpret evidence and deal with complex problems, it is curious that we spend such effort teaching the facts by syllabus but so little teaching them how to recognise errors in themselves and their contemporaries.  We must also acknowledge that we went through this training process ourselves, and must face the uncomfortable idea that we are all subject to these effects, at both ends of the spectrum.

Since we are all blind to this in ourselves precisely when we most need to see it, how can we possibly overcome this cognitive catch-22?  The answer lies in the often difficult and painful process of continuous feedback and peer review.  Because the nature of our own minds works against us in seeing our own deficits, the only way we can truly see our flaws is to take advice from others.  But of course, those who really think they know it all are the least likely to do this.  For all of us, at any stage of our career, the first step to improving a skill long ignored is to ask others, if we can bring ourselves to do it.

Reference papers:

http://templatelab.com/dunning-kruger-effect-study/

http://files.clps.brown.edu/jkrueger/journal_articles/krueger-2002-unskilled.pdf

http://www.ncbi.nlm.nih.gov/pubmed/11597883