On thinking – observations from a hospital bedside

My wife was hospitalised recently. The bottom had fallen out of the sodium and potassium levels in her blood. She had to be put on life support, and at one stage permanent brain damage, and even worse, could not be ruled out. As I sat next to her bed in the ICU, I had the opportunity to observe not only the actions of the medicos but also their thinking in a clinical crisis. In the next few moments I shall attempt to relate what I observed, along with some commentary on those observations. I will then conclude with some applications of what I learned to my own field of expertise.

As we arrived at the emergency department of the hospital, the triage nurse assessed my wife. Since all her vitals (blood pressure, pulse rate, temperature, etc.) were within normal limits, she concluded that the case was not highly urgent. Unfortunately, the low sodium level didn’t agree with this assessment. While I accept that it is not easy to pick up something like a low blood sodium level, as the potential clues are not obvious, I must say that all the indicators pointing in this direction were available to the triage nurse.

Cognitive biases

The error one can identify here is sometimes called “anchoring”. Anchoring is a cognitive bias: focusing on the first piece of information provided and using only that as the basis for decision-making. It is especially tempting when there is pressure for a firm answer and little room for considering ambiguity. In other words, anchoring is enhanced by the need for cognitive closure.

After the transition from the ER to the ICU, routine tasks were performed. A life support system, monitoring equipment, IV tubes, drains, catheters, etc. were put in place. There was no need to think about what came next. The patient needed to be stabilised, so attention was given to the basic tasks required to achieve that, and they were carried out effectively.

The thinking at this stage can be called reactive thinking. In sports, players practise certain moves until those moves become automatic. This enables them to react and move faster than the opponent during a game. Reactive thinking is also how one builds habits. Habitual responses become automatic and are carried out without thinking. One danger worth noting, though, is that this can relegate judgement and focus to secondary importance.

Once my wife was stabilised, the doctors started to think about therapy and recovery. Many questions were asked about what had occurred, to piece together the cause. There was some ambiguity, as what had happened didn’t satisfactorily explain the symptoms. In other words, the picture didn’t fully fit. The intensivist and the registrar kept asking themselves and the less senior physicians what was missing and what didn’t fit. While it runs somewhat against the parsimony doctors are trained to apply to their thinking, in this case the question of whether there might be more than one illness had to be considered.

All efforts were made to avoid cognitive tunneling at this stage. Cognitive tunneling is also called perceptual blindness. People get so focused on what is directly in front of their eyes that they ignore other elements of the whole picture. They focus on one thing and one thing only. In situations where life-or-death decisions must be made under uncertainty, cognitive tunneling can be fatal. And there is another point, which is in fact the core of this: the tyranny of the urgent can easily cause people to act on pressing needs and overlook the important.

It is worth pausing here for a minute to say something about the tyranny of the urgent. In a sense it is a kind of cognitive tunneling. The tyranny of the urgent is not a shortage, or a lack, of available time. It is the difficulty of setting priorities right. Many have set a budget to see how they spend their money. Very few have set a budget to see how they spend their time. Those who did very soon realised their addiction to the urgent and how it affected their choices. Which leads me to say that, more often than we’d like to admit, we waste our time doing the “urgent” and neglect what is important.

Coming back to cognitive tunneling: asking doctors certain questions can help aid their cognitive processes. “Is there something that doesn’t fit?” “Is there something missing?” “Based on the symptoms you see, what do you think the cause is?” “Could it be something else?” “What else could it be?” “Is it possible that there is more than one problem?” And perhaps the most important one: “What if not?” Doctors are usually appreciative of such questions, as they prompt a pause and allow them to think more broadly.

Questions, as it was once said, are the principal intellectual instruments available to human beings. Aptly formed questions can lead to finding new facts, establishing new perspectives and forming new ideas. They also help us pause and break away from the tyranny of the urgent.

So what?

I am neither qualified to give, nor desirous of giving, advice to doctors. They are, as a matter of fact, better trained in cognitive processes than most of us. I am simply stating that since even well-trained doctors need to pay attention to such cognitive errors, it is wise for the rest of us to pay attention to them too. These thought processes, including the cognitive biases mentioned above, are quite common in business, not just in medicine.

In my area of interest – corporate governance and information security – decisions often need to be made under uncertainty. While responding to incidents – hence the title – the unrecognised desire for cognitive closure, and cognitive tunneling amplified by the tyranny of the urgent, can lead to decisions that are less than desirable in the long term. Being aware of these cognitive biases can help reduce the number of bad decisions.
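
A toy illustration of how those bedside questions might translate into my field – the names and flow below are my own invention, not any existing tool – is a debiasing checklist wired into an incident-response runbook, forcing a pause before the team anchors on its first plausible cause:

    # Hypothetical sketch: a debiasing checklist for incident triage.
    # The questions are the ones from the bedside observations above;
    # the names and flow are illustrative only, not an existing tool.

    DEBIASING_QUESTIONS = [
        "Is there something that doesn't fit?",
        "Is there something missing?",
        "Based on the symptoms you see, what do you think the cause is?",
        "Could it be something else?",
        "Is it possible that there is more than one problem?",
        "What if not?",
    ]

    def confirm_working_theory(theory: str) -> bool:
        """Force a pause before anchoring on the first plausible cause."""
        print(f"Working theory: {theory}")
        for question in DEBIASING_QUESTIONS:
            note = input(f"{question} (notes, or blank to skip): ")
            if note.strip():
                print(f"  noted: {note.strip()}")
        verdict = input("Still confident in the working theory? [y/N]: ")
        return verdict.strip().lower() == "y"

    if __name__ == "__main__":
        if not confirm_working_theory("compromised credentials"):
            print("Widen the investigation before acting.")

The point of the sketch is not the code but where it sits: the pause is imposed before action, precisely when the tyranny of the urgent is at its strongest.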

There it is. The topic I wanted to bring to your attention is the danger of unchecked cognitive biases. I hope we can embark together on a journey to explore them in more detail. I am happy to provide further exposition on these cognitive errors, and on the questions that can help reduce or avoid them, provided there is interest. Please let me know your thoughts.

Why not?

2 thoughts on “On thinking – observations from a hospital bedside”

  1. Hi Endre.

    Interesting case study! Cognitive biases and limitations are fascinating, with a huge range of implications for life in general, as well as infosec. I’m intrigued by the differences and interplay between cognitive/conscious thought and instinctive responses – the brain’s shortcuts for life-threatening fight-or-flight decisions – as well as the ways that we practice, learn and internalize stuff until it becomes habitual or subconscious.

    As a specific tech example, we often click the “go away” button on repetitive error/warning messages without actually reading the text, an action so well practiced that we don’t feel the need to invest many brain-cycles in it. That, in turn, poses problems for software developers: how to grab the user’s attention and get past the near-instinctive go-away click when something different and important comes along that (in the developer’s opinion?) the user really should think more carefully about. It also raises a stack of questions about identifying ‘something different’ (implying some sort of historical/statistical activity tracking) and ‘something important’ (which is context-dependent and quite subjective), as well as about the attention-grabbing presentation techniques (bells, whistles, bright colours, flashing, electric shocks …).
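
    To sketch just the ‘historical/statistical activity tracking’ part – purely illustrative Python; the names and the novelty rule are mine, not any real API – a dialog layer could remember which warnings the user has already dismissed and reserve the attention-grabbing treatment for genuinely novel ones:

        # Illustrative sketch: remember dismissed warnings and escalate only
        # novel ones, so the near-instinctive "go away" click is interrupted
        # only when the message really is 'something different'.
        import hashlib

        class WarningGate:
            def __init__(self) -> None:
                # message fingerprint -> number of times already shown
                self.seen: dict[str, int] = {}

            def should_escalate(self, message: str) -> bool:
                key = hashlib.sha256(message.encode()).hexdigest()
                count = self.seen.get(key, 0)
                self.seen[key] = count + 1
                # First sighting gets the bells and whistles; well-worn
                # messages keep their quiet, dismissable "go away" button.
                return count == 0

        gate = WarningGate()
        for msg in ["Disk almost full", "Disk almost full", "Certificate revoked"]:
            if gate.should_escalate(msg):
                print(f"ESCALATE (demand explicit confirmation): {msg}")
            else:
                print(f"routine dismissable warning: {msg}")

    Deciding ‘something important’ is the harder half – context-dependent and subjective, beyond what any fingerprint can capture.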

    Another relevant example is the whole field of security awareness and training – getting people to decide and act ‘more securely’ where, again, ‘securely’ is context-dependent and ‘more’ implies an historical element, continuous improvement maybe. Repetition is perhaps the most common teaching technique, with a measure of explanation, persuasion, practice and compliance enforcement … but reinforcing positive behaviours is, I feel, an approach sadly neglected by most awareness and training pros. Getting back to your case study, many/most of the docs and nurses do what they do because they love it, I suspect, more than just to be paid. They feel compelled to do it. It’s a vocation, a passion, for them. So how can we empassion workers on infosec, risk management, privacy and all that? [I don’t really know the answer to that, despite having slaved, passionately, in this very field for decades!]

    By the way, how is your wife? Some things are even more important than infosec!

    1. Thanks Gary,
      Let me pick a few issues at random, as this is again a vast field.

      1. I do not see cognitive biases as bad in and of themselves. It is when we do not check our thinking – and the inherent biases – that I see problems arising. Bringing those biases from the subconscious level to the conscious is essential. That is why developing conscious (I use the word in a different sense than in the previous sentence 🙂) and critical thinking is so important. That’s what I tried to teach in my classes using puzzle-based learning.

      2. Empassioning workers in infosec… hmmm…
      Again, I think the answer lies in the thinking. One either thinks security or one doesn’t. I am using the noun instead of the adverb “securely” deliberately! Once one thinks security, it becomes a worldview, in the sense that it becomes an underlying principle that dictates everything we do. Then, and perhaps only then, can empassioning be considered. I am not trying to criticise anyone in particular, nor the profession in general, when I assert that this kind of thinking is almost non-existent. Sadly, it was, and is, far too easy to prove…

      My wife is fine, thank you. More in person, not in public.
