As I said below, I’m exploring some ideas about optimising certain aspects of bomb technician training, to address what I think are some current weaknesses: namely, a tendency for bomb technicians, during the assessment phase of their operations, to “see what they expect to see” and, without realising it, fall into an analytical or “confirmatory” bias. I must say up front that I’m no trained psychologist and this needs lots more work, but for now I’m being guided by Dan Gardner’s book “Risk”, which, at least for me as a layman, explains some of the things I think I’ve seen in bomb tech training and operations over the last twenty years.
It would appear that the brain, when dealing with matters of threat and fear, has two “systems”:
- System 1 is driven by things such as feeling, emotion, instinct, “gut”. It seems to rely on the “availability heuristic” (meaning the brain judges likelihood by what comes to mind most easily, so it remembers exceptions rather than routine). It drives certain physiological responses such as an increased heart rate and can trigger the “fight or flight” response. It is quick, taking seconds or less to come to conclusions. When asked to justify a decision made by “System 1”, operators can rarely put it into words.
- System 2 is driven by reason, by the “head” rather than the gut. It uses logic, and it is slow. Its decisions are more “conscious” and can be explained. Sometimes, though, the “head” is lazy and must be pushed to question decisions or analysis made by the “gut”, especially under stress or when rushed.
The problems come because:
a. System 1, although very effective in some situations, can give rise to deeply flawed threat assessments (to mix bomb tech language with psychology language) because there has been no systematic, objective review of all the evidence. But in some situations speed is needed and there is simply no time for System 2.
b. System 2 is sometimes used merely to rationalise a decision or assessment already made by System 1, so it too ends up being flawed. The confirmatory bias, the urge to look for confirmation rather than “dis-confirmation”, can lead bomb techs, who may be under significant tactical and operational pressure, to make poor “threat assessments”. System 2 is often stymied because System 1 has already made the key decisions, and System 2 just tags along rationalising them. Once a belief is established, the brain will look to confirm it and discard contrary information without logic.
So, a few implications jump to my mind:
- I think that SOPs work well to support decision making in time-sensitive situations. They provide a structure that gives the bomb tech a minimum number of choices to make under time pressure. They essentially provide structured decision making that addresses the weaknesses of System 1. But… could we make the SOPs better by having a greater understanding of the psychology of the situation?
- SOPs also work well when they encourage predetermined actions (“drills”) that cover all the bases, so even if a bomb tech has discarded the threat of, say, a pressure plate booby trap, the SOP still demands he act as if there may be one.
- How do we encourage “System 2”, the head, to be less lazy and more effective at questioning System 1, gut instincts? Dan Gardner makes a nice analogy here, describing the brain as a car driven by a caveman with a lazy but bright PhD student in the passenger seat. The bright student is inclined to grab the steering wheel, but is sometimes distracted and just sits back and looks out of the window.
- Should we think about the ways that bomb techs receive intelligence, to ameliorate the negative influences of the psychological response to fear? Do intelligence analysts in general think about how their analysis affects the psychological decision-making process going on in their readers’ heads? (I think this question is much bigger than the small focus of bomb techs, and perhaps deserves a broader study on its own.) How do we manage the delivery of intelligence in the light of the “availability heuristic” and something called the Von Restorff effect (where we remember the unusual over the routine), as well as reasoned analysis? System 1 responses seem to be encouraged by images and narrative, even anecdotes (bomb techs are very good at anecdotes!); System 2 by logic, data and numbers. How do we manage this? There are some interesting things that fear does in this process, and since EOD scenarios are inherently fearful situations they are worth exploring. The brain culls low-risk experiences and memories, but fear “glues” memory: fearful situations and the resultant memories are more likely to influence System 1 thinking. I wonder if we can “glue” important intelligence with a bit of fear to make it stick better? Or do we run the danger of over-emphasising something that way and encouraging a System 1 response?
- It’s clear from studies that “System 2” is aided by improved numeracy. Has this sort of research been applied to bomb technicians? I think the answer is no, but some bomb tech training schools have very high academic/scientific training standards, while others do not. Perhaps the output isn’t only what we expect (better technical understanding); the associated higher levels of numeracy may also be a hidden factor in effective bomb technician training, because they encourage more effective System 2 analysis.
Here’s a great example of the confirmation bias. I’m thinking of a mathematical rule, and I give you the three initial numbers it produces: 2, 4, 6. You are the bomb tech assessing this situation or “rule”. You have to make a judgement about what the next three numbers are, and you can test your hypothesis by giving me the next three numbers. The chances are you will give me “8, 10, 12”, and that would fit the rule and be correct… and if you were challenged again you’d probably say “14, 16, 18”, and again, that would fit the rule, wouldn’t it? But actually the rule I had in my mind was “any three rising numbers”. You see, your inclination was to see a simple even-numbers-rising-by-two rule, and you had a bias to confirm that assessment which you couldn’t resist. But if, say at the second time of asking, you’d given me an alternative test of your theory, such as “13, 14, 17”, you’d have properly tested your gut, System 1 solution against an alternative “threat assessment” by using good, logical System 2 brain power.
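Just to make that concrete (this little sketch is my own, not something from Gardner’s book, and the function names are purely illustrative), here’s how a “confirming” guess compares with a “disconfirming” one: the guess 8, 10, 12 fits both the assumed rule (numbers rising by two) and the real rule (any three rising numbers), so it can never tell the two apart, whereas a guess like 13, 14, 17 can.

```python
# Toy sketch of the 2-4-6 exercise above - my own illustration, not a standard tool.

def assumed_rule(seq):
    """The gut hypothesis: each number goes up by exactly two."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

def real_rule(seq):
    """The rule actually in my head: any three rising numbers."""
    return all(b > a for a, b in zip(seq, seq[1:]))

def test_guess(seq):
    """Report whether a guess can distinguish the two hypotheses."""
    fits_assumed = assumed_rule(seq)
    fits_real = real_rule(seq)
    if fits_assumed == fits_real:
        verdict = "uninformative - both hypotheses give the same answer"
    else:
        verdict = "informative - it separates the hypotheses"
    print(f"{seq}: assumed rule={fits_assumed}, real rule={fits_real} -> {verdict}")

test_guess([8, 10, 12])   # the confirming test: fits both rules, so it proves nothing
test_guess([13, 14, 17])  # the disconfirming test: fits the real rule but not the gut one
```

The point isn’t the code, of course, but the habit it encodes: choose the test that could prove your working “threat assessment” wrong, not the one that flatters it.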
I’ll return with more thoughts on this shortly.