By coincidence, the New York Times published this article on the same day as my post below and discusses some similar aspects.
As I said below, I’m exploring some ideas about optimising certain aspects of bomb technician training. In particular, I want to address what I think is a current weakness: during the assessment phase of their operations, bomb technicians tend to “see what they expect to see”, falling into an analytical trap known as “confirmation bias” without realising it. I must say up front that I’m no trained psychologist and this needs lots more work, but for now I’m being guided by Dan Gardner’s book “Risk” which, at least for me as a layman, explains some of the things I think I’ve seen in bomb tech training and operations over the last twenty years.
It would appear that the brain, when dealing with matters of threat and fear, has two “systems”: System 1, the fast, instinctive “gut” reaction, and System 2, the slower, deliberate, analytical process.
The problems come because:
a. System 1, although very effective in some situations, can give rise to deeply flawed threat assessments (to mix bomb tech language with psychology language) because there hasn’t been a systematic, objective review of all the evidence. But in some situations speed is needed and there is simply no time for System 2.
b. System 2 is sometimes used merely to rationalise a decision or assessment already made by System 1, so it too ends up flawed. The confirmation bias, the tendency to look for confirmation rather than “dis-confirmation”, can lead bomb techs, who may be under significant tactical and operational pressure, to make poor threat assessments. System 2 is often stymied because System 1 has already made the key decisions, and System 2 just tags along rationalising them. Once a belief is established, the brain will look to confirm it and discard contrary information without logic.
So, a few implications spring to mind:
Here’s a great example of confirmation bias. I’m thinking of a mathematical rule that gives me three initial digits: 2, 4, 6. You are the bomb tech assessing this situation or “rule”. You have to make a judgement on what the next three numbers are, and you can test your hypothesis by giving me the next three numbers. The chances are you will give me “8, 10, 12”, and that would fit the rule and be correct… and if you were challenged again you’d probably say “14, 16, 18”, and again, that would fit the rule, wouldn’t it? But actually the rule I had in my mind was “any three rising numbers”. You see, your inclination was to see a simple even-number, increase-by-two pattern, and you had a bias to confirm that assessment which you couldn’t resist. But if, say at the second time of asking, you’d given me an alternate test of your theory, such as “13, 14, 17”, you’d have properly tested your gut, System 1 solution against an alternate “threat assessment” by using good, logical System 2 brain power.
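If it helps to see the mechanics of this, here’s a short sketch in Python (my own illustration, not from Gardner’s book; the rule and function names are just made up for this example) showing why the confirming guesses teach you nothing, while the one disconfirming test exposes the flawed hypothesis:

```python
# Wason's 2-4-6 task: the hidden rule is simply "any three rising numbers",
# but the opening triple 2, 4, 6 invites the narrower hypothesis
# "even numbers rising by two". Confirming tests can never tell the two
# apart; only a disconfirming test can.

def hidden_rule(triple):
    """The examiner's actual rule: any three strictly rising numbers."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """The gut (System 1) guess: even numbers rising by two."""
    a, b, c = triple
    return a % 2 == 0 and b == a + 2 and c == b + 2

confirming_tests = [(8, 10, 12), (14, 16, 18)]  # chosen to fit the hypothesis
disconfirming_test = (13, 14, 17)               # deliberately breaks it

for t in confirming_tests:
    # Both functions say "yes", so the flawed hypothesis survives the test.
    print(t, "fits rule:", hidden_rule(t), "fits hypothesis:", my_hypothesis(t))

# The disconfirming test fits the real rule but not the hypothesis,
# revealing that the hypothesis was too narrow all along.
t = disconfirming_test
print(t, "fits rule:", hidden_rule(t), "fits hypothesis:", my_hypothesis(t))
```

Two guesses in, the hypothesis has passed every test and is still wrong; only the deliberately awkward third guess does any real work.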
I’ll return with more thoughts on this shortly.
I’ve been thinking quite a lot lately about the training of bomb technicians to deal with IEDs. One of my oldest friends is currently responsible for such training in the British military. I underwent that training myself in the past (way past!), have been a “customer” employing people from that training regime, and for the last ten years have been involved in designing and delivering training for bomb technicians in many countries around the world. Not surprisingly, I have some views, but I also have gaps in my knowledge that I’m trying to fill. I intend to air some aspects of what I’m finding over the coming weeks.
Here are some thoughts for starters:
I’ll be returning to this subject in more detail in coming days – feel free to pile in with comments now.