The other day the Sarge and I were driving to a conference in the Midwest while doing one of our favorite activities, listening to a techno-thriller. Our favorite author in this genre, Brad Thor, not only writes exciting novels, but very realistic ones as well. I've heard this style called "faction": half fiction, half fact. In this audiobook, just before the protagonist proceeds to take out a group of Somali pirates, he remarks that even pirates sometimes suffer from something he calls "normalcy bias." I'd never heard the term before, but it was obviously something very bad for the pirates and very good for our hero. As I often do when reading or listening to "faction," I took the time to look up this phenomenon.
I learned that "normalcy bias" causes an individual, a group, or even a government to interpret a critical incident or crisis situation with a dangerous amount of optimism. What's worse, when finally confronted with a crisis, those suffering from normalcy bias tend to woefully underestimate its potential negative impact. Worse still, they often operate on the assumption that, since the bad event has never happened to them before, it never will.
In law enforcement we counter normalcy bias with training, training, and more training. We put ourselves in scenarios where we constantly face a variety of threats, like an active shooter, a traffic stop gone bad, or an ambush while serving a warrant. This not only allows us to see the bad event coming, but also gives us a mental model for resolving it successfully.