In emergency response literature, a recurring finding cuts against intuition: the most dangerous person in a crisis often isn't the panicked novice. It's the confident expert, or someone who believes they're one. Overconfidence degrades decision quality at exactly the moment quality matters most, and the data from search-and-rescue, aviation, and medicine all point the same way.
The “I’ve done this before” problem
A 2014 study of backcountry skiing avalanche fatalities, published in the Proceedings of the International Snow Science Workshop, found that experienced skiers were more likely than beginners to die in avalanches. Beginners tended to stay on safer terrain or follow guides; experienced skiers more often ventured into terrain their pattern-matching had previously rewarded. The technical term is a "heuristic trap": the brain treats prior success as evidence of safety even when conditions have changed.
Aviation safety research finds similar patterns. The NTSB has repeatedly documented general aviation crashes in which pilots with several hundred flight hours showed worse risk-management profiles than pilots with either fewer hours or many more. Mid-experience pilots are confident enough to push weather minimums but not yet seasoned enough to recognize when their luck has been doing the work.
Confidence narrows the field of options
In emergencies, what saves people is usually flexibility: the ability to abandon a plan that isn't working, ask for help, or downgrade an objective. Overconfident actors do the opposite. They commit to a chosen course because admitting uncertainty feels worse than the perceived cost of the action.
This shows up in wilderness search-and-rescue case files. The person who continues hiking when they should turn back, the boater who doesn't put on the life jacket, the driver who tries to cross a flooded road: these decisions are rarely made by people who feel uncertain. They're made by people who feel they have a handle on the situation. Federal Highway Administration data shows that more than half of flood-related vehicle deaths involve drivers who deliberately drove around barricades.
Training that includes humility outperforms training that doesn’t
The fix isn't less training; it's training that explicitly addresses uncertainty. Aviation's crew resource management curriculum, adopted after a string of crashes in the 1970s and 1980s, teaches pilots to question their own judgment, listen to their crew, and treat doubt as data. The accident rate declined sharply afterward.
Wilderness medicine programs now emphasize "what you don't know" frameworks. Avalanche education has shifted toward teaching skiers to identify the conditions under which their judgment is most likely to fail. The common element is structured humility: not vague modesty, but concrete drills that produce comfort with saying "I'm not sure."
Bottom line
Confidence is useful in emergencies, but only when paired with an honest model of what could go wrong. The data across multiple high-stakes fields suggest that people who assume they know what they're doing make worse decisions than people who actively check themselves. In a crisis, the right question isn't "can I handle this?" but "what would tell me I can't?"