Picture a late-morning scene at a watering hole in Etosha National Park in Namibia. The sun is nearing its apex, its fierce rays piercing through the cloudless sky onto the ground. All types of animals, from a family of majestic African elephants to a tower of leggy giraffes and a herd of skittish springbok, are quenching their thirst or generally enjoying the water. Then, abruptly, all of the animals (with the exception of the elephants) come on alert and scatter, leaving a nearly empty watering hole. We humans, perched omnisciently at a distance in a bus, become aware of the presence of two lionesses slowly ambling their way towards the water. However, they do not come all the way down to the pond but rather plop down at the periphery, underneath a bush. A few minutes later, we see a lone springbok approach the watering site, scanning for the threat of theoretical lions but unaware of the two real lionesses under the bush.
Former Secretary of Defense Donald Rumsfeld stated that “there are known knowns; things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.” The young springbok knows of lions (known-knowns) and knows to be on the lookout for these predators (known-unknowns) because of an innate wariness of lions and experiential knowledge of prior threatening encounters with them. This quantifiable uncertainty is defined as risk, and navigating this risk, the springbok decides to leave its herd and head to the watering hole for a drink. Uncertainty implies risk, which in turn implies danger, and thus the possibility of loss. On the other hand, unquantifiable uncertainties (unknown-unknowns) are events that none of the animals at the watering hole can foresee, predict, or model – maybe an asteroid strike or an earthquake. At best, experience can make you aware of them, hindsight can help you explain them, and foresight (with luck) can maybe help you survive them.
My world of medical decision making in the emergency department is challenging and fraught with pitfalls. Patients present with problems that have physical, psychological, and social components. Decisions are often time-limited, and physicians frequently operate at their cognitive and affective limits. Emergency physicians live in a world of risk and uncertainty. Danger and the threat of loss loom large and are foundational prime movers of all decisions. Every theoretically low-risk atypical chest pain is a heart attack or a pulmonary embolism, every seemingly mild headache is a brain bleed or meningitis, and every vague abdominal pain is an aortic dissection. Emergency physicians struggle to narrow the chasm between the known and the unknown (next essay) and justifiably seek to insure against known and unknown catastrophes through systemic mechanisms of over-testing and over-treating. Rather than describing this behavior as risk-averse or risk-seeking, I would describe it as an adaptive response to the systemic values, norms, incentives, and disincentives in place. Humans are a cultural species and acquire behaviors from the systems that surround them.
The twentieth-century polymath G.K. Chesterton wrote, “the real trouble with this world is not that it is an unreasonable world or even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical and regular than it is; its exactitude is obvious, but its inexactitude is hidden; its wildness lies in wait.” The wildness of a Namibian watering hole, the inexactitude of medicine, and the trap of certainty are all apt metaphors for medical decision making in the emergency department, where risk is paramount and the known-unknowns and the unknown-unknowns drive every decision. Improving systems is complicated, and the behavior of a complex adaptive system is more than the sum of its parts. To understand the deepest malfunctions of a system, it is important to pay attention to the rules and to who has authority over them.