A few weeks ago, I briefly examined a way of thinking that characterizes different types of knowledge as known knowns, known unknowns, and unknown unknowns. I think this is useful both as a knowledge classification system and as a way to shoehorn Donald Rumsfeld into an article about medicine. Eagle-eyed readers and matrix enthusiasts, however, undoubtedly noted that I neglected to mention the obvious fourth category: unknown knowns.

Unknown knowns are ideas that we’re aware of on some level but, for whatever reason, choose not to admit to ourselves. A more elegant moniker for this concept is willful blindness. In the Rumsfeldian world, an example of willful blindness might be the treatment of prisoners at Abu Ghraib.

In surgery, a common example is continuing to perform a procedure even in the face of evidence that it’s outdated or of little benefit: orthopedists performing arthroscopies on arthritic patients when simply washing the joint out would be just as effective, or the stubborn persistence of gastric banding despite mounting data in support of sleeves or bypasses. We know what the right answer is, yet for some reason we do something else. We should know better. We are willfully blind.

The impetus for willful blindness probably comes from a number of sources, but unfortunately the motive — subconsciously or otherwise — is frequently financial. Margaret Heffernan explored this idea in her aptly titled book from a few years ago, “Willful Blindness: Why We Ignore the Obvious at Our Peril.”

In one section, Heffernan examines the propensity of some surgeons to perform expensive operations that aren’t strictly necessary, presumably for monetary gain, and concludes that the offenders are making the same cognitive errors as the people ultimately responsible for the subprime mortgage crisis or the Deepwater Horizon oil spill.