Now I have no problems with the actual content of the discipline; on the contrary, I am deeply fascinated by the complexity and the power of statistics, even given how little I know, and am always interested in learning more about the field. Nevertheless, it seems true that human beings are simply not hard-wired to intuit probability and statistics; we would much rather deal in hard certainties than in fuzzy likelihoods. (Think about how easy it is to catch a ball, and about how complicated the actual physical model has to be to explain it!) This is perhaps why there is such blatant abuse of statistics everywhere, from tabloids to otherwise-respectable economic journals. Were this field of knowledge a living creature, tender-hearted defenders would have undoubtedly established numerous Societies for the Prevention of Cruelty to Statistics throughout the world by now.
This absolutely fascinating article in The Atlantic tells the story of one man, Dr. John Ioannidis, who has taken up the challenge to document and understand the many ways in which models and inference and statistical significance are misunderstood and misused by medical researchers, sometimes with highly suboptimal patient outcomes. If I started citing from the article, I would end up quoting virtually all of it here, so I will content myself with quoting only the lede:
Much of what medical researchers conclude in their studies is misleading, exaggerated, or flat-out wrong. So why are doctors—to a striking extent—still drawing upon misinformation in their everyday practice? Dr. John Ioannidis has spent his career challenging his peers by exposing their bad science.

How can people be made more aware of what's going on? And what may be some of the deeper causes of such abuse? Two different sources have interesting responses. The first, a blog post for the New York Times titled "The Dark Art of Statistical Deception", interviews Charles Seife, author of Proofiness: The Dark Arts of Mathematical Deception. From the interview, he seems to be an entertaining writer with a penchant for coining portmanteaux like "causuistry" and "randumbness". This injects some humor into the proceedings, while also demonstrating that such fallacies are so deep-seated that our language lacks common words for them.
The second is a provocative study of another discipline, modern economics in this case, and the ways in which cognitive dissonance may be shaping how economists approach and understand the current economic crisis. The study by Adam Kessler in the Real-World Economics Review, called "Cognitive dissonance, the global financial crisis and the discipline of economics", relies on statistics to make its point, and as such is subject to the very critiques of statistics that Seife raises. (Indeed, the name of the journal in which it was published suggests a certain ideological slant that may predispose it toward particular conclusions.) But it is useful if only because it demonstrates how sheer intelligence (let us not forget that economists are among the sharpest knives in the drawer) and academic approval are no defense against our innate, fundamental desire to be correct. In other words, it serves, at the very least, as a call to humility. From its abstract:
The global financial and economic crisis has produced a powerful shock to the worldview of an influential group of economists whom I call believers in laissez faire (BLF). I provide evidence which suggests that the BLF responded to this shock in a manner that can best be described as irrational, ill-considered and clearly erroneous. I consider the social-psychological concept cognitive dissonance [emphasis in original] as the best explanatory framework for understanding this response. Cognitive dissonance theory predicts that when real-world events “disconfirm” deeply-held beliefs this creates psychological discomfort in persons and they will respond by means of distortion and denial.