Saturday, December 03, 2011

My hunch is that I suffer from "the illusion of validity" because "thinking is hard work"

As a kid, I was quite a serious student of math and the sciences, which is why I wanted to understand the world in a systematic manner.  The saffron-robed people offering their judgments on the world were incompatible with what I was looking for. 

Now, in my professional life, when I ask for students' take on the readings, my follow-up comment or question is often along the lines of "you say that because ..."  I want them to step beyond relying on their instinct--the "System One" of Daniel Kahneman's formulation, which Freeman Dyson discusses in his NYRB review of Kahneman's book:
System One is amazingly fast, allowing us to recognize faces and understand speech in a fraction of a second. It must have evolved from the ancient little brains that allowed our agile mammalian ancestors to survive in a world of big reptilian predators. Survival in the jungle requires a brain that makes quick decisions based on limited information. Intuition is the name we give to judgments based on the quick action of System One. It makes judgments and takes action without waiting for our conscious awareness to catch up with it. The most remarkable fact about System One is that it has immediate access to a vast store of memories that it uses as a basis for judgment. The memories that are most accessible are those associated with strong emotions, with fear and pain and hatred. The resulting judgments are often wrong, but in the world of the jungle it is safer to be wrong and quick than to be right and slow.
It is not merely students, of course, who go with their instincts.  The case I blogged about earlier is a classic example of how even faculty trained to think otherwise fall victim to "the illusion of validity."  Kahneman talked about this:
what you can get are people jumping to conclusions. I call this a "machine for jumping to conclusions".  And the jumping to conclusions is immediate, and very small samples, and furthermore from unreliable information. You can give details and say this information is probably not reliable, and unless it is rejected as a lie, people will draw full inferences from it. What you see is all there is. Now, that will very often create a flaw. It will create overconfidence. The confidence that people have in their beliefs is not a measure of the quality of evidence, it is not a judgment of the quality of the evidence but it is a judgment of the coherence of the story that the mind has managed to construct. Quite often you can construct very good stories out of very little evidence, when there is little evidence, no conflict, and the story is going to end up good. People tend to have great belief, great faith in stories that are based on very little evidence.
Many students in my classes ought to be familiar, by now, with a comment I often make when I review their essays: "where is the supporting argument/evidence for this?"  I point that out in the hope that they will not start believing in stories that are devoid of evidence, as is sometimes the case with some of my division colleagues' lectures. 

If we start thinking along these lines, an obvious question is: why not ditch the error-prone System One and rely on System Two instead?
Why do we not abandon the error-prone System One and let the more reliable System Two rule our lives? Kahneman gives a simple answer to this question: System Two is lazy. To activate System Two requires mental effort. Mental effort is costly in time and also in calories. Precise measurements of blood chemistry show that consumption of glucose increases when System Two is active. Thinking is hard work, and our daily lives are organized so as to economize on thinking.
I love that phrase: "thinking is hard work."  It seems like we do everything possible to avoid thinking.  So much so that even a "thinking" idiot like me comes across to students as a philosopher!  If only students knew the real story: that I am a fake :)
