India Uncut
Thursday, February 15, 2007
Reason vs Rationalisation
A shorter version of this piece was published today as the second installment of my column, Thinking it Through, in Mint. The first is here.
Often when I argue with friends, or on the internet, I am dismayed by how intransigent some people are. No matter how many facts I throw before them, or how solid my reasoning is, I simply cannot convince them of my point of view. No doubt they feel the same about me. "He refuses to listen to reason," they think, even as I bemoan how unreasonable they are.
This is not a phenomenon peculiar to me: we live in deeply polarised times, and around half the world believes that the other half ignores reason altogether. Well, it is my belief that we overestimate reason to begin with. The Scottish philosopher David Hume once described reason as “the slave of the passions,” and I believe that much of the time when we feel we are being reasonable, we are actually rationalising conclusions we have already arrived at, positions we already hold.
An excellent illustration of how our mind does this comes from neuroscience. In the 1960s, the neuroscientists Michael Gazzaniga and Roger Sperry carried out a series of experiments on split-brain patients. A common treatment for severe epilepsy used to be to sever the corpus callosum, the bundle of fibres that connects the two hemispheres of the brain. This effectively splits the brain into two: rational thought is carried out by the left hemisphere, but each half of the brain stops being aware of what is happening in the other.
Describing the experiments in his book, “The Blank Slate,” Steven Pinker wrote of how “the left hemisphere constantly weaves a coherent but false account of the behaviour chosen without its knowledge by the right.” One example: the experimenters would flash the word “walk” in the visual field of the right hemisphere. The patient would get up and start walking. But when asked why he did so, his left brain, which was unaware of what the right brain had seen but was effectively doing the replying, would give answers such as “to get a coke.” The remarkable thing is that the patients actually believed these explanations, even though the conscious mind arrived at them only after the unconscious mind had prompted the body to start walking.
Pinker called the conscious mind “a spin doctor, not the commander in chief,” while Gazzaniga referred to the left brain as “the interpreter.” In his book, “Phantoms in the Brain,” VS Ramachandran wrote, “[t]he left hemisphere’s job is to create a belief system or model and to fold new experiences into that belief system. If confronted with some new information that doesn’t fit the model, it relies on Freudian defence mechanisms to deny, repress or confabulate – anything to preserve the status quo.”
In other words, the left brain’s job is to make sense of the world and build a coherent worldview. This isn’t easy. The world is full of complicated phenomena, and even the most intelligent among us would not be able to make sense of it all if we tried to place each disparate event in its proper perspective. We would be perpetually bewildered.
To deal with this, our brains evolved to seek patterns in everything. Michael Shermer, in his book “How We Believe,” wrote: “Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring.” Of course, not all the patterns we find are meaningful, and many come simply from confusing correlation with causation. Thus, a cricketer who makes a century when he happens to have a red handkerchief in his pocket may carry that handkerchief with him for the rest of his career.
Indeed, this explains religion. For much of our existence, science wasn’t around (or able) to answer the big questions of the day. We’d have gone mad thinking about it all if we didn’t have religion to give us ready-made patterns that explained everything. Similarly, in the modern world, we have all kinds of belief systems that help us make sense of the world around us, and provide us with cognitive shortcuts for thinking about it.
When these belief systems are attacked, it is natural for us not to want to rethink them. As an economist would say, that would be inefficient, wasting too much time and energy. Thus, various kinds of defence mechanisms arise for this purpose, such as confirmation bias, the tendency to consider only evidence that fits our existing beliefs. A believer in astrology would do this, for example, by treating every correct prediction by an astrologer as proof of astrology’s validity, while ignoring the ones that turn out false.
And indeed, this is why most arguments, especially about politics and economics, are so frustrating. If both sides have firm beliefs, they stand little chance of convincing each other, for most reasoned argument in such cases is rationalisation couched as reason. The next time you get into one of those arguments, or witness one, you will actually be able to observe this happening. The delight of it all is that the people involved will not be aware of this process, and will honestly believe themselves to be open-minded individuals who are, well, thinking it through. But that is mostly self-deception.