Is Self-Knowledge Overrated?

Discussion in 'Psychology' started by bigarrow, Nov 2, 2011.

  1. Interesting article below:


    Is Self-Knowledge Overrated?

    Daniel Kahneman, a Nobel Prize-winning psychologist and the author of the new book “Thinking, Fast and Slow,” changed the way people think about thinking by asking them questions. They weren’t trick questions, either. Instead, Kahneman relied almost exclusively on straightforward surveys, in which he described various scenarios. Here’s a sample:

    The U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. If program A is adopted, 200 people will be saved. If program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved. Which of the two programs would you favor?
    When Kahneman put this question to a few hundred physicians, seventy-two per cent chose option A, the safe-and-sure strategy. Most doctors would rather save a certain number of people for sure than risk the possibility that everyone might die.

    Now consider this scenario:

    The U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. If program C is adopted, 400 people will die. If program D is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die. Which of the two programs would you favor?
    The two different hypotheticals, of course, examine identical dilemmas: saving one-third of the population is the same as losing two-thirds. And yet, doctors reacted very differently depending on how the question was framed. When the possible outcomes were stated in terms of deaths (and not survivors), physicians were suddenly eager to take chances: seventy-eight per cent chose option D.
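The equivalence of the two framings is easy to verify in a few lines of arithmetic, measuring every program in expected deaths out of the 600 people at risk (a sketch using only the figures from the scenarios above):

```python
# Expected deaths under each program, out of 600 people at risk.
TOTAL = 600

# Framed as lives saved:
deaths_A = TOTAL - 200        # 200 saved for certain -> 400 die
deaths_B = (2 * TOTAL) / 3    # 1/3 chance all live, 2/3 chance all 600 die

# Framed as deaths:
deaths_C = 400                # 400 die for certain
deaths_D = (2 * TOTAL) / 3    # 1/3 chance nobody dies, 2/3 chance all 600 die

# All four programs carry an expected toll of 400 deaths.
assert deaths_A == deaths_C == 400
assert deaths_B == deaths_D == 400
```

The only difference between the pairs is the reference point: A and B are stated against a baseline of 600 dead, C and D against a baseline of 600 alive.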

    Why are doctors so inconsistent? Kahneman and his longtime collaborator, Amos Tversky, explained these contradictory responses in terms of loss aversion, or the fact that losses hurt more than gains feel good. In fact, people hate losses so much that merely framing a choice in terms of a potential loss can shift their preferences. Like those physicians, people are suddenly willing to risk losing everything if there’s a chance they might lose nothing.
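The asymmetry described here is usually modeled with a prospect-theory value function, in which a loss-aversion coefficient greater than one makes losses loom larger than equal-sized gains. A minimal sketch follows; the specific parameter values (alpha near 0.88, lambda near 2.25) come from Tversky and Kahneman's later empirical estimates, not from this article:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value of a gain or loss x relative to a reference point.

    Gains are valued as x**alpha; losses are amplified by lam, so a loss
    hurts roughly lam times as much as an equal gain feels good.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A $100 loss outweighs a $100 gain in subjective value:
print(value(100))    # roughly 57.5
print(value(-100))   # roughly -129.5
```

With lam above 1, any choice framed around a potential loss gets an outsized negative weight, which is the mechanism behind the physicians' flip between options A and D.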

    Although our dislike of losses might seem obvious—“You need to have studied economics for many years before you’d be surprised by my research; it didn’t shock my mother at all,” Kahneman says—the discovery of loss aversion proved to be an important refutation of human rationality. Unlike homo economicus, that imaginary species featured in macroeconomics textbooks, Kahneman and Tversky demonstrated that real people don’t deal with uncertainty by carefully evaluating all of the relevant information. They stink at statistics and rarely maximize utility. Instead, their choices depend on a long list of mental short cuts and intemperate emotions, which often lead them to pick the wrong options.

    Since the Israeli psychologists began studying loss aversion in the early nineteen-seventies, it has been used to explain a stunning variety of irrational behaviors, from the misguided decisions of investors—they refuse to sell losing stocks—to the stickiness of condo prices in the aftermath of a housing bubble. It’s been used to justify our fondness for the status quo—the present may stink, but we still don’t want to lose it—and the cowardice of N.F.L. coaches, who are far too afraid to go for it on fourth down. Loss aversion even excuses our social habits: studies have shown that it generally takes at least five kind comments to compensate for a single criticism. (The ratios are even worse for criminals: a person convicted of murder must perform at least twenty-five acts of “life-saving heroism” before he is forgiven.) This is an impressive amount of explanatory firepower for a theory rooted in hypotheticals.

    It’s impossible to overstate the influence of Kahneman and Tversky. Like Darwin, they helped to dismantle a longstanding myth of human exceptionalism. Although we’d always seen ourselves as rational creatures—this was our Promethean gift—it turns out that human reason is rather feeble, easily overwhelmed by ancient instincts and lazy biases. The mind is a deeply flawed machine.

    Nevertheless, there is a subtle optimism lurking in all of Kahneman’s work: it is the hope that self-awareness is a form of salvation, that if we know about our mental mistakes, we can avoid them. One day, we will learn to equally weigh losses and gains; science can help us escape from the cycle of human error. As Kahneman and Tversky noted in the final sentence of their classic 1974 paper, “A better understanding of these heuristics and of the biases to which they lead could improve judgments and decisions in situations of uncertainty.” Unfortunately, such hopes appear to be unfounded. Self-knowledge isn’t a cure for irrationality; even when we know why we stumble, we still find a way to fall.

    Consider the story of Harry Markowitz, a Nobel Prize-winning economist who largely invented the field of investment-portfolio theory. By relying on a set of complicated equations, Markowitz was able to calculate the optimal mix of financial assets. (Due to loss aversion, most investors hold too many low-risk bonds, but Markowitz’s work helped minimize the effect of the bias by mathematizing the decision.) Markowitz, however, was incapable of using his own research, at least when setting up his personal retirement fund. “I should have computed the historical co-variances of the asset classes and drawn an efficient frontier,” Markowitz later confessed. “Instead, I visualized my grief if the stock market … went way down and I was completely in it. My intention was to minimize my future regret. So I split my contributions 50/50 between bonds and equities.”
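The computation Markowitz skipped is exactly the one behind the efficient frontier: the expected return and variance of a mix as functions of the asset weights and their covariance. A toy two-asset sketch follows; the return, volatility, and correlation numbers are invented for illustration and do not come from the article:

```python
import numpy as np

# Hypothetical annual statistics for stocks and bonds (illustrative only).
mu = np.array([0.08, 0.03])      # expected returns: stocks, bonds
sigma = np.array([0.18, 0.05])   # volatilities
rho = 0.2                        # stock/bond correlation
cov = np.array([
    [sigma[0] ** 2,               rho * sigma[0] * sigma[1]],
    [rho * sigma[0] * sigma[1],   sigma[1] ** 2],
])

def portfolio(w_stocks):
    """Expected return and volatility for a given stock weight."""
    w = np.array([w_stocks, 1 - w_stocks])
    ret = w @ mu                     # weighted expected return
    vol = np.sqrt(w @ cov @ w)       # portfolio standard deviation
    return ret, vol

# Markowitz's regret-minimizing 50/50 split:
ret, vol = portfolio(0.5)
print(f"50/50 split: return {ret:.3f}, volatility {vol:.3f}")
```

Sweeping the weight from 0 to 1 and plotting return against volatility traces the efficient frontier he never drew for himself.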

    Football coaches have performed just as badly. Although it’s now clear that their biases have a meaningful impact—a coach immune to loss aversion would win one more game in three out of every four seasons—their collective decision-making hasn’t improved.

    This same theme applies to practically all of our thinking errors: self-knowledge is surprisingly useless. Teaching people about the hazards of multitasking doesn’t lead to less texting in the car; learning about the weakness of the will doesn’t increase the success of diets; knowing that most people are overconfident about the future doesn’t make us more realistic. (We’re a bundle of contradictions. Kahneman has also studied the overconfidence bias: The vast majority of entrepreneurs believe they will be in business in five years even though sixty-five per cent fold.) The problem isn’t that we’re stupid—it’s that we’re so damn stubborn.

    Kahneman, of course, knows all this. One of the most refreshing things about “Thinking, Fast and Slow” is his deep sense of modesty: he is that rare guru who doesn’t promise to change your life. In fact, Kahneman admits that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes. As a result, his goals for his work are charmingly narrow: he merely hopes to “enrich the vocabulary that people use” when they talk about the mind.

    This new book will certainly accomplish that—Kahneman has given us a new set of labels for our shortcomings. But his greatest legacy, perhaps, is also his bleakest: By categorizing our cognitive flaws, documenting not just our errors but also their embarrassing predictability, he has revealed the hollowness of a very ancient aspiration. Knowing thyself is not enough. Not even close.
    newyorker.com, by JONAH LEHRER • OCT. 25, 2011
     
  2. The U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people.

    Two programs to combat the disease have been proposed.

    Program A is: Do I have enough Merck in my portfolio?

    Program B is: How will this affect the Asian vote?

    Know thyself? More like Know thy Congressman.
     
  3. Roark


    Nobel prizes are way overrated. They don't seem to mean jack anymore. They're more like a mark of idiocy than genius.
     
  4. He rediscovered America.
     
  5. bone


    At least half of the really successful traders I have met during my career are absolutely not what anyone would remotely consider to be anything above average in terms of IQ, or a Wonderlic score, or whatever metric you choose.

    Some are brilliant in every way - a minority, however from what I have observed.

    Takes all kinds.