# Is Self-Knowledge Overrated?

Discussion in 'Psychology' started by bigarrow, Nov 2, 2011.

1. ### bigarrow

Interesting article below:

Is Self-Knowledge Overrated?

Daniel Kahneman, a Nobel Prize-winning psychologist and the author of the new book "Thinking, Fast and Slow," changed the way people think about thinking by asking them questions. They weren't trick questions, either. Instead, Kahneman relied almost exclusively on straightforward surveys, in which he described various scenarios. Here's a sample:

The U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. If program A is adopted, 200 people will be saved. If program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved. Which of the two programs would you favor?
When Kahneman put this question to a few hundred physicians, seventy-two per cent chose option A, the safe-and-sure strategy. Most doctors would rather save a certain number of people for sure than risk the possibility that everyone might die.

Now consider this scenario:

The U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. If program C is adopted, 400 people will die. If program D is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die. Which of the two programs would you favor?
The two different hypotheticals, of course, examine identical dilemmas: saving one-third of the population is the same as losing two-thirds. And yet, doctors reacted very differently depending on how the question was framed. When the possible outcomes were stated in terms of deaths (and not survivors), physicians were suddenly eager to take chances: seventy-eight per cent chose option D.
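The equivalence of the two framings is easy to verify with arithmetic: every program describes an expected 200 survivors out of 600. A quick check:

```python
# Expected survivors under each program. Programs C and D are the same
# gambles as A and B, restated as deaths instead of lives saved.
TOTAL = 600

# Framing 1: outcomes stated as lives saved
ev_a = 200                          # Program A: 200 saved for certain
ev_b = (1/3) * TOTAL + (2/3) * 0    # Program B: 1/3 chance all 600 saved

# Framing 2: outcomes stated as deaths
ev_c = TOTAL - 400                  # Program C: 400 die, so 200 survive
ev_d = (1/3) * TOTAL + (2/3) * 0    # Program D: 1/3 chance nobody dies

print(ev_a, ev_b, ev_c, ev_d)  # all four describe 200 expected survivors
```

A rational agent indifferent to framing would treat A/C and B/D identically; the physicians did not.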

Why are doctors so inconsistent? Kahneman and his longtime collaborator, Amos Tversky, explained these contradictory responses in terms of loss aversion, or the fact that losses hurt more than gains feel good. In fact, people hate losses so much that merely framing a choice in terms of a potential loss can shift their preferences. Like those physicians, people are suddenly willing to risk losing everything if there's a chance they might lose nothing.
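Kahneman and Tversky later formalized this asymmetry in prospect theory's value function, in which losses are weighted more than twice as heavily as equivalent gains. A minimal sketch, using the parameter estimates (λ ≈ 2.25, α = β = 0.88) from their 1992 paper:

```python
# Prospect-theory value function (a sketch; the parameters below are
# the median estimates from Tversky & Kahneman, 1992).
LAMBDA = 2.25   # loss-aversion coefficient: losses loom ~2.25x larger
ALPHA = 0.88    # curvature for gains (diminishing sensitivity)
BETA = 0.88     # curvature for losses

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** BETA

# A $100 loss hurts considerably more than a $100 gain feels good:
print(value(100), value(-100))
```

The kink at zero is the formal expression of loss aversion: the function is steeper for losses than for gains.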

Although our dislike of losses might seem obvious ("You need to have studied economics for many years before you'd be surprised by my research; it didn't shock my mother at all," Kahneman says), the discovery of loss aversion proved to be an important refutation of human rationality. As Kahneman and Tversky demonstrated, real people are nothing like homo economicus, that imaginary species featured in macroeconomics textbooks: they don't deal with uncertainty by carefully evaluating all of the relevant information. They stink at statistics and rarely maximize utility. Instead, their choices depend on a long list of mental short cuts and intemperate emotions, which often lead them to pick the wrong options.

Since the Israeli psychologists began studying loss aversion in the early nineteen-seventies, it has been used to explain a stunning variety of irrational behaviors, from the misguided decisions of investors (they refuse to sell losing stocks) to the stickiness of condo prices in the aftermath of a housing bubble. It's been used to justify our fondness for the status quo (the present may stink, but we still don't want to lose it) and the cowardice of N.F.L. coaches, who are far too afraid to go for it on fourth down. Loss aversion even excuses our social habits: studies have shown that it generally takes at least five kind comments to compensate for a single criticism. (The ratios are even worse for criminals: a person convicted of murder must perform at least twenty-five acts of "life-saving heroism" before he is forgiven.) This is an impressive amount of explanatory firepower for a theory rooted in hypotheticals.

Itâs impossible to overstate the influence of Kahneman and Tversky. Like Darwin, they helped to dismantle a longstanding myth of human exceptionalism. Although weâd always seen ourselves as rational creaturesâthis was our Promethean giftâit turns out that human reason is rather feeble, easily overwhelmed by ancient instincts and lazy biases. The mind is a deeply flawed machine.

Nevertheless, there is a subtle optimism lurking in all of Kahneman's work: it is the hope that self-awareness is a form of salvation, that if we know about our mental mistakes, we can avoid them. One day, we will learn to equally weigh losses and gains; science can help us escape from the cycle of human error. As Kahneman and Tversky noted in the final sentence of their classic 1974 paper, "A better understanding of these heuristics and of the biases to which they lead could improve judgments and decisions in situations of uncertainty." Unfortunately, such hopes appear to be unfounded. Self-knowledge isn't a cure for irrationality; even when we know why we stumble, we still find a way to fall.

Consider the story of Harry Markowitz, a Nobel Prize-winning economist who largely invented the field of investment-portfolio theory. By relying on a set of complicated equations, Markowitz was able to calculate the optimal mix of financial assets. (Due to loss aversion, most investors hold too many low-risk bonds, but Markowitz's work helped minimize the effect of the bias by mathematizing the decision.) Markowitz, however, was incapable of using his own research, at least when setting up his personal retirement fund. "I should have computed the historical co-variances of the asset classes and drawn an efficient frontier," Markowitz later confessed. "Instead, I visualized my grief if the stock market … went way down and I was completely in it. My intention was to minimize my future regret. So I split my contributions 50/50 between bonds and equities."
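What "computing the covariances and drawing an efficient frontier" means is easiest to see in the two-asset case, where the minimum-variance portfolio has a closed form. A sketch with purely illustrative numbers (the variances and covariance below are assumptions, not historical estimates):

```python
# Two-asset mean-variance sketch: find the stock/bond mix that
# minimizes portfolio variance. All inputs are illustrative assumptions.
var_stocks = 0.04    # variance of stock returns (20% std dev)
var_bonds = 0.0049   # variance of bond returns (7% std dev)
cov = 0.0014         # covariance between the two asset classes

# Closed-form weight in stocks for the minimum-variance portfolio:
# w = (var_bonds - cov) / (var_stocks + var_bonds - 2*cov)
w = (var_bonds - cov) / (var_stocks + var_bonds - 2 * cov)

# Portfolio variance at that weight
port_var = (w**2 * var_stocks + (1 - w)**2 * var_bonds
            + 2 * w * (1 - w) * cov)

print(f"stocks: {w:.1%}, bonds: {1 - w:.1%}, "
      f"portfolio std dev: {port_var**0.5:.1%}")
```

The point of the exercise is that diversification is driven by the covariance term, not by splitting contributions 50/50; Markowitz's regret-minimizing split ignores exactly the quantity his theory is built on.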

Football coaches have performed just as badly. Although it's now clear that their biases have a meaningful impact (a coach immune to loss aversion would win one more game in three seasons out of every four), their collective decision-making hasn't improved.

This same theme applies to practically all of our thinking errors: self-knowledge is surprisingly useless. Teaching people about the hazards of multitasking doesn't lead to less texting in the car; learning about the weakness of the will doesn't increase the success of diets; knowing that most people are overconfident about the future doesn't make us more realistic. (We're a bundle of contradictions. Kahneman has also studied the overconfidence bias: the vast majority of entrepreneurs believe they will be in business in five years even though sixty-five per cent fold.) The problem isn't that we're stupid; it's that we're so damn stubborn.

Kahneman, of course, knows all this. One of the most refreshing things about "Thinking, Fast and Slow" is his deep sense of modesty: he is that rare guru who doesn't promise to change your life. In fact, Kahneman admits that his decades of groundbreaking research have failed to significantly improve his own mental performance. "My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy" (a tendency to underestimate how long it will take to complete a task) "as it was before I made a study of these issues," he writes. As a result, his goals for his work are charmingly narrow: he merely hopes to "enrich the vocabulary that people use" when they talk about the mind.

This new book will certainly accomplish that: Kahneman has given us a new set of labels for our shortcomings. But his greatest legacy, perhaps, is also his bleakest: by categorizing our cognitive flaws, documenting not just our errors but also their embarrassing predictability, he has revealed the hollowness of a very ancient aspiration. Knowing thyself is not enough. Not even close.
newyorker.com, by JONAH LEHRER • OCT. 25, 2011

2. ### nutmeg

The U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people.

Two programs to combat the disease have been proposed.

Program A is: Do I have enough Merck in my portfolio?

Program B is: How will this affect the Asian vote?

Know thyself? More like Know thy Congressman.

3. ### Roark

Nobel prizes are way overrated. They don't seem to mean jack anymore. They're more like a mark of idiocy than genius.