Kahneman Grants that “the Basic Concepts of Economics Are Essential Intellectual Tools”

(p. 286) Most graduate students in economics have heard about prospect theory and loss aversion, but you are unlikely to find these terms in the index of an introductory text in economics. I am sometimes pained by this omission, but in fact it is quite reasonable, because of the central role of rationality in basic economic theory. The standard concepts and results that undergraduates are taught are most easily explained by assuming that Econs do not make foolish mistakes. This assumption is truly necessary, and it would be undermined by introducing the Humans of prospect theory, whose evaluations of outcomes are unreasonably short-sighted.
There are good reasons for keeping prospect theory out of introductory texts. The basic concepts of economics are essential intellectual tools, which are not easy to grasp even with simplified and unrealistic assumptions about the nature of the economic agents who interact in markets. Raising questions about these assumptions even as they are introduced would be confusing, and perhaps demoralizing. It is reasonable to put priority on helping students acquire the basic tools of the discipline. Furthermore, the failure of rationality that is built into prospect theory is often irrelevant to the predictions of economic theory, which work out with great precision in some situations and provide good approximations in many others. In some contexts, however, the difference becomes significant: the Humans described by prospect theory are (p. 287) guided by the immediate emotional impact of gains and losses, not by long-term prospects of wealth and global utility.
I emphasized theory-induced blindness in my discussion of flaws in Bernoulli’s model that remained unquestioned for more than two centuries. But of course theory-induced blindness is not restricted to expected utility theory. Prospect theory has flaws of its own, and theory-induced blindness to these flaws has contributed to its acceptance as the main alternative to utility theory.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Sticking with Expected Utility Theory as an Example of “Theory-Induced Blindness”

(p. 286) Perhaps carried away by their enthusiasm, [Rabin and Thaler] . . . concluded their article by recalling the famous Monty Python sketch in which a frustrated customer attempts to return a dead parrot to a pet store. The customer uses a long series of phrases to describe the state of the bird, culminating in “this is an ex-parrot.” Rabin and Thaler went on to say that “it is time for economists to recognize that expected utility is an ex-hypothesis.” Many economists saw this flippant statement as little short of blasphemy. However, the theory-induced blindness of accepting the utility of wealth as an explanation of attitudes to small losses is a legitimate target for humorous comment.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: bracketed names and ellipsis added.)

A Marshmallow Now or an Elegant French Pastry Four Years Later


(p. 19) Growing up in the erratic care of a feckless single mother, “Kewauna seemed able to ignore the day-to-day indignities of life in poverty on the South Side and instead stay focused on her vision of a more successful future.” Kewauna tells Tough, “I always wanted to be one of those business ladies walking downtown with my briefcase, everybody saying, ‘Hi, Miss Lerma!’”

Here, as throughout the book, Tough nimbly combines his own reporting with the findings of scientists. He describes, for example, the famous “marshmallow experiment” of the psychologist Walter Mischel, whose studies, starting in the late 1960s, found that children who mustered the self-control to resist eating a marshmallow right away in return for two marshmallows later on did better in school and were more successful as adults.
“What was most remarkable to me about Kewauna was that she was able to marshal her prodigious noncognitive capacity — call it grit, conscientiousness, resilience or the ability to delay gratification — all for a distant prize that was, for her, almost entirely theoretical,” Tough observes of his young subject, who gets into college and works hard once she’s there. “She didn’t actually know any business ladies with briefcases downtown; she didn’t even know any college graduates except her teachers. It was as if Kewauna were taking part in an extended, high-stakes version of Walter Mischel’s marshmallow experiment, except in this case, the choice on offer was that she could have one marshmallow now or she could work really hard for four years, constantly scrimping and saving, staying up all night, struggling, sacrificing — and then get, not two marshmallows, but some kind of elegant French pastry she’d only vaguely heard of, like a napoleon. And Kewauna, miraculously, opted for the napoleon, even though she’d never tasted one before and didn’t know anyone who had. She just had faith that it was going to be delicious.”

For the full review, see:
ANNIE MURPHY PAUL. “School of Hard Knocks.” The New York Times Book Review (Sun., August 26, 2012): 19.
(Note: the online version of the article is dated August 23, 2012.)

The full reference for the book under review is:
Tough, Paul. How Children Succeed: Grit, Curiosity, and the Hidden Power of Character. Boston, MA: Houghton Mifflin Harcourt, 2012.

“Theory-Induced Blindness”

(p. 276) The mystery is how a conception of the utility of outcomes that is vulnerable to . . . obvious counterexamples survived for so long. I can explain (p. 277) it only by a weakness of the scholarly mind that I have often observed in myself. I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing. You give the theory the benefit of the doubt, trusting the community of experts who have accepted it. . . . As the psychologist Daniel Gilbert observed, disbelieving is hard work, and System 2 is easily tired.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: ellipses added.)

Premortem Reduces Bias from Uncritical Optimism

(p. 265) As a team converges on a decision–and especially when the leader tips her hand–public doubts about the wisdom of the planned move are gradually suppressed and eventually come to be treated as evidence of flawed loyalty to the team and its leaders. The suppression of doubt contributes to overconfidence in a group where only supporters of the decision have a voice. The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier. The premortem is not a panacea and does not provide complete protection against nasty surprises, but it goes some way toward reducing the damage of plans that are subject to the biases of WYSIATI and uncritical optimism.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

People “Reward the Providers of Dangerously Misleading Information”

(p. 262) As Nassim Taleb has argued, inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid. However, optimism is highly valued, socially and in the market; people and firms reward the providers of dangerously misleading information more than they reward truth tellers. One of the lessons of the financial crisis that led to the Great Recession is that there are periods in which competition, among experts and among organizations, creates powerful forces that favor a collective blindness to risk and uncertainty.
The social and economic pressures that favor overconfidence are not (p. 263) restricted to financial forecasting. Other professionals must deal with the fact that an expert worthy of the name is expected to display high confidence. Philip Tetlock observed that the most overconfident experts were the most likely to be invited to strut their stuff in news shows. Overconfidence also appears to be endemic in medicine. A study of patients who died in the ICU compared autopsy results with the diagnosis that physicians had provided while the patients were still alive. Physicians also reported their confidence. The result: “clinicians who were ‘completely certain’ of the diagnosis antemortem were wrong 40% of the time.” Here again, expert overconfidence is encouraged by their clients: “Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure. Confidence is valued over uncertainty and there is a prevailing censure against disclosing uncertainty to patients.” Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients. An unbiased appreciation of uncertainty is a cornerstone of rationality–but it is not what people and organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred solution.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Big Firm CFOs Were Confident about Their “Worthless” Stock Forecasts

(p. 261) For a number of years, professors at Duke University conducted a survey in which the chief financial officers of large corporations estimated the returns of the Standard & Poor’s index over the following year. The Duke scholars collected 11,600 such forecasts and examined their accuracy. The conclusion was straightforward: financial officers of large corporations had no clue about the short-term future of the stock market; the correlation between their estimates and the true value was slightly less than zero! When they said the market would go down, it was slightly more likely than not that it would go up. These findings are not surprising. The truly bad news is that the CFOs did not appear to know that their forecasts were worthless.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Failed Entrepreneurial Firms that Signal New Markets Are “Optimistic Martyrs”

(p. 260) Colin Camerer and Dan Lovallo, who coined the concept of competition neglect, illustrated it with a quote from the then chairman of Disney Studios. Asked why so many expensive big-budget movies are released on the same days (such as Memorial Day and Independence Day), he replied: “Hubris. Hubris. If you only think about your own business, you think, ‘I’ve got a good story department, I’ve got a good marketing department, we’re (p. 261) going to go out and do this.’ And you don’t think that everybody else is thinking the same way. In a given weekend in a year you’ll have five movies open, and there’s certainly not enough people to go around.”
The candid answer refers to hubris, but it displays no arrogance, no conceit of superiority to competing studios. The competition is simply not part of the decision, in which a difficult question has again been replaced by an easier one. The question that needs an answer is this: Considering what others will do, how many people will see our film? The question the studio executives considered is simpler and refers to knowledge that is most easily available to them: Do we have a good film and a good organization to market it? The familiar System 1 processes of WYSIATI and substitution produce both competition neglect and the above-average effect. The consequence of competition neglect is excess entry: more competitors enter the market than the market can profitably sustain, so their average outcome is a loss. The outcome is disappointing for the typical entrant in the market, but the effect on the economy as a whole could well be positive. In fact, Giovanni Dosi and Dan Lovallo call entrepreneurial firms that fail but signal new markets to more qualified competitors “optimistic martyrs”–good for the economy but bad for their investors.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Overly Optimistic Entrepreneurs Seek Government Support for Projects that Will Usually Fail

People have a right to be overly optimistic when they invest their own money in entrepreneurial projects. But governments should be prudent caretakers of the money they have taken from taxpayers. The overly optimistic bias of subsidy-seeking entrepreneurs weakens the case for government support of entrepreneurial projects.

(p. 259) The optimistic risk taking of entrepreneurs surely contributes to the economic dynamism of a capitalistic society, even if most risk takers end up disappointed. However, Marta Coelho of the London School of Economics has pointed out the difficult policy issues that arise when founders of small businesses ask the government to support them in decisions that are most likely to end badly. Should the government provide loans to would-be entrepreneurs who probably will bankrupt themselves in a few years? Many behavioral economists are comfortable with the “libertarian paternalistic” procedures that help people increase their savings rate beyond what they would do on their own. The question of whether and how government should support small business does not have an equally satisfying answer.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

For Inventors “Optimism Is Widespread, Stubborn, and Costly”

(p. 257) One of the benefits of an optimistic temperament is that it encourages persistence in the face of obstacles. But persistence can be costly. An impressive series of studies by Thomas Åstebro sheds light on what happens when optimists receive bad news. He drew his data from a Canadian organization–the Inventors Assistance Program–which collects a small fee to provide inventors with an objective assessment of the commercial prospects of their idea. The evaluations rely on careful ratings of each invention on 37 criteria, including need for the product, cost of production, and estimated trend of demand. The analysts summarize their ratings by a letter grade, where D and E predict failure–a prediction made for over 70% of the inventions they review. The forecasts of failure are remarkably accurate: only 5 of 411 projects that were given the lowest grade reached commercialization, and none was successful.
Discouraging news led about half of the inventors to quit after receiving a grade that unequivocally predicted failure. However, 47% of them continued development efforts even after being told that their project was hopeless, and on average these persistent (or obstinate) individuals doubled their initial losses before giving up. Significantly, persistence after discouraging advice was relatively common among inventors who had a high score on a personality measure of optimism–on which inventors generally scored higher than the general population. Overall, the return on private invention was small, “lower than the return on private equity and on high-risk securities.” More generally, the financial benefits of self-employment are mediocre: given the same qualifications, people achieve higher average returns by selling their skills to employers than by setting out on their own. The evidence suggests that optimism is widespread, stubborn, and costly.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Entrepreneurs Are Optimistic About the Odds of Success

(p. 256) The chances that a small business will survive for five years in the United States are about 35%. But the individuals who open such businesses do not believe that the statistics apply to them. A survey found that American entrepreneurs tend to believe they are in a promising line of business: their (p. 257) average estimate of the chances of success for “any business like yours” was 60%–almost double the true value. The bias was more glaring when people assessed the odds of their own venture. Fully 81% of the entrepreneurs put their personal odds of success at 7 out of 10 or higher, and 33% said their chance of failing was zero.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.