Kahneman Preaches that People Can and Should Act More Rationally

(p. 338) . . . I have a sermon ready for Sam if he rejects the offer of a single highly favorable gamble played once, and for you if you share his unreasonable aversion to losses:

I sympathize with your aversion to losing any gamble, but it is costing you a lot of money. Please consider this question: Are you on your deathbed? Is this the last offer of a small favorable gamble that you will ever consider? Of course, you are unlikely to be offered exactly this gamble again, but you will have many opportunities to consider attractive gambles with stakes that are very small relative to your wealth. You will do yourself a large financial favor if you are able to see each of these gambles as part of a bundle of small gambles and rehearse the mantra that will get you significantly closer to economic rationality: you win a few, you lose a few. The main purpose of the mantra is to control your emotional response when you do lose. If you can trust it to be effective, you should remind yourself of it when deciding whether or not to accept a small risk with positive expected value. Remember these qualifications when using the mantra:

  • It works when the gambles are genuinely independent of each other; it does not apply to multiple investments in the same industry, which would all go bad together.

(p. 339)

  • It works only when the possible loss does not cause you to worry about your total wealth. If you would take the loss as significant bad news about your economic future, watch it!
  • It should not be applied to long shots, where the probability of winning is very small for each bet.

If you have the emotional discipline that this rule requires, you will never consider a small gamble in isolation or be loss averse for a small gamble until you are actually on your deathbed and not even then.

This advice is not impossible to follow. Experienced traders in financial markets live by it every day, shielding themselves from the pain of losses by broad framing. As was mentioned earlier, we now know that experimental subjects could be almost cured of their loss aversion (in a particular context) by inducing them to “think like a trader,” just as experienced baseball card traders are not as susceptible to the endowment effect as novices are. Students made risky decisions (to accept or reject gambles in which they could lose) under different instructions. In the narrow-framing condition, they were told to “make each decision as if it were the only one” and to accept their emotions. The instructions for broad framing of a decision included the phrases “imagine yourself as a trader,” “you do this all the time,” and “treat it as one of many monetary decisions, which will sum together to produce a ‘portfolio’.” The experimenters assessed the subjects’ emotional response to gains and losses by physiological measures, including changes in the electrical conductance of the skin that are used in lie detection. As expected, broad framing blunted the emotional reaction to losses and increased the willingness to take risks.
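Kahneman's "bundle of small gambles" advice can be made concrete with a short simulation. The stakes below (a 50/50 chance of winning $200 or losing $100) are illustrative assumptions, not figures from this excerpt: for any favorable gamble, bundling many independent plays makes a net loss increasingly rare, which is what the broad frame lets you see.

```python
import random

random.seed(0)

def prob_net_loss(n_gambles, trials=20_000):
    """Estimate the probability of ending with a net loss when
    n_gambles independent 50/50 gambles (win $200, lose $100)
    are evaluated as one bundle (illustrative stakes)."""
    losses = 0
    for _ in range(trials):
        total = sum(200 if random.random() < 0.5 else -100
                    for _ in range(n_gambles))
        if total < 0:
            losses += 1
    return losses / trials

# The probability of a net loss falls sharply as gambles are bundled:
# roughly one half for a single gamble, and near zero for 100 of them.
for n in (1, 10, 100):
    print(f"{n:>3} gambles: P(net loss) = {prob_net_loss(n):.3f}")
```

The simulation is just the law of large numbers at work: each gamble has a positive expected value, so the average outcome of a large independent bundle is very unlikely to be negative, which is why the narrow frame (one gamble in isolation) exaggerates the risk.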

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: ellipsis added; italics in original.)

How a Group of “Natural Philosophers” Created Science in a London “Full of Thieves, Murderers and Human Waste”


Source of book image: http://www.edwarddolnick.net/images/clockworkuniverse-cover.jpg

(p. 19) London before the mid-1600s was a general calamity. The streets were full of thieves, murderers and human waste. Death was everywhere: doctors were hapless, adults lived to about age 30, children died like flies. In 1665, plague moved into the city, killing sometimes 6,000 people a week. In 1666, an unstoppable fire burned the city to the ground; the bells of St. Paul’s melted. Londoners thought that the terrible voice of God was “roaring in the City,” one witness wrote, and they would do best to accept the horror, calculate their sins, pray for guidance and await retribution.

In the midst of it all, a group of men whose names we still learn in school formed the Royal Society of London for the Improvement of Natural Knowledge. They thought that God, while an unforgiving judge, was also a mathematician. As such, he had organized the universe according to discernible, mathematical law, which, if they tried, they could figure out. They called themselves “natural philosophers,” and their motto was “Nullius in verba”: roughly, take no one’s word for anything. You have an idea? Demonstrate it, do an experiment, prove it. The ideas behind the Royal Society would flower into the Enlightenment, the political, cultural, scientific and educational revolution that gave rise to the modern West.
This little history begins Edward Dolnick’s “Clockwork Universe,” so the reader might think the book is about the Royal Society and its effects. But the Royal Society is dispatched in the first third of the book, and thereafter, the subject is how the attempt to find the mathematics governing the universe played out in the life of Isaac Newton.
. . .
To go from sinful “curiositas” to productive “curiosity,” from blind acceptance to open-eyed inquiry, from asking, “Why?” to answering, “How?” — this change, of all the world’s revolutions, must surely be the most remarkable.

For the full review, see:
ANN FINKBEINER. “Masters of the Universe.” The New York Times Book Review (Sun., March 27, 2011): 19.
(Note: the online version of the review has the date March 25, 2011, and has the title “What Newton Gave Us.”)

The full reference for the book under review is:
Dolnick, Edward. The Clockwork Universe: Isaac Newton, the Royal Society, and the Birth of the Modern World. New York: HarperCollins Publishers, 2011.

Reference Point Ignored Due to “Theory-Induced Blindness”

(p. 290) The omission of the reference point from the indifference map is a surprising case of theory-induced blindness, because we so often encounter cases in which the reference point obviously matters. In labor negotiations, it is well understood by both sides that the reference point is the existing contract and that the negotiations will focus on mutual demands for concessions relative to that reference point. The role of loss aversion in bargaining is also well understood: making concessions hurts. You have much (p. 291) personal experience of the role of reference point. If you changed jobs or locations, or even considered such a change, you surely remember that the features of the new place were coded as pluses or minuses relative to where you were. You may also have noticed that disadvantages loomed larger than advantages in this evaluation–loss aversion was at work. It is difficult to accept changes for the worse. For example, the minimal wage that unemployed workers would accept for new employment averages 90% of their previous wage, and it drops by less than 10% over a period of one year.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

“Science Is Weakest in the Lands of Islam”


Source of book image: http://photo.goodreads.com/books/1327925578l/10379376.jpg

(p. 18) The upshot was, while the Greek works in particular were disappearing in Europe, they were being preserved in Arabic to be retranslated later into Latin for a rebirth of “lost” knowledge. This is one half of the point the author makes frequently in the text and, in boldface, as the book’s subtitle.

The other half is that contrary to some doubters, the Arab interest in learning extended well beyond translations: thinkers working alone or in observatories and houses of wisdom were conducting original research during “the world’s most impressive period of scholarship and learning since ancient Greece.” Accordingly, al-Khalili writes that al-Mamun stands as “the greatest patron of science in the cavalcade of Islamic rulers.”
Sometimes al-Khalili, like a lawyer who suspects a jury of unyielding skepticism, strains to give stature to the leading lights of Arabic science in the Middle Ages. But modern historians of science agree that more attention should be given to the Arab contribution to the preservation and expansion of knowledge at this critical period, and the author has done so in considerable detail and with rising passion.
But that was then, and al-Khalili is obligated to end on an inescapable but deflating note: science today is in a chronic state of neglect in the Arab world and the broader Islamic culture of more than one billion people. Al-Khalili spreads the blame widely, citing inadequate financing for research and education, sclerotic bureaucracies, religious conservatism, even an ingrained fear of science. The Pakistani physicist Abdus Salam, perhaps the greatest Muslim scientist of the last century, won a Nobel Prize in 1979 and did what he could to promote a scientific renaissance among his people, without success. “Of all civilizations on this planet, science is weakest in the lands of Islam,” Salam said in despair. “The dangers of this weakness cannot be overemphasized since the honorable survival of a society depends directly on its science and technology in the condition of the present age.”
By recounting Arabic science’s luminous past, al-Khalili says, he hopes to instill a sense of pride that will “propel the importance of scientific enquiry back to where it belongs: at the very heart of what defines a civilized and enlightened society.”

For the full review, see:
JOHN NOBLE WILFORD. “The Muslim Art of Science.” The New York Times Book Review (Sun., May 22, 2011): 18.
(Note: the online version of the review has the date May 20, 2011.)

The full reference for the book under review is:
al-Khalili, Jim. The House of Wisdom: How Arabic Science Saved Ancient Knowledge and Gave Us the Renaissance. New York: The Penguin Press, 2010.

Kahneman Grants that “the Basic Concepts of Economics Are Essential Intellectual Tools”

(p. 286) Most graduate students in economics have heard about prospect theory and loss aversion, but you are unlikely to find these terms in the index of an introductory text in economics. I am sometimes pained by this omission, but in fact it is quite reasonable, because of the central role of rationality in basic economic theory. The standard concepts and results that undergraduates are taught are most easily explained by assuming that Econs do not make foolish mistakes. This assumption is truly necessary, and it would be undermined by introducing the Humans of prospect theory, whose evaluations of outcomes are unreasonably short-sighted.
There are good reasons for keeping prospect theory out of introductory texts. The basic concepts of economics are essential intellectual tools, which are not easy to grasp even with simplified and unrealistic assumptions about the nature of the economic agents who interact in markets. Raising questions about these assumptions even as they are introduced would be confusing, and perhaps demoralizing. It is reasonable to put priority on helping students acquire the basic tools of the discipline. Furthermore, the failure of rationality that is built into prospect theory is often irrelevant to the predictions of economic theory, which work out with great precision in some situations and provide good approximations in many others. In some contexts, however, the difference becomes significant: the Humans described by prospect theory are (p. 287) guided by the immediate emotional impact of gains and losses, not by long-term prospects of wealth and global utility.
I emphasized theory-induced blindness in my discussion of flaws in Bernoulli’s model that remained unquestioned for more than two centuries. But of course theory-induced blindness is not restricted to expected utility theory. Prospect theory has flaws of its own, and theory-induced blindness to these flaws has contributed to its acceptance as the main alternative to utility theory.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Models Often “Ignore the Messiness of Reality”


Source of book image: http://www.namingandtreating.com/wp-content/uploads/2011/04/SuperCooperators_small.png

(p. 18) Nowak is one of the most exciting modelers working in the field of mathematical biology today. But a model, of course, is only as good as its assumptions, and biology is much messier than physics or chemistry. Nowak tells a joke about a man who approaches a shepherd and asks, “If I tell you how many sheep you have, can I have one?” The shepherd agrees and is astonished when the stranger answers, “Eighty-three.” As he turns to leave, the shepherd retorts: “If I guess your profession, can I have the animal back?” The stranger agrees. “You must be a mathematical biologist.” How did he know? “Because you picked up my dog.”

. . .
Near the end of the book, Nowak describes Gustav Mahler’s efforts, in his grandiloquent Third Symphony, to create an all-encompassing structure in which “nature in its totality may ring and resound,” adding, “In my own way, I would like to think I have helped to give nature her voice too.” But there remains a telling gap between the precision of the models and the generality of the advice Nowak offers for turning us all into supercooperators. We humans really are infinitely more complex than falling apples, metastasizing colons, even ant colonies. Idealized accounts of the world often need to ignore the messiness of reality. Mahler understood this. In 1896 he invited Bruno Walter to Lake Attersee to glimpse the score of the Third. As they walked beneath the mountains, Walter admonished Mahler to look at the vista, to which he replied, “No use staring up there — I’ve already composed it all away into my symphony!”

For the full review, see:
OREN HARMAN. “A Little Help from Your Friends.” The New York Times Book Review (Sun., April 10, 2011): 18.
(Note: ellipsis added.)
(Note: the online version of the review has the date April 8, 2011, and has the title “How Evolution Explains Altruism.”)

The full reference for the book under review is:
Nowak, Martin A., and Roger Highfield. Supercooperators: Altruism, Evolution, and Why We Need Each Other to Succeed. New York: Free Press, 2011.

Sticking with Expected Utility Theory as an Example of “Theory-Induced Blindness”

(p. 286) Perhaps carried away by their enthusiasm, [Rabin and Thaler] . . . concluded their article by recalling the famous Monty Python sketch in which a frustrated customer attempts to return a dead parrot to a pet store. The customer uses a long series of phrases to describe the state of the bird, culminating in “this is an ex-parrot.” Rabin and Thaler went on to say that “it is time for economists to recognize that expected utility is an ex-hypothesis.” Many economists saw this flippant statement as little short of blasphemy. However, the theory-induced blindness of accepting the utility of wealth as an explanation of attitudes to small losses is a legitimate target for humorous comment.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: bracketed names and ellipsis added.)

A Marshmallow Now or an Elegant French Pastry Four Years Later


Source of book image: http://images.amazon.com/images/G/01/richmedia/images/cover.gif

(p. 19) Growing up in the erratic care of a feckless single mother, “Kewauna seemed able to ignore the day-to-day indignities of life in poverty on the South Side and instead stay focused on her vision of a more successful future.” Kewauna tells Tough, “I always wanted to be one of those business ladies walking downtown with my briefcase, everybody saying, ‘Hi, Miss Lerma!’”

Here, as throughout the book, Tough nimbly combines his own reporting with the findings of scientists. He describes, for example, the famous “marshmallow experiment” of the psychologist Walter Mischel, whose studies, starting in the late 1960s, found that children who mustered the self-control to resist eating a marshmallow right away in return for two marshmallows later on did better in school and were more successful as adults.
“What was most remarkable to me about Kewauna was that she was able to marshal her prodigious noncognitive capacity — call it grit, conscientiousness, resilience or the ability to delay gratification — all for a distant prize that was, for her, almost entirely theoretical,” Tough observes of his young subject, who gets into college and works hard once she’s there. “She didn’t actually know any business ladies with briefcases downtown; she didn’t even know any college graduates except her teachers. It was as if Kewauna were taking part in an extended, high-stakes version of Walter Mischel’s marshmallow experiment, except in this case, the choice on offer was that she could have one marshmallow now or she could work really hard for four years, constantly scrimping and saving, staying up all night, struggling, sacrificing — and then get, not two marshmallows, but some kind of elegant French pastry she’d only vaguely heard of, like a napoleon. And Kewauna, miraculously, opted for the napoleon, even though she’d never tasted one before and didn’t know anyone who had. She just had faith that it was going to be delicious.”

For the full review, see:
ANNIE MURPHY PAUL. “School of Hard Knocks.” The New York Times Book Review (Sun., August 26, 2012): 19.
(Note: the online version of the article is dated August 23, 2012.)

The full reference for the book under review is:
Tough, Paul. How Children Succeed: Grit, Curiosity, and the Hidden Power of Character. Boston, MA: Houghton Mifflin Harcourt, 2012.

“Theory-Induced Blindness”

(p. 276) The mystery is how a conception of the utility of outcomes that is vulnerable to . . . obvious counterexamples survived for so long. I can explain (p. 277) it only by a weakness of the scholarly mind that I have often observed in myself. I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing. You give the theory the benefit of the doubt, trusting the community of experts who have accepted it. . . . As the psychologist Daniel Gilbert observed, disbelieving is hard work, and System 2 is easily tired.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: ellipses added.)

Premortem Reduces Bias from Uncritical Optimism

(p. 265) As a team converges on a decision–and especially when the leader tips her hand–public doubts about the wisdom of the planned move are gradually suppressed and eventually come to be treated as evidence of flawed loyalty to the team and its leaders. The suppression of doubt contributes to overconfidence in a group where only supporters of the decision have a voice. The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier. The premortem is not a panacea and does not provide complete protection against nasty surprises, but it goes some way toward reducing the damage of plans that are subject to the biases of WYSIATI and uncritical optimism.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Economists Optimistic that Economy Can Adapt to Climate Change


Source of book image: http://www.bibliovault.org/thumbs/978-0-226-47988-0-frontcover.jpg

(p. 222) Efficient policy decisions regarding climate change require credible estimates of the future costs of possible (in)action. The edited volume by Gary Libecap and Richard Steckel contributes to this important policy discussion by presenting work estimating the ability of economic actors to adapt to a changing climate. The eleven contributed research chapters primarily focus on the historical experience of the United States and largely on the agricultural sector. While the conclusions are not unanimous, on average, the authors tend to present an optimistic perspective on the ability of the economy to adapt to climate change.

For the full review, see:
Swoboda, Aaron. “Review of: The Economics of Climate Change: Adaptations Past and Present.” Journal of Economic Literature 50, no. 1 (March 2012): 222-24.

Book under review:
Libecap, Gary D., and Richard H. Steckel, eds. The Economics of Climate Change: Adaptations Past and Present, National Bureau of Economic Research Conference Report. Chicago: University of Chicago Press, 2011.