Mackay Warned about Delusions, Then Was Himself Deluded by a Bubble

(p. B1) Can you spot a bubble?
Ever since 1841, when a Scottish journalist named Charles Mackay published the book known today as “Extraordinary Popular Delusions and the Madness of Crowds,” the answer has seemed clear. If you watch carefully for signs of euphoria, you can sidestep the damage when markets go mad.
But bubble spotting isn’t as simple as Mackay made it sound–even, it turns out, for Mackay himself. Investors should always guard against the glib assertions of pundits who claim they can detect bubbles before they burst.
. . .
But new research tells the untold tale of Mackay’s own behavior in the face of a bubble–and it is a shocker. A mathematician and former cryptographer at Bell Labs named Andrew Odlyzko has spent much of the past decade researching a forgotten stock mania. One of its biggest boosters was none other than Charles Mackay.
A bubble in British railroad stocks began in 1844, only three years after Mackay published his book, and it didn’t start to collapse until late 1845. Even with the history of market folly fresh in his mind, Mackay urged British investors to pile into railway stocks, whose extravagant prices were based on absurdly unrealistic projections of future growth.
The most famous critic of bubbles who ever lived fell like a chump for a craze that was unfolding before his very eyes. On Oct. 2, 1845, Mackay wrote that “those who sound the alarm of an approaching railway crisis have somewhat exaggerated the danger.”
He went on to ridicule anyone who argued that “the Railway mania of the present day” was similar to the devastating bubbles he had described in his own book. “There is no reason whatever to fear” a crash, he concluded.
He couldn’t have been more wrong. From 1845 to the bottom in 1850, railway stocks fell by two-thirds–the equivalent of roughly $1 trillion of losses in today’s money. Mackay never fessed up to his own extraordinary delusion.

For the full commentary, see:
JASON ZWEIG. “THE INTELLIGENT INVESTOR; The Extraordinary Popular Delusion of Bubble Spotting.” The Wall Street Journal (Sat., November 5, 2011): B1.
(Note: ellipsis added.)

Fantasizing about Achieving Goals Has an Opportunity Cost: It Saps the Energy to Actually Achieve Them

(p. C4) Fantasizing about achieving goals can make people less likely to achieve them, by sapping the energy required to do the necessary work, a study finds.
. . .
The researchers concluded: “Positive fantasies will sap job-seekers of the energy to pound the pavement, and drain the lovelorn of the energy to approach the one they like.”

For the full story, see:
Christopher Shea. “Week in Ideas; Psychology; Lost in Fantasy.” The Wall Street Journal (Sat., June 4, 2011): C4.
(Note: ellipsis added.)

The article summarized is:
Kappes, Heather Barry, and Gabriele Oettingen. “Positive Fantasies About Idealized Futures Sap Energy.” Journal of Experimental Social Psychology 47 (2011): 719-29.

The Costs of Altruism

PathologicalAltruismBK.jpg

Source of book image: http://www.barbaraoakley.com/_font_face__book_antiqua___font_size__3___i__b_pathological_altruism__i___b__106998.htm

(p. D1) On entering the patient’s room with spinal tap tray portentously agleam, Dr. Burton encountered the patient’s family members. They begged him not to proceed. The frail, bedridden patient begged him not to proceed. Dr. Burton conveyed their pleas to the oncologist, but the oncologist continued to lobby for a spinal tap, and the exhausted family finally gave in.
. . .
(p. D2) . . . , Dr. Burton is a contributor to a scholarly yet surprisingly sprightly volume called “Pathological Altruism,” to be published this fall by Oxford University Press. . . .
As the new book makes clear, pathological altruism is not limited to showcase acts of self-sacrifice, like donating a kidney or a part of one’s liver to a total stranger. The book is the first comprehensive treatment of the idea that when ostensibly generous “how can I help you?” behavior is taken to extremes, misapplied or stridently rhapsodized, it can become unhelpful, unproductive and even destructive.
. . .
David Brin, a physicist and science fiction writer, argues in one chapter that sanctimony can be as physically addictive as any recreational drug, and as destabilizing. “A relentless addiction to indignation may be one of the chief drivers of obstinate dogmatism,” he writes. . . .
Barbara Oakley, an associate professor of engineering at Oakland University in Michigan and an editor of the new volume, said in an interview that when she first began talking about its theme at medical or social science conferences, “people looked at me as though I’d just grown goat horns. They said, ‘But altruism by definition can never be pathological.’ ”
To Dr. Oakley, the resistance was telling. “It epitomized the idea ‘I know how to do the right thing, and when I decide to do the right thing it can never be called pathological,’ ” she said.
. . .
Yet given her professional background, Dr. Oakley couldn’t help doubting altruism’s exalted reputation. “I’m not looking at altruism as a sacred thing from on high,” she said. “I’m looking at it as an engineer.”

For the full story, see:
NATALIE ANGIER. “BASICS; The Pathological Altruist Gives Till Someone Hurts.” The New York Times (Tues., October 4, 2011): D1 & D2.
(Note: ellipses added.)
(Note: the online version of the article is dated October 3, 2011.)

Confirmation Bias (aka “Pigheadedness”) in Science

(p. 12) In a classic psychology experiment, people for and against the death penalty were asked to evaluate the different research designs of two studies of its deterrent effect on crime. One study showed that the death penalty was an effective deterrent; the other showed that it was not. Which of the two research designs the participants deemed the most scientifically valid depended mostly on whether the study supported their views on the death penalty.
In the laboratory, this is labeled confirmation bias; observed in the real world, it’s known as pigheadedness.
Scientists are not immune. In another experiment, psychologists were asked to review a paper submitted for journal publication in their field. They rated the paper’s methodology, data presentation and scientific contribution significantly more favorably when the paper happened to offer results consistent with their own theoretical stance. Identical research methods prompted a very different response in those whose scientific opinion was challenged.

For the full commentary, see:
CORDELIA FINE. “GRAY MATTER; Biased but Brilliant.” The New York Times, SundayReview Section (Sun., July 31, 2011): 12.
(Note: the online version of the article is dated July 30, 2011.)

Another Nod to Planck’s “Cynical View of Science”

The Max Planck view expressed in the quote below has been called “Planck’s Principle” and has been empirically tested in three papers cited at the end of this entry.

(p. 12) How’s this for a cynical view of science? “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Scientific truth, according to this view, is established less by the noble use of reason than by the stubborn exertion of will. One hopes that the Nobel Prize-winning physicist Max Planck, the author of the quotation above, was writing in an unusually dark moment.
And yet a large body of psychological data supports Planck’s view: we humans quickly develop an irrational loyalty to our beliefs, and work hard to find evidence that supports those opinions and to discredit, discount or avoid information that does not.

For the full commentary, see:
CORDELIA FINE. “GRAY MATTER; Biased but Brilliant.” The New York Times, SundayReview Section (Sun., July 31, 2011): 12.
(Note: the online version of the article is dated July 30, 2011.)

Three of my papers that present evidence on Planck’s Principle are:
“Age and the Acceptance of Cliometrics.” The Journal of Economic History 40, no. 4 (December 1980): 838-841.
“Planck’s Principle: Do Younger Scientists Accept New Scientific Ideas with Greater Alacrity than Older Scientists?” Science 202 (November 17, 1978): 717-723 (with David L. Hull and Peter D. Tessner).
“The Polywater Episode and the Appraisal of Theories.” In A. Donovan, L. Laudan and R. Laudan, eds., Scrutinizing Science: Empirical Studies of Scientific Change. Dordrecht, Holland: Kluwer Academic Publishers, 1988, 181-198.

Neuroscientist Sees Entrepreneurs as “Never Satisfied” Due to “Attenuated Dopamine Function”

Compass-of-Pleasure-BK.jpg

Source of book image: http://www.kurzweilai.net/images/The-Compass-of-Pleasure-Linden-David-J-9780670022588.jpg

David J. Linden, a professor of neuroscience at the Johns Hopkins University School of Medicine, is the author of The Compass of Pleasure.

(p. 4) . . . , the psychological profile of a compelling leader — think of tech pioneers like Jeff Bezos, Larry Ellison and Steven P. Jobs — is also that of the compulsive risk-taker, someone with a high degree of novelty-seeking behavior. In short, what we seek in leaders is often the same kind of personality type that is found in addicts, whether they are dependent on gambling, alcohol, sex or drugs.

How can this be? We typically see addicts as weak-willed losers, and chief executives and entrepreneurs are people with discipline and fortitude. To understand this apparent contradiction we need to look under the hood of the brain, and in particular at the functions that relate to pleasure and reward.
. . .
Crucially, genetic variants that suppress dopamine signaling in the pleasure circuit substantially increase pleasure- and novelty-seeking behaviors — their bearers must seek high levels of stimulation to reach the same level of pleasure that others can achieve with more moderate indulgence. Those blunted dopamine receptor variants are associated with substantially increased risk of addiction to a range of substances and behaviors.
. . .
The risk-taking, novelty-seeking and obsessive personality traits often found in addicts can be harnessed to make them very effective in the workplace. For many leaders, it’s not the case that they succeed in spite of their addiction; rather, the same brain wiring and chemistry that make them addicts also confer on them behavioral traits that serve them well.
So, when searching for your organization’s next leader, look for someone with an attenuated dopamine function: someone who is never satisfied with the status quo, someone who wants the feeling of success more than others — but likes it less.

For the full commentary, see:
DAVID J. LINDEN. “Addictive Personality? You Might Be a Leader.” The New York Times, SundayReview Section (Sun., July 24, 2011): 4.
(Note: ellipses added.)
(Note: the online version of the commentary is dated July 23, 2011.)

The book mentioned above is:
Linden, David J. The Compass of Pleasure: How Our Brains Make Fatty Foods, Orgasm, Exercise, Marijuana, Generosity, Vodka, Learning, and Gambling Feel So Good. New York: Viking Adult, 2011.

We Tend to Ignore Information that Contradicts Our Beliefs

BelievingBrainBK2011-08-09.jpg

Source of book image: online version of the WSJ review quoted and cited below.

We learn the most when our priors are contradicted. But the dissonance between evidence and beliefs is painful. So we often do not see, or soon forget, evidence that does not fit with our beliefs.
The innovative entrepreneur is often a person who sees, and forces herself to remember, the dissonant fact, storing it away to make sense of, or make use of, later. At the start, she may be alone in what she sees and what she remembers. So if we are to benefit from her ability and willingness to bear the pain of dissonance, she must have the freedom to differ, and she must have the financial wherewithal to support herself until her vision is more widely shared, better understood, and more fruitfully applied.

(p. A13) Beliefs come first; reasons second. That’s the insightful message of “The Believing Brain,” by Michael Shermer, the founder of Skeptic magazine. In the book, he brilliantly lays out what modern cognitive research has to tell us about his subject–namely, that our brains are “belief engines” that naturally “look for and find patterns” and then infuse them with meaning. These meaningful patterns form beliefs that shape our understanding of reality. Our brains tend to seek out information that confirms our beliefs, ignoring information that contradicts them. Mr. Shermer calls this “belief-dependent reality.” The well-worn phrase “seeing is believing” has it backward: Our believing dictates what we’re seeing.
. . .
One of the book’s most enjoyable discussions concerns the politics of belief. Mr. Shermer takes an entertaining look at academic research claiming to prove that conservative beliefs largely result from psychopathologies. He drolly cites survey results showing that 80% of professors in the humanities and social sciences describe themselves as liberals. Could these findings about psychopathological conservative political beliefs possibly be the result of the researchers’ confirmation bias?
As for his own political bias, Mr. Shermer says that he’s “a fiscally conservative civil libertarian.” He is a fan of old-style liberalism, as in liberality of outlook, and cites “The Science of Liberty” author Timothy Ferris’s splendid formulation: “Liberalism and science are methods, not ideologies.” The “scientific solution to the political problem of oppressive governments,” Mr. Shermer says, “is the tried-and-true method of spreading liberal democracy and market capitalism through the open exchange of information, products, and services across porous economic borders.”
But it is science itself that Mr. Shermer most heartily embraces. “The Believing Brain” ends with an engaging history of astronomy that illustrates how the scientific method developed as the only reliable way for us to discover true patterns and true agents at work. Seeing through a telescope, it seems, is believing of the best kind.

For the full review, see:
RONALD BAILEY. “A Trick Of the Mind; Looking for patterns in life and then infusing them with meaning, from alien intervention to federal conspiracy.” The Wall Street Journal (Weds., July 27, 2011): A13.
(Note: ellipsis added.)

Book reviewed:
Shermer, Michael. The Believing Brain: From Ghosts and Gods to Politics and Conspiracies—How We Construct Beliefs and Reinforce Them as Truths. New York: Times Books, 2011.