Profits Allow You to Make Great Products, But the Products, Not the Profits, Are the Motivation

The following passage is Steve Jobs speaking, as quoted by Walter Isaacson.

(p. 567) My passion has been to build an enduring company where people were motivated to make great products. Everything else was secondary. Sure, it was great to make a profit, because that was what allowed you to make great products. But the products, not the profits, were the motivation. Sculley flipped these priorities to where the goal was to make money. It’s a subtle difference, but it ends up meaning everything: the people you hire, who gets promoted, what you discuss in meetings.

Source:
Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

Steve Jobs’ “Nasty Edge” Helped Him Create an Apple “Crammed with A Players”

(p. 565) . . . I think . . . [Jobs] actually could have controlled himself, if he had wanted. When he hurt people, it was not because he was lacking in emotional awareness. Quite the contrary: He could size people up, understand their inner thoughts, and know how to relate to them, cajole them, or hurt them at will.
The nasty edge to his personality was not necessary. It hindered him more than it helped him. But it did, at times, serve a purpose. Polite and velvety leaders, who take care to avoid bruising others, are generally not as effective at forcing change. Dozens of the colleagues whom Jobs most abused ended their litany of horror stories by saying that he got them to do things they never dreamed possible. And he created a corporation crammed with A players.

Source:
Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.
(Note: ellipses and bracketed “Jobs” added.)

Behavioral Economists and Psychologists Pledged to Keep Silent on Their Advice to Re-Elect Obama

(p. D1) Late last year Matthew Barzun, an official with the Obama campaign, called Craig Fox, a psychologist in Los Angeles, and invited him to a political planning meeting in Chicago, according to two people who attended the session.
“He said, ‘Bring the whole group; let’s hear what you have to say,’ ” recalled Dr. Fox, a behavioral economist at the University of California, Los Angeles.
So began an effort by a team of social scientists to help their favored candidate in the 2012 presidential election. Some members of the team had consulted with the Obama campaign in the 2008 cycle, but the meeting in January signaled a different direction.
“The culture of the campaign had changed,” Dr. Fox said. “Before then I felt like we had to sell ourselves; this time there was a real hunger for our ideas.”
. . .
(p. D6) When asked about the outside psychologists, the Obama campaign would neither confirm nor deny a relationship with them.
. . .
For their part, consortium members said they did nothing more than pass on research-based ideas, in e-mails and conference calls. They said they could talk only in general terms about the research, because they had signed nondisclosure agreements with the campaign.
In addition to Dr. Fox, the consortium included Susan T. Fiske of Princeton University; Samuel L. Popkin of the University of California, San Diego; Robert Cialdini, a professor emeritus at Arizona State University; Richard H. Thaler, a professor of behavioral science and economics at the University of Chicago’s business school; and Michael Morris, a psychologist at Columbia.
“A kind of dream team, in my opinion,” Dr. Fox said.

For the full story, see:
BENEDICT CAREY. “Academic ‘Dream Team’ Helped Obama’s Effort.” The New York Times (Tues., November 13, 2012): D1 & D6.
(Note: ellipses added.)
(Note: the online version of the story has the date November 12, 2012.)

The Universality of Values: Every Kid Wants a Cell Phone

(p. 528) When they got to Istanbul, . . . [Jobs] hired a history professor to give his family a tour. At the end they went to a Turkish bath, where the professor’s lecture gave Jobs an insight about the globalization of youth:

I had a real revelation. We were all in robes, and they made some Turkish coffee for us. The professor explained how the coffee was made very different from anywhere else, and I realized, “So fucking what?” Which kids even in Turkey give a shit about Turkish coffee? All day I had looked at young people in Istanbul. They were all drinking what every other kid in the world drinks, and they were wearing clothes that look like they were bought at the Gap, and they are all using cell phones. They were like kids everywhere else. It hit me that, for young people, this whole world is the same now. When we’re making products, there is no such thing as a Turkish phone, or a music player that young people in Turkey would want that’s different from one young people elsewhere would want. We’re just one world now.

Source:
Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.
(Note: ellipsis and bracketed “Jobs” added; indented Jobs block quote was indented in the original.)

Fragile Governments Cling to Failed Foreign Aid


Source of book image: http://si.wsj.net/public/resources/images/OB-VL312_bkrvta_DV_20121122124330.jpg

(p. C12) Nassim Nicholas Taleb’s “Antifragile” argues that some people, organizations and systems are resilient in the face of stress because they are able to alter themselves by adapting and learning. The converse is fragility, embodied in entities that are immovable even when faced with shocks or adversity. To my mind, an obvious example is how numerous governments and international agencies have clung to foreign aid as a tool to combat poverty even though aid has failed to deliver sustainable growth and meaningfully reduce indigence. And nation-states, which rest on one unifying vision of the nation, tend to be fragile, while city-states that adjust, adapt and constantly evolve tend to be antifragile. Mr. Taleb’s lesson: Embrace, rather than try to avoid, the shocks.

For the full review essay, see:
Dambisa Moyo (author of passage quoted above, one of 50 contributors to whole article). “Twelve Months of Reading; We asked 50 of our friends to tell us what books they enjoyed in 2012–from Judd Apatow’s big plans to Bruce Wagner’s addictions. See pages C10 and C11 for the Journal’s own Top Ten lists.” The Wall Street Journal (Sat., December 15, 2012): passim (Moyo’s contribution is on p. C12).
(Note: the online version of the review essay has the date December 14, 2012.)

The book under review is:
Taleb, Nassim Nicholas. Antifragile: Things That Gain from Disorder. New York: Random House, 2012.

“Modern Cognitive Capacity Emerged at the Same Time as Modern Anatomy”


“ARTIFACTS; The excavations have uncovered caches of advanced stone hunting tools, including spear tips and other small blades, or microliths, which suggest that modern Homo sapiens in Africa had a grasp of complex technologies. The research team’s report challenges a Eurocentric theory of modern human development.” [This photo shows spear tips; another photo included with the article showed three small blades (aka microliths).] Source of quoted part of caption and of photo: online version of the NYT article quoted and cited below.

(p. D3) At a rock shelter on a coastal cliff in South Africa, scientists have found an abundance of advanced stone hunting tools with a tale to tell of the evolving mind of early modern humans at least 71,000 years ago.
. . .
“Ninety percent of scientists are comfortable that fully modern humans and human cognition developed in Africa,” Dr. Marean said. “Now they have moved on. The questions are, how much earlier than 71,000 years did these behaviors emerge? Was it an accretionary process, or was it an abrupt event? Did these people have language by this time?”
Like many other archaeologists, Dr. Marean and his team have concentrated their investigations in the caves and rock shelters overlooking the Indian Ocean. In a global ice age beginning 72,000 years ago, many Africans fled the continent’s arid interior, heading for the more benign southern shore. Access to seafood and more plentiful plant and animal resources may have increased populations and encouraged technological advances, Dr. Marean said.
The well-preserved artifacts at Pinnacle Point, collected over a recent 18-month period, led the researchers to conclude that the advanced technologies in Africa “were early and enduring.” Other archaeologists who reached different conclusions may have been misled by the “small sample of excavated sites,” they said.
Richard G. Klein, a paleoanthropologist at Stanford University who has favored a more sudden and recent origin of modern behavior, about 50,000 years ago, questioned the reliability of the dating method for the tools, noting that “there is another team that has already argued for a much longer” time period for the toolmaking culture.
. . .
The hypothesis of earlier African origins of modern human behavior and cognition has been gaining strength over the last decade or two. Two archaeologists, Alison S. Brooks of George Washington University and Sally McBrearty of the University of Connecticut, led the charge with publications of their analysis of increasing evidence of African art and ornamentations expressing a modern cognitive capacity and symbolic thinking.
In a commentary accompanying the Nature report, Dr. McBrearty, who was not involved in the research, wrote that she believed that “modern cognitive capacity emerged at the same time as modern anatomy, and that various aspects of human culture arose gradually” over the course of subsequent millenniums.

For the full story, see:
JOHN NOBLE WILFORD. “Stone Tools Point to Creative Work by Early Humans in Africa.” The New York Times (Tues., November 13, 2012): D3.
(Note: ellipses added.)
(Note: the online version of the story has the date November 12, 2012.)

The research discussed in the passages quoted above appeared in Nature:
Brown, Kyle S., Curtis W. Marean, Zenobia Jacobs, Benjamin J. Schoville, Simen Oestmo, Erich C. Fisher, Jocelyn Bernatchez, Panagiotis Karkanas, and Thalassa Matthews. “An Early and Enduring Advanced Technology Originating 71,000 Years Ago in South Africa.” Nature 491, no. 7425 (22 November 2012): 590-93.

Kahneman Says “Intuitive Thinking” Is “the Origin of Most of What We Do Right–Which Is Most of What We Do”

(p. 415) The investment of attention improves performance in numerous activities–think of the risks of driving through a narrow space while your mind is wandering–and is essential to some tasks, including comparison, choice, and ordered reasoning. However, System 2 is not a paragon of rationality. Its abilities are limited and so is the knowledge to which it has access. We do not always think straight when we reason, and the errors are not always due to intrusive and incorrect intuitions. Often we make mistakes because we (our System 2) do not know any better.
I have spent more time describing System 1, and have devoted many (p. 416) pages to errors of intuitive judgment and choice that I attribute to it. However, the relative number of pages is a poor indicator of the balance between the marvels and the flaws of intuitive thinking. System 1 is indeed the origin of much that we do wrong, but it is also the origin of most of what we do right–which is most of what we do. Our thoughts and actions are routinely guided by System 1 and generally are on the mark. One of the marvels is the rich and detailed model of our world that is maintained in associative memory: it distinguishes surprising from normal events in a fraction of a second, immediately generates an idea of what was expected instead of a surprise, and automatically searches for some causal interpretation of surprises and of events as they take place.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

The Precautionary Principle Would Have Blocked Many Great Innovations

(p. 351) The intense aversion to trading increased risk for some other advantage plays out on a grand scale in the laws and regulations governing risk. This trend is especially strong in Europe where the precautionary principle, which prohibits any action that might cause harm, is a widely accepted doctrine. In the regulatory context, the precautionary principle imposes the entire burden of proving safety on anyone who undertakes actions that might harm people or the environment. Multiple international bodies have specified that the absence of scientific evidence of potential damage is not sufficient justification for taking risks. As the jurist Cass Sunstein points out, the precautionary principle is costly, and when interpreted strictly it can be paralyzing. He mentions an impressive list of innovations that would not have passed the test, including “airplanes, air conditioning, antibiotics, automobiles, chlorine, the measles vaccine, open-heart surgery, radio, refrigeration, smallpox vaccine, and X-rays.” The strong version of the precautionary principle is obviously untenable. But enhanced loss aversion is embedded in a strong and widely shared moral intuition; it originates in System 1. The dilemma between intensely loss-averse moral attitudes and efficient risk management does not have a simple and compelling solution.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: italics in original.)

Sunk-Cost Fallacy “Can Be Overcome”

(p. 346) The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects. I have often observed young scientists struggling to salvage a doomed project when they would be better advised to drop it and start a new one. Fortunately, research suggests that at least in some contexts the fallacy can be overcome. The sunk-cost fallacy is identified and taught as a mistake in both economics and business courses, apparently to good effect: there is evidence that graduate students in these fields are more willing than others to walk away from a failing project.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

Kahneman Preaches that People Can and Should Act More Rationally

(p. 338) . . . I have a sermon ready for Sam if he rejects the offer of a single highly favorable gamble played once, and for you if you share his unreasonable aversion to losses:

I sympathize with your aversion to losing any gamble, but it is costing you a lot of money. Please consider this question: Are you on your deathbed? Is this the last offer of a small favorable gamble that you will ever consider? Of course, you are unlikely to be offered exactly this gamble again, but you will have many opportunities to consider attractive gambles with stakes that are very small relative to your wealth. You will do yourself a large financial favor if you are able to see each of these gambles as part of a bundle of small gambles and rehearse the mantra that will get you significantly closer to economic rationality: you win a few, you lose a few. The main purpose of the mantra is to control your emotional response when you do lose. If you can trust it to be effective, you should remind yourself of it when deciding whether or not to accept a small risk with positive expected value. Remember these qualifications when using the mantra:

  • It works when the gambles are genuinely independent of each other; it does not apply to multiple investments in the same industry, which would all go bad together.

(p. 339)

  • It works only when the possible loss does not cause you to worry about your total wealth. If you would take the loss as significant bad news about your economic future, watch it!
  • It should not be applied to long shots, where the probability of winning is very small for each bet.

If you have the emotional discipline that this rule requires, you will never consider a small gamble in isolation or be loss averse for a small gamble until you are actually on your deathbed and not even then.

This advice is not impossible to follow. Experienced traders in financial markets live by it every day, shielding themselves from the pain of losses by broad framing. As was mentioned earlier, we now know that experimental subjects could be almost cured of their loss aversion (in a particular context) by inducing them to “think like a trader,” just as experienced baseball card traders are not as susceptible to the endowment effect as novices are. Students made risky decisions (to accept or reject gambles in which they could lose) under different instructions. In the narrow-framing condition, they were told to “make each decision as if it were the only one” and to accept their emotions. The instructions for broad framing of a decision included the phrases “imagine yourself as a trader,” “you do this all the time,” and “treat it as one of many monetary decisions, which will sum together to produce a ‘portfolio’.” The experimenters assessed the subjects’ emotional response to gains and losses by physiological measures, including changes in the electrical conductance of the skin that are used in lie detection. As expected, broad framing blunted the emotional reaction to losses and increased the willingness to take risks.
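The arithmetic behind Kahneman's "bundle of small gambles" advice is easy to check with a short simulation. The code below is not from the book; the 50/50 win-$200/lose-$100 gamble is a stylized stand-in for a "small favorable gamble with positive expected value," and the function name is my own:

```python
import random

def p_net_loss(n_gambles, trials=100_000, seed=0):
    """Estimate the probability that a bundle of n independent
    50/50 gambles (win $200 / lose $100) ends with a net loss."""
    rng = random.Random(seed)
    losses = 0
    for _ in range(trials):
        # Settle all n gambles and total the winnings.
        total = sum(200 if rng.random() < 0.5 else -100
                    for _ in range(n_gambles))
        if total < 0:
            losses += 1
    return losses / trials

# A single gamble ends in a loss about half the time; a bundle of
# twenty independent gambles almost never does.
print(p_net_loss(1))    # ≈ 0.5
print(p_net_loss(20))   # ≈ 0.06
```

Each gamble has an expected value of +$50, so broad framing does not change the expected outcome; what it changes is how often the bundle, viewed as a whole, registers as a loss at all, which is exactly the emotional response the mantra is meant to control. The qualifications in the bullet list above still apply: the simulation assumes the gambles are genuinely independent.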

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
(Note: ellipsis added; italics in original.)

Reference Point Ignored Due to “Theory-Induced Blindness”

(p. 290) The omission of the reference point from the indifference map is a surprising case of theory-induced blindness, because we so often encounter cases in which the reference point obviously matters. In labor negotiations, it is well understood by both sides that the reference point is the existing contract and that the negotiations will focus on mutual demands for concessions relative to that reference point. The role of loss aversion in bargaining is also well understood: making concessions hurts. You have much (p. 291) personal experience of the role of reference point. If you changed jobs or locations, or even considered such a change, you surely remember that the features of the new place were coded as pluses or minuses relative to where you were. You may also have noticed that disadvantages loomed larger than advantages in this evaluation–loss aversion was at work. It is difficult to accept changes for the worse. For example, the minimal wage that unemployed workers would accept for new employment averages 90% of their previous wage, and it drops by less than 10% over a period of one year.

Source:
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.