Your Passion Is Not “Found,” It Is Developed with “Time, Effort and Investment”

(p. B7) People “often assume that their own interest or passion just needs to be ‘found’ or revealed. Once revealed, it will be in a fully formed state,” said Paul A. O’Keefe, an assistant professor of psychology at Yale-NUS College in Singapore. Nonsense, of course, he said.

“By that logic, pursuing one’s passion should come with boundless motivation and should be relatively easy,” he said.

Dr. O’Keefe was part of a team that published a study in 2018 that examined how two different “implicit theories of interest” impacted how people approach new potential passions. One, the fixed theory, says that our interests are relatively fixed and unchanging, while the other, the growth theory, suggests our interests are developed over time and not necessarily innate to our personality.

In other words: Do we truly find our passions, or develop them over time? (You can probably guess where this is going.)

The researchers found that people who hold a fixed theory had less interest in things outside of their current interests, were less likely to anticipate difficulties when pursuing new interests, and lost interest in new things much more quickly than people who hold a growth theory. In essence, people with a growth mind-set of interest tend to believe that interests and passions are capable of developing with enough time, effort and investment.

“This comes down to the expectations people have when pursuing a passion,” Dr. O’Keefe said. “Someone with a fixed mind-set of interest might begin their pursuit with lots of enthusiasm, but it might diminish once things get too challenging or tedious.”

Passion alone won’t carry you through in the face of difficulty, he said, when overcoming those challenges actually counts.

For the full story, see:

Stephanie Lee. “‘Finding Your Passion’ Takes Some Work.” The New York Times (Monday, May 6, 2019): B7.

(Note: the online version of the story has the date April 21 [sic], 2019, and has the title “Why ‘Find Your Passion’ Is Such Terrible Advice.”)

The academic article discussed above is:

O’Keefe, Paul A., Carol S. Dweck, and Gregory M. Walton. “Implicit Theories of Interest: Finding Your Passion or Developing It?” Psychological Science 29, no. 10 (Oct. 2018): 1653-64.

“Confidence Stops You from Learning”

(p. A15) Mr. Karlgaard, a former publisher of Forbes magazine, has plenty of vivid anecdotes to make his case for late bloomers.

. . .

Bill Walsh, the great coach of the San Francisco 49ers, got his first NFL head coaching job when he was 46 and won his first Super Bowl at 50. He was famously twitchy, self-deprecating and eager to learn, and had this to say about confidence: “In my whole career I’ve been passing men with greater bravado and confidence. Confidence gets you off to a fast start. Confidence gets you that first job and maybe the next two promotions. But confidence stops you from learning. Confidence becomes a caricature after a while. I can’t tell you how many confident blowhards I’ve seen in my coaching career who never got better after the age of forty.”

Late bloomers, Mr. Karlgaard argues, are not just people of great talent who develop later in their lives. They also possess qualities that can only be acquired through time and experience. They tend to be more curious, compassionate, resilient and wise than younger people of equal talent. This may be true, Mr. Karlgaard notes, of older people generally, who are being flushed out of the workforce much too early.

For the full review, see:

Philip Delves Broughton. “BOOKSHELF; ‘Late Bloomers’ Review: Please Don’t Rush Me.” The Wall Street Journal (Tuesday, April 30, 2019): A15.

(Note: ellipsis added.)

(Note: the online version of the review has the date April 29, 2019, and has the title “BOOKSHELF; ‘Late Bloomers’ Review: Please Don’t Rush Me.”)

The book under review is:

Karlgaard, Rich. Late Bloomers: The Power of Patience in a World Obsessed with Early Achievement. New York: Currency, 2019.

Machine Learning Finds Female Brains Age Slower Than Male Brains

(p. C4)  . . .  there is fresh evidence that women not only have a longevity advantage; their brains seem to be more youthful throughout adulthood, too.

The new study, published last month [February 2019] in the Proceedings of the National Academy of Sciences, was led by radiologist Manu Goyal and neurologist Marcus Raichle, both at the Washington University School of Medicine.

. . .

The researchers used machine learning to detect distinctive patterns in the brains they studied. “When we trained it on males and tested it on females, then it guessed the female’s brain age to be three to four years younger than the women’s chronological age,” said Dr. Goyal. Conversely, when the machine was trained to see female metabolic patterns as the standard, it guessed men’s brains to be two to three years older than they actually were. That difference in metabolic brain age added up to approximately a three-year advantage for women.
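The cross-training idea described above can be sketched with synthetic data. This is only an illustration of the logic, not the study’s actual PET-based model: the single “metabolic marker” feature, the three-year offset, and the linear model are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, sex_offset):
    """Chronological ages plus a metabolic marker that declines with age.
    sex_offset shifts the metabolic profile to look 'younger'."""
    age = rng.uniform(20, 80, n)
    marker = 100 - (age - sex_offset) + rng.normal(0, 2, n)
    return age, marker

male_age, male_marker = simulate(500, 0.0)      # reference group
female_age, female_marker = simulate(500, 3.0)  # profile ~3 years "younger"

# Train a linear brain-age model on male data: age ~ a*marker + b
X = np.column_stack([male_marker, np.ones_like(male_marker)])
coef, *_ = np.linalg.lstsq(X, male_age, rcond=None)

# Apply the male-trained model to female brains
Xf = np.column_stack([female_marker, np.ones_like(female_marker)])
predicted = Xf @ coef
gap = np.mean(predicted - female_age)
print(f"mean predicted minus chronological age (females): {gap:+.1f} years")
```

Because the female profile mimics that of someone about three years younger, the male-trained model systematically underestimates female chronological age, reproducing the direction of the reported gap.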

These brain age differences persisted across the adult lifespan and were visible even when people’s brains showed the harbingers of Alzheimer’s disease. “These new findings provide yet more evidence, as if more were needed, of just how ubiquitous sex influences on brain function have proven to be, often showing up in places we least expect them,” said Larry Cahill, a neuroscientist who studies sex differences in the brain at the University of California, Irvine.

For the full commentary, see:

(Note:  the online version of the commentary has the date March 27, 2019.)

The published research summarized above is:

Goyal, Manu S., Tyler M. Blazey, Yi Su, Lars E. Couture, Tony J. Durbin, Randall J. Bateman, Tammie L. S. Benzinger, John C. Morris, Marcus E. Raichle, and Andrei G. Vlassenko. “Persistent Metabolic Youth in the Aging Female Brain.” Proceedings of the National Academy of Sciences 116, no. 8 (Feb. 19, 2019): 3251-55.

Dogs Feel Guilt When They Hurt Their Humans

(p. 6) Dr. Horowitz concluded that whether dogs take on a guilty look — lowered gaze, ears pressed back, tail rapidly beating between the legs — is unrelated to whether or not they followed orders. If the owner scolds them, they look extremely guilty. If the owner doesn’t, they still sometimes look like this, but less often.

One problem, however, is that our rules are of our own making, such as “Don’t jump on that couch!” or “Keep your nails off my leather chair!” It must be as tough for our pets to grasp these prohibitions as it was for me to understand why I couldn’t chew gum in Singapore.

It would be better to test behavior that is wrong by almost any standard, including that of their own species. The Austrian ethologist Konrad Lorenz gave one of my favorite examples, about his dog, Bully, who broke the fundamental rule never to bite your superior.

Humans don’t need to teach this rule, and indeed Bully had never been punished for it. The dog bit his master’s hand when Dr. Lorenz (p. 7) tried to break up a dogfight. Even though Dr. Lorenz petted him right away, Bully suffered a complete nervous breakdown. For days, he was virtually paralyzed and ignored his food. He would lie on the rug breathing shallowly, occasionally interrupted by a deep sigh. He had violated a natural taboo, which among ancestral canines could have had the worst imaginable consequences, such as expulsion from the pack.

For the full commentary, see:

(Note:  the online version of the commentary has the date March 8, 2019.)

Turing Award Winners’ Neural Networks “Are Still a Very Long Way from True Intelligence”

(p. B3) On Wednesday [March 27, 2019], the Association for Computing Machinery, the world’s largest society of computing professionals, announced that Drs. Hinton, LeCun and Bengio had won this year’s Turing Award for their work on neural networks. The Turing Award, which was introduced in 1966, is often called the Nobel Prize of computing, and it includes a $1 million prize, which the three scientists will share.

. . .

Though these systems have undeniably accelerated the progress of artificial intelligence, they are still a very long way from true intelligence. But Drs. Hinton, LeCun and Bengio believe that new ideas will come.

“We need fundamental additions to this toolbox we have created to reach machines that operate at the level of true human understanding,” Dr. Bengio said.

For the full story, see:

(Note:  ellipsis, and bracketed date, added.)

(Note:  the online version of the story has the date , 2019, and has the title “Turing Award Won by 3 Pioneers in Artificial Intelligence.”)

Neanderthal’s “Body Was Archaic” but “Spirit Was Modern”

(p. B14) Starting in the mid-1950s, leading teams from Columbia University, Dr. Solecki discovered the fossilized skeletons of eight adult and two infant Neanderthals who had lived tens of thousands of years ago in what is now northern Iraq.

Dr. Solecki, who was also a Smithsonian Institution anthropologist at the time, said physical evidence at Shanidar Cave, where the skeletons were found, suggested that Neanderthals had tended to the weak and the wounded, and that they had also buried their dead with flowers, which were placed ornamentally and possibly selected for their therapeutic benefits.

The exhumed bones of a man, named Shanidar 3, who had been blind in one eye and missing his right arm but who had survived for years after he was hurt, indicated that fellow Neanderthals had helped provide him with sustenance and other support.

“Although the body was archaic, the spirit was modern,” Dr. Solecki wrote in the magazine Science in 1975.

Large amounts of pollen found in the soil at a grave site suggested that bodies might have been ceremonially entombed with bluebonnet, hollyhock, grape hyacinth and other flowers — a theory that is still being explored and amplified. (Some researchers hypothesized that the pollen might have been carried by rodents or bees, but Dr. Solecki’s theory has become widely accepted.)

“The association of flowers with Neanderthals adds a whole new dimension to our knowledge of his humanness, indicating he had a ‘soul,’” Dr. Solecki wrote.

For the full obituary, see:

Sam Roberts.  “Ralph Solecki, 101, Archaeologist Who Uncovered the Inner Life of Neanderthals.”  The New York Times  (Wednesday, April 17, 2019):  B14.

(Note:  the online version of the obituary has the date April 11, 2019, and has the title “Ralph Solecki, Who Found Humanity in Neanderthals, Dies at 101.”)

“Ridiculous” to Project “Our Psychology into the Machines”

(p. A8)  . . .  the soft-spoken, 55-year-old Canadian computer scientist, a recipient of this year’s A.M. Turing Award — considered the Nobel Prize for computing — prefers to see the world through the idealism of “Star Trek” rather than the apocalyptic vision of “The Terminator.”

“In ‘Star Trek,’ there is a world in which humans are governed through democracy, everyone gets good health care, education and food, and there are no wars except against some aliens,” said Dr. Bengio, whose research has helped pave the way for speech- and facial-recognition technology, computer vision and self-driving cars, among other things. “I am also trying to marry science with how it can improve society.”

. . .

Cherri M. Pancake, the president of the Association for Computing Machinery, which offers the $1 million award, credited Dr. Bengio and two other luminaries who shared the prize, Geoffrey Hinton and Yann LeCun, with laying the foundation for technologies used by billions of people. “Anyone who has a smartphone in their pocket” has felt their impact, she said, noting that their work also provided “powerful new tools” in the fields of medicine, astronomy and material sciences.

Despite all the accolades, Dr. Bengio recoils at scientists being turned into celebrities. While Dr. Hinton works for Google and Dr. LeCun is the chief A.I. scientist at Facebook, Dr. Bengio has studiously avoided Silicon Valley in favor of a more scholarly life in Montreal, where he also co-founded Element A.I., a software company.

“I’m not a fan of a personalization of science and making some scientists stars,” said Dr. Bengio, a self-described introvert, who colleagues say is happiest when hunched over an algorithm. “I was maybe lucky to be at the right time and thinking the right things.”

Myriam Côté, a computer scientist who has worked with Dr. Bengio for more than a decade, described him as an iconoclast and freethinker who would feel stymied by the strictures of Silicon Valley. A communitarian at heart, she said, he shuns hierarchy and is known for sharing the profits from his own projects with younger, less established colleagues.

“He wants to create in freedom,” she said. Citing the credo of student rebels in 1968 in Paris, where Dr. Bengio was born, she said his philosophy was: “It is forbidden to forbid.”

That, in turn, has informed his approach to A.I.

Even as Stephen Hawking, the celebrated Cambridge physicist, warned that A.I. could be “the worst event in the history of our civilization,” and the billionaire entrepreneur Elon Musk has cautioned it could create an “immortal dictator,” Dr. Bengio has remained more upbeat.

. . .

. . .  he dismissed the “Terminator scenario” in which a machine, endowed with human emotions, turns on its creator. Machines, he stressed, do not have egos and human sentiments, and are not slaves who want to be freed. “We imagine our creations turning against us because we are projecting our psychology into the machines,” he said, calling it “ridiculous.”

For the full story, see:

Dan Bilefsky.  “THE SATURDAY PROFILE; Teaching a Generation of Machines, Far From the Spotlights of Silicon Valley.”  The New York Times (Saturday, March 30, 2019):  A8.

(Note:  ellipses added.)

(Note:  the online version of the story has the date March 29, 2019, and has the title “THE SATURDAY PROFILE;  He Helped Create A.I. Now, He Worries About ‘Killer Robots’.”)

Good Luck Comes to Optimists Who Do Not Give Up

(p. C3) Luck occurs at the intersection of random chance, talent and hard work. There may not be much you can do about the first part of that equation, but there’s a lot you can do about the other two. People who have a talent for making luck for themselves grab the unexpected opportunities that come along.
The good news is that there’s plenty of luck to go around if you know how to look for it.
. . .
Think yourself lucky. Psychologist Martin Seligman of the University of Pennsylvania told us that if he were looking for a lucky person, “the number one ingredient that I’d select for would be optimism.” Early in his career, Dr. Seligman did groundbreaking experiments on learned helplessness, showing that animals put in stressful situations beyond their control eventually stop trying to escape. People also have a tendency to give up and complain when they think they’re victims of bad luck.
“Believing that you have some control over what happens fuels trying,” Dr. Seligman said. “If there’s a potentially good event for me, am I going to seize the opportunity and follow up, or am I going to be passive?”

For the full essay, see:
Janice Kaplan and Barnaby Marsh. “Make Your Own Luck.” The Wall Street Journal (Saturday, March 3, 2018): C3.
(Note: ellipsis added; bold in original.)
(Note: the online version of the essay has the date March 1, 2018, and has the title “To Be Successful, Make Your Own Luck.”)

The essay is based on the authors’ book:
Kaplan, Janice, and Barnaby Marsh. How Luck Happens: Using the Science of Luck to Transform Work, Love, and Life. New York: Dutton, 2018.

Neuroscience Maverick Funds His Own Research

(p. B1) Mr. Hawkins has been following his own, all-encompassing idea for how the brain works. It is a step beyond the projects of most neuroscientists, like understanding the brain of a fruit fly or exploring the particulars of human sight.
His theory starts with cortical columns. Cortical columns are a crucial part of the neocortex, the part of the brain that handles sight, hearing, language and reason. Neuro-(p. B4)scientists don’t agree on how the neocortex works.
Mr. Hawkins says cortical columns handle every task in the same way, a sort of computer algorithm that is repeated over and over again. It is a logical approach to the brain for a man who spent decades building new kinds of computing devices.
All he has to do is figure out the algorithm.
A number of neuroscientists like the idea, and some are pursuing similar ideas. They also praise Mr. Hawkins for his willingness to think so broadly. Being a maverick is not easily done in academia and the world of traditional research. But it’s a little easier when you can fund your own work, as Mr. Hawkins has.
. . .
In 1979, with an article in Scientific American, Francis Crick, a Nobel Prize winner for his DNA research, called for an all-encompassing theory of the brain, something that could explain this “profoundly mysterious” organ.
Mr. Hawkins graduated from Cornell in 1979 with a degree in electrical engineering. Over the next several years, he worked at Intel, the computer chip giant, and Grid Systems, an early laptop company. But after reading that magazine article, he decided the brain would be his life’s work.
He proposed a neuroscience lab inside Intel. After the idea was rejected, he enrolled at the University of California, Berkeley. His doctoral thesis proposal was rejected, too. He was, suffice to say, an outlier.
. . .
U.S. Robotics acquired Palm in 1996 for $44 million. About two years later, Mr. Hawkins and Ms. Dubinsky left to start Handspring. Palm, which became an independent company again in 2000, acquired Handspring for $192 million in stock in 2003.
Around the time of the second sale, Mr. Hawkins built his own neuroscience lab. But it was short-lived. He could not get a lab full of academics focused on his neocortical theory. So, along with Ms. Dubinsky and an A.I. researcher named Dileep George, he founded Numenta.
The company spent years trying to build and sell software, but eventually, after Mr. George left, it settled into a single project. Funded mostly by Mr. Hawkins — he won’t say how much he has spent on it — the company’s sole purpose has been explaining the neocortex and then reverse engineering it.

For the full story, see:
Cade Metz. “A New View of How We Think.” The New York Times (Monday, Oct. 15, 2018): B1 & B4.
(Note: ellipses added.)
(Note: the online version of the story has the date Oct. 14, 2018, and has the title “Jeff Hawkins Is Finally Ready to Explain His Brain Research.”)

Bureaucratic FDA Delays Approvals for Fear “We’ll Be Toast”

(p. A21) Oct. 30 [2018] marks the 36th anniversary of the FDA’s approval of human insulin synthesized in genetically engineered bacteria, the first product made with “gene splicing” techniques. As the head of the FDA’s evaluation team, I had a front-row seat.
. . .
My team and I were ready to recommend approval after four months’ review. But when I took the packet to my supervisor, he said, “Four months? No way! If anything goes wrong with this product down the road, people will say we rushed it, and we’ll be toast.” That’s the bureaucratic mind-set. I don’t know how long he would have delayed it, but when he went on vacation a month later, I took the packet to his boss, the division director, who signed off.
That anecdote is an example of Milton Friedman’s observation that to understand the motivation of an individual or organization, you need to “follow the self-interest.” A large part of regulators’ self-interest lies in staying out of trouble. One way to do that, my supervisor understood, is not to approve in record time products that might experience unanticipated problems.

For the full commentary, see:
Henry I. Miller. “Follow the FDA’s Self-Interest; While approving a new form of insulin, I saw how regulators protect themselves.” The Wall Street Journal (Monday, Oct. 29, 2018): A21.
(Note: ellipsis, and bracketed year, added.)
(Note: the online version of the commentary has the date Oct. 28, 2018.)

Children Younger Than Peers in Class Are More Likely to Be Misdiagnosed

(p. A14) Diagnosing attention-deficit hyperactivity disorder is inherently subjective. New research highlights how this can get especially tricky with young children.
It shows that ADHD rates are significantly higher among children who are the youngest in their class compared with those who are the oldest.
ADHD is characterized by difficulty concentrating and constantly active, sometimes disruptive behavior. The study, published in the New England Journal of Medicine last week, found that the youngest children in early elementary school grades have a 32% higher risk of being diagnosed with ADHD than the oldest children.
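The headline comparison is simple relative-risk arithmetic. The rates below are hypothetical, not the study’s actual counts; they are chosen only to reproduce the quoted 32% gap:

```python
# Hypothetical diagnosis rates per 10,000 children, by relative age in grade.
# (Illustrative numbers only -- not the figures from the NEJM study.)
youngest_rate = 66 / 10_000   # youngest children in the grade
oldest_rate = 50 / 10_000     # oldest children in the grade

# Relative excess risk: how much higher the youngest children's rate is.
excess = youngest_rate / oldest_rate - 1
print(f"relative excess risk for the youngest: {excess:.0%}")
```

The point of the calculation is that a 32% *relative* difference can arise from a modest *absolute* difference in diagnosis rates, which is why the birth-month pattern is easy to miss in everyday practice.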
. . .
“We’re asking children to concentrate and focus when they don’t really have the ability to concentrate and focus yet,” says R. Scott Benson, a child psychiatrist in Pensacola, Fla. “We really want to be more careful, as we get more academic in these younger and younger grades, that we don’t mistake a slight developmental delay as ADHD.”
. . .
Tim Layton, an assistant professor of health-care policy at Harvard Medical School and first author on the study, says the research highlights the importance of pausing before calling a doctor about a child’s unusual behavior. “In fact, it may be the case that that behavior is completely normal, even though it may be disruptive or make the teaching environment difficult,” he says.

For the full commentary, see:
Sumathi Reddy. “YOUR HEALTH; The ADHD Diagnosis Problem.” The Wall Street Journal (Tuesday, Dec. 4, 2018): A14.
(Note: ellipses added.)
(Note: the online version of the commentary has the date Dec. 3, 2018, and has the title “YOUR HEALTH; A Reason to Think Twice About Your Child’s ADHD Diagnosis.”)

The ADHD study mentioned above is:
Layton, Timothy J., Michael L. Barnett, Tanner R. Hicks, and Anupam B. Jena. “Attention Deficit-Hyperactivity Disorder and Month of School Enrollment.” New England Journal of Medicine 379, no. 22 (Nov. 29, 2018): 2122-30.