Neanderthal’s “Body Was Archaic” but “Spirit Was Modern”

(p. B14) Starting in the mid-1950s, leading teams from Columbia University, Dr. Solecki discovered the fossilized skeletons of eight adult and two infant Neanderthals who had lived tens of thousands of years ago in what is now northern Iraq.

Dr. Solecki, who was also a Smithsonian Institution anthropologist at the time, said physical evidence at Shanidar Cave, where the skeletons were found, suggested that Neanderthals had tended to the weak and the wounded, and that they had also buried their dead with flowers, which were placed ornamentally and possibly selected for their therapeutic benefits.

The exhumed bones of a man, named Shanidar 3, who had been blind in one eye and missing his right arm but who had survived for years after he was hurt, indicated that fellow Neanderthals had helped provide him with sustenance and other support.

“Although the body was archaic, the spirit was modern,” Dr. Solecki wrote in the magazine Science in 1975.

Large amounts of pollen found in the soil at a grave site suggested that bodies might have been ceremonially entombed with bluebonnet, hollyhock, grape hyacinth and other flowers — a theory that is still being explored and amplified. (Some researchers hypothesized that the pollen might have been carried by rodents or bees, but Dr. Solecki’s theory has become widely accepted.)

“The association of flowers with Neanderthals adds a whole new dimension to our knowledge of his humanness, indicating he had a ‘soul,’” Dr. Solecki wrote.

For the full obituary, see:

Sam Roberts.  “Ralph Solecki, 101, Archaeologist Who Uncovered the Inner Life of Neanderthals.”  The New York Times  (Wednesday, April 17, 2019):  B14.

(Note:  the online version of the obituary has the date April 11, 2019, and has the title “Ralph Solecki, Who Found Humanity in Neanderthals, Dies at 101.”)

“Ridiculous” to Project “Our Psychology into the Machines”

(p. A8)  . . .  the soft-spoken, 55-year-old Canadian computer scientist, a recipient of this year’s A.M. Turing Award — considered the Nobel Prize for computing — prefers to see the world through the idealism of “Star Trek” rather than the apocalyptic vision of “The Terminator.”

“In ‘Star Trek,’ there is a world in which humans are governed through democracy, everyone gets good health care, education and food, and there are no wars except against some aliens,” said Dr. Bengio, whose research has helped pave the way for speech- and facial-recognition technology, computer vision and self-driving cars, among other things. “I am also trying to marry science with how it can improve society.”

. . .

Cherri M. Pancake, the president of the Association for Computing Machinery, which offers the $1 million award, credited Dr. Bengio and two other luminaries who shared the prize, Geoffrey Hinton and Yann LeCun, with laying the foundation for technologies used by billions of people. “Anyone who has a smartphone in their pocket” has felt their impact, she said, noting that their work also provided “powerful new tools” in the fields of medicine, astronomy and material sciences.

Despite all the accolades, Dr. Bengio recoils at scientists being turned into celebrities. While Dr. Hinton works for Google and Dr. LeCun is the chief A.I. scientist at Facebook, Dr. Bengio has studiously avoided Silicon Valley in favor of a more scholarly life in Montreal, where he also co-founded Element A.I., a software company.

“I’m not a fan of a personalization of science and making some scientists stars,” said Dr. Bengio, a self-described introvert, who colleagues say is happiest when hunched over an algorithm. “I was maybe lucky to be at the right time and thinking the right things.”

Myriam Côté, a computer scientist who has worked with Dr. Bengio for more than a decade, described him as an iconoclast and freethinker who would feel stymied by the strictures of Silicon Valley. A communitarian at heart, she said, he shuns hierarchy and is known for sharing the profits from his own projects with younger, less established colleagues.

“He wants to create in freedom,” she said. Citing the credo of student rebels in 1968 in Paris, where Dr. Bengio was born, she said his philosophy was: “It is forbidden to forbid.”

That, in turn, has informed his approach to A.I.

Even as Stephen Hawking, the celebrated Cambridge physicist, warned that A.I. could be “the worst event in the history of our civilization,” and the billionaire entrepreneur Elon Musk has cautioned it could create an “immortal dictator,” Dr. Bengio has remained more upbeat.

. . .

. . .  he dismissed the “Terminator scenario” in which a machine, endowed with human emotions, turns on its creator. Machines, he stressed, do not have egos and human sentiments, and are not slaves who want to be freed. “We imagine our creations turning against us because we are projecting our psychology into the machines,” he said, calling it “ridiculous.”

For the full story, see:

Dan Bilefsky.  “THE SATURDAY PROFILE; Teaching a Generation of Machines, Far From the Spotlights of Silicon Valley.”  The New York Times (Saturday, March 30, 2019):  A8.

(Note:  ellipses added.)

(Note:  the online version of the story has the date March 29, 2019, and has the title “THE SATURDAY PROFILE;  He Helped Create A.I. Now, He Worries About ‘Killer Robots’.”)

Good Luck Comes to Optimists Who Do Not Give Up

(p. C3) Luck occurs at the intersection of random chance, talent and hard work. There may not be much you can do about the first part of that equation, but there’s a lot you can do about the other two. People who have a talent for making luck for themselves grab the unexpected opportunities that come along.
The good news is that there’s plenty of luck to go around if you know how to look for it.
. . .
Think yourself lucky. Psychologist Martin Seligman of the University of Pennsylvania told us that if he were looking for a lucky person, “the number one ingredient that I’d select for would be optimism.” Early in his career, Dr. Seligman did groundbreaking experiments on learned helplessness, showing that animals put in stressful situations beyond their control eventually stop trying to escape. People also have a tendency to give up and complain when they think they’re victims of bad luck.
“Believing that you have some control over what happens fuels trying,” Dr. Seligman said. “If there’s a potentially good event for me, am I going to seize the opportunity and follow up, or am I going to be passive?”

For the full essay, see:
Janice Kaplan and Barnaby Marsh. “Make Your Own Luck.” The Wall Street Journal (Saturday, March 3, 2018): C3.
(Note: ellipsis added; bold in original.)
(Note: the online version of the essay has the date March 1, 2018, and has the title “To Be Successful, Make Your Own Luck.”)

The essay is based on the authors’ book:
Kaplan, Janice, and Barnaby Marsh. How Luck Happens: Using the Science of Luck to Transform Work, Love, and Life. New York: Dutton, 2018.

Neuroscience Maverick Funds His Own Research

(p. B1) Mr. Hawkins has been following his own, all-encompassing idea for how the brain works. It is a step beyond the projects of most neuroscientists, like understanding the brain of a fruit fly or exploring the particulars of human sight.
His theory starts with cortical columns. Cortical columns are a crucial part of the neocortex, the part of the brain that handles sight, hearing, language and reason. Neuro-(p. B4)scientists don’t agree on how the neocortex works.
Mr. Hawkins says cortical columns handle every task in the same way, a sort of computer algorithm that is repeated over and over again. It is a logical approach to the brain for a man who spent decades building new kinds of computing devices.
All he has to do is figure out the algorithm.
A number of neuroscientists like the idea, and some are pursuing similar ideas. They also praise Mr. Hawkins for his willingness to think so broadly. Being a maverick is not easily done in academia and the world of traditional research. But it’s a little easier when you can fund your own work, as Mr. Hawkins has.
. . .
In 1979, with an article in Scientific American, Francis Crick, a Nobel Prize winner for his DNA research, called for an all-encompassing theory of the brain, something that could explain this “profoundly mysterious” organ.
Mr. Hawkins graduated from Cornell in 1979 with a degree in electrical engineering. Over the next several years, he worked at Intel, the computer chip giant, and Grid Systems, an early laptop company. But after reading that magazine article, he decided the brain would be his life’s work.
He proposed a neuroscience lab inside Intel. After the idea was rejected, he enrolled at the University of California, Berkeley. His doctoral thesis proposal was rejected, too. He was, suffice to say, an outlier.
. . .
U.S. Robotics acquired Palm in 1996 for $44 million. About two years later, Mr. Hawkins and Ms. Dubinsky left to start Handspring. Palm, which became an independent company again in 2000, acquired Handspring for $192 million in stock in 2003.
Around the time of the second sale, Mr. Hawkins built his own neuroscience lab. But it was short-lived. He could not get a lab full of academics focused on his neocortical theory. So, along with Ms. Dubinsky and an A.I. researcher named Dileep George, he founded Numenta.
The company spent years trying to build and sell software, but eventually, after Mr. George left, it settled into a single project. Funded mostly by Mr. Hawkins — he won’t say how much he has spent on it — the company’s sole purpose has been explaining the neocortex and then reverse engineering it.

For the full story, see:
Cade Metz. “A New View of How We Think.” The New York Times (Monday, Oct. 15, 2018): B1 & B4.
(Note: ellipses added.)
(Note: the online version of the story has the date Oct. 14, 2018, and has the title “Jeff Hawkins Is Finally Ready to Explain His Brain Research.”)

Bureaucratic FDA Delays Approvals for Fear “We’ll Be Toast”

(p. A21) Oct. 30 [2018] marks the 36th anniversary of the FDA’s approval of human insulin synthesized in genetically engineered bacteria, the first product made with “gene splicing” techniques. As the head of the FDA’s evaluation team, I had a front-row seat.
. . .
My team and I were ready to recommend approval after four months’ review. But when I took the packet to my supervisor, he said, “Four months? No way! If anything goes wrong with this product down the road, people will say we rushed it, and we’ll be toast.” That’s the bureaucratic mind-set. I don’t know how long he would have delayed it, but when he went on vacation a month later, I took the packet to his boss, the division director, who signed off.
That anecdote is an example of Milton Friedman’s observation that to understand the motivation of an individual or organization, you need to “follow the self-interest.” A large part of regulators’ self-interest lies in staying out of trouble. One way to do that, my supervisor understood, is not to approve in record time products that might experience unanticipated problems.

For the full commentary, see:
Henry I. Miller. “Follow the FDA’s Self-Interest; While approving a new form of insulin, I saw how regulators protect themselves.” The Wall Street Journal (Monday, Oct. 29, 2018): A21.
(Note: ellipsis, and bracketed year, added.)
(Note: the online version of the commentary has the date Oct. 28, 2018.)

Children Younger Than Peers in Class Are More Likely to Be Mis-Diagnosed

(p. A14) Diagnosing attention-deficit hyperactivity disorder is inherently subjective. New research highlights how this can get especially tricky with young children.
It shows that ADHD rates are significantly higher among children who are the youngest in their class compared with those who are the oldest.
ADHD is characterized by difficulty concentrating and constantly active, sometimes disruptive behavior. The study, published in the New England Journal of Medicine last week, found that the youngest children in early elementary school grades have a 32% higher risk of being diagnosed with ADHD than the oldest children.
. . .
“We’re asking children to concentrate and focus when they don’t really have the ability to concentrate and focus yet,” says R. Scott Benson, a child psychiatrist in Pensacola, Fla. “We really want to be more careful, as we get more academic in these younger and younger grades, that we don’t mistake a slight developmental delay as ADHD.”
. . .
Tim Layton, an assistant professor of health-care policy at Harvard Medical School and first author on the study, says the research highlights the importance of pausing before calling a doctor about a child’s unusual behavior. “In fact, it may be the case that that behavior is completely normal, even though it may be disruptive or make the teaching environment difficult,” he says.

For the full commentary, see:
Sumathi Reddy. “YOUR HEALTH; The ADHD Diagnosis Problem.” The Wall Street Journal (Tuesday, Dec. 4, 2018): A14.
(Note: ellipses added.)
(Note: the online version of the commentary has the date Dec. 3, 2018, and has the title “YOUR HEALTH; A Reason to Think Twice About Your Child’s ADHD Diagnosis.”)

The ADHD study mentioned above, is:
Layton, Timothy J., Michael L. Barnett, Tanner R. Hicks, and Anupam B. Jena. “Attention Deficit-Hyperactivity Disorder and Month of School Enrollment.” New England Journal of Medicine 379, no. 22 (Nov. 29, 2018): 2122-30.

Learning Skills Should Not Be Demeaned as “Training”

(p. A13) One of the few lessons that stuck with me from all the courses I took on the way to earning my Ed.D. came during a classroom discussion that sparked my passion for changing the way we talk about education. I’ll never forget how the professor responded to a student who used the word “training.” Training, the professor admonished, was for animals. Humans receive an education.
We can’t keep speaking of people as if they are animals. Whether an individual acquires a skill credential, a bachelor’s degree, a postgraduate degree or anything in between, it’s all education. We need to think about the words we use and why we use them if we are to break the stigma around all forms of education. If we don’t, we will never overcome the abiding sense of inequality and unfairness that so many Americans feel.

For the full commentary, see:
Virginia Foxx. “Stop Calling It ‘Vocational Training’; How we speak about education reflects class prejudice.” The Wall Street Journal (Wednesday, January 2, 2019): A13.
(Note: the online version of the commentary has the date Dec. 31, 2018.)

Big Data Crushes “Intuition, Skill and Experience”

(p. 14) Drawing on an eclectic bunch of anecdotes and studies, Tenner makes his way through four sectors in which “intuition, skill and experience” have been effectively crushed by “big data, algorithms and efficiency”: media and culture, education, transportation and medicine.
A few of his examples:
Search algorithms have extended the ability to find scientific journal articles and books dating to the 19th century. In principle, this means scholars may encounter a broad range of research and discovery, dredge up forgotten work and possibly connect important dots. But in reality, as one sociologist found after studying citations in 35 million scientific journal articles from before and after the invention of the internet, researchers, beholden to search algorithms’ tendency to generate self-reinforcing feedback loops, are now paying more attention to fewer papers, and in general to the more recent and popular ones — actually strengthening rather than bucking prevailing trends.
GPS is great for getting from one point to another, but if you need more context for understanding your surroundings, it’s fairly useless. We’ve all had experiences in which the shortest distance, as calculated by the app, can also be the most dangerous or traffic-clogged. Compare the efficiency of GPS with the three years aspiring London cabdrivers typically spend preparing for the arduous examination they must pass in order to receive their license. They learn to build a mental map of the entire city, to navigate under any circumstance, to find shortcuts and avoid risky situations — all without any external, possibly fallible, help. Which is the more efficient, ultimately, the cabby or Google Maps?
In the early 2000s, electronic medical records and electronic prescribing appeared to solve the lethal problem of sloppy handwriting. The United States Institute of Medicine estimated in 1999 that 7,000 patients in the United States were dying annually because of errors in reading prescriptions. But the electronic record that has emerged to answer this problem, and to help insurers manage payments, is full of detailed codes and seemingly endless categories and subcategories. Doctors now have to spend an inordinate amount of time on data entry. One 2016 study found that for every hour doctors spent with patients, two hours were given over to filling out paperwork, leaving much less time to listen to patients, arguably the best way to avoid misdiagnoses.
Faced with all these “inefficiently efficient” technologies, what should we do? Tenner wants more balance.

For the full review, see:
Gal Beckerman. “Kicking the Geeks Where It Hurts.” The New York Times Book Review (Sunday, June 30, 2018): 14.
(Note: the online version of the review has the date June 4, 2018, and has the title “What Silicon Valley Could Use More Of: Inefficiency.”)

The book under review, is:
Tenner, Edward. The Efficiency Paradox: What Big Data Can’t Do. New York: Alfred A. Knopf, 2018.

Obsessive Compulsive Disorder Can Enhance Memory

(p. C1) Sharon remembers the first day it happened, in 1952. She was 5 years old and blindfolded while her friends ran around her, laughing, trying not to be caught in a game of blindman’s bluff. But when she whipped off the scarf, panic set in. The house, the street, even the mountains were in the wrong place. She was totally disoriented.
. . .
She eventually learned she had an unusual condition called developmental topographical disorientation disorder, or DTD.
. . .
(p. C2) Not all brain disorders are as detrimental as DTD. Bob, a TV producer from Los Angeles, remembers every day of his life as if it happened yesterday. His perfect memory is a gift, he says: “I don’t have to mourn people after they’ve passed away because my memory of them is so clear.”
The condition was discovered by James McGaugh at the University of California, Irvine, in 2001, after he received a peculiar email from a woman named Jill. “Since I was 11 I have had this unbelievable ability to recall my past,” she said. “When I see a date…I go back to that day and remember where I was, what I was doing, what day it fell on and on and on.”
. . .
A decade later, Dr. McGaugh had a group of around 50 people with HSAM. By scanning their brains while they carried out memory tasks, he discovered that they had an enlarged caudate nucleus and putamen — two areas implicated in obsessive compulsive disorder. Dr. McGaugh concluded that their extraordinary powers of memory are rooted not in their ability to form memories, but in an unconscious rehearsal of their past. They accidentally strengthen their memories by habitually recalling and reflecting upon them — “a unique form of OCD,” he says.

For the full essay, see:
Helen Thomson. “Lessons From STRANGE BRAINS.” The Wall Street Journal (Saturday, June 30, 2018): C1-C2.
(Note: ellipses between quoted passages, added; ellipsis internal to a paragraph, in original.)
(Note: the online version of the essay has the date June 29, 2018, and has the title “Strange Stories of Extraordinary Brains–and What We Can Learn From Them.”)

Thomson’s essay is closely related to her book:
Thomson, Helen. Unthinkable: An Extraordinary Journey through the World’s Strangest Brains. New York: Ecco, 2018.

Star Wars Details Allow “a Fully Believable, Escapist Experience”

(p. A15) Mr. Jameson clearly lays out the qualities that geeks appreciate in their art: realism bolstered by a deep internal history and the sort of “world-building” exemplified by Tolkien. But in Hollywood “Star Wars” changed the game thanks to its verisimilitude, “which immediately and thoroughly convinces viewers that they are watching humans and aliens skip from planet to planet in a vast, crowded other galaxy with its own detailed history.” Similarly, the biological background of the “Alien” series includes Xenomorphs “whose intricate life cycle can be described from beginning to end in grisly detail.” Books like “The Star Trek Encyclopedia,” in which the show’s designers document “all the alien planets and species that they’d invented” and present starship engineering schematics, are quintessential works of geek culture.
Detail is important to geeks, the author suggests, because they want without “any boundaries, any limits. . . . They don’t want the artwork to ever end.” Whether it’s playing a tabletop game filled with lore about previously unknown characters from the “Star Wars” galaxy or reading a “textbook” to study the fantastic beasts of the “Harry Potter” world, geeks want to believe — at least for a bit. As Mr. Jameson says, “geeks have long thought of artworks as places where one can hang out.” That’s one reason why single films have given way to trilogies and why characters have cross-populated to create Marvel’s seemingly endless “cinematic universe.”

For the full review, see:
Brian P. Kelly. “BOOKSHELF; The Geeks Strike Back.” The Wall Street Journal (Friday, June 8, 2018): A15.
(Note: ellipsis in original.)
(Note: the online version of the review has the date June 7, 2018, and has the title “BOOKSHELF; ‘I Find Your Lack of Faith Disturbing’ Review: The Geeks Strike Back; The “Star Wars” franchise and Marvel’s superhero films reign supreme in today’s Hollywood. How did that happen?”)

The book under review, is:
Jameson, A. D. I Find Your Lack of Faith Disturbing: Star Wars and the Triumph of Geek Culture. New York: Farrar, Straus and Giroux, 2018.

Chernobyl Was Due to “Bureaucratic Incompetence,” Not Due to Technology

(p. C6) Dr. Medvedev’s study of Lysenko was not approved for official publication in the Soviet Union, but samizdat, or clandestine, copies circulated among the intelligentsia. In 1969, the book was translated into English and published as “The Rise and Fall of T.D. Lysenko.”
Dr. Medvedev was fired from his job at an agricultural research laboratory, and within a few months was summoned to a meeting with a psychiatrist, on the pretext of discussing the behavior of his teenage son. Instead, Dr. Medvedev was taken to a holding cell, where he managed to pick the lock and walk away.
Soon afterward, on May 29, 1970, as Dr. Medvedev recounted in his book “A Question of Madness,” he was confronted at his home by two psychiatrists accompanied by several police officers.
“‘If you refuse to talk to us,’ one of the psychiatrists told Dr. Medvedev, ‘then we will be obliged to draw the appropriate conclusions . . . And how do you feel yourself, Zhores Aleksandrovich?’
“I answered that I felt marvelous.
“‘But if you feel so marvelous, then why do you think we have turned up here today?’
“‘Obviously, you must answer that question yourself,’ I replied.”
A police major arrived. “And who on earth might you be?” Dr. Medvedev asked. “I didn’t invite you here.”
He protested, to no avail, that the homes of Soviet citizens were considered private and inviolable to the forces of the state.
“Get to your feet!” the police major ordered Dr. Medvedev. “I order you to get to your feet!”
Two lower-ranking officers twisted Dr. Medvedev’s arms behind his back, forced him out of his house and into an ambulance. He was driven to a psychiatric hospital.
The preliminary diagnosis was “severe mental illness dangerous to the public,” and Dr. Medvedev was repeatedly warned to stop his “publicist activities.”
Meanwhile, his brother, Sakharov and other activists for greater openness in the Soviet system sent telegrams and published open letters calling for Dr. Medvedev’s release. One of his friends, the novelist Alexander Solzhenitsyn, then still living in the Soviet Union, condemned Dr. Medvedev’s detention with a bold and blistering statement.
“The incarceration of freethinking healthy people in madhouses is spiritual murder,” he said. “It is a fiendish and prolonged torture . . . These crimes will never be forgotten, and all those who take part in them will be condemned endlessly, while they live and after they’re dead.
“It is shortsighted to think that you can live constantly relying on force alone, constantly scorning the objections of conscience.”
Solzhenitsyn received the Nobel Prize for Literature later that year.
. . .
In 1990, Dr. Medvedev wrote an account of the 1986 nuclear disaster at Chernobyl, which he considered inevitable, given the Soviet Union’s history of scientific and bureaucratic incompetence.
“In the end, I was surprised at how poorly designed the reactor actually was,” he told the New York Times in 1990. “I wanted to write this book not only to show the real scale of this particular catastrophe, but also to demolish a few more secrets and deliberate misconceptions.”

For the full obituary, see:
Matt Schudel. “Scientist exposed agricultural fraud and Soviet incompetence.” The Washington Post (Sunday, Sept. 6, 2018): C6.
(Note: ellipses between paragraphs, added; ellipses internal to paragraphs, in original.)
(Note:  the online version of the obituary has the date Sept. 4, 2018, and has the title “James Mirrlees, Whose Tax Model Earned a Nobel, Dies at 82.”)

The books by Zhores Medvedev that were mentioned above, are:
Medvedev, Zhores A. The Rise and Fall of T. D. Lysenko. New York: Columbia University Press, 1969.
Medvedev, Zhores A., and Roy A. Medvedev. A Question of Madness: Repression by Psychiatry in the Soviet Union. London: Macmillan London Ltd., 1971.
Medvedev, Zhores A. The Legacy of Chernobyl. New York: W. W. Norton & Company, 1990.