Our Brains Learn in a Process of Continuous Bayesian Updating

(p. A13) First articulated in the 18th century by a hobbyist-mathematician seeking to reason backward from effects to cause, Bayes’ theorem spent the better part of two centuries struggling for recognition and respect. Yet today, argues Tom Chivers in “Everything Is Predictable,” it can be seen as “perhaps the most important single equation in history.” It drives the logic of spam filters, artificial intelligence and possibly our own brains. . . .

At its core, the theorem provides a quantitative method for getting incrementally wiser by continuously updating what you think you know—your prior beliefs, which initially might be subjective—with new information. Your refined belief becomes the new prior, and the process repeats.
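The update loop described above can be sketched in a few lines of Python. The two coin hypotheses and the flip sequence here are invented purely for illustration; the point is only that each posterior becomes the prior for the next observation:

```python
# A minimal sketch of Bayesian updating: the posterior after each
# observation becomes the prior for the next one. Hypotheses and
# data are illustrative, not from the review.

def update(prior, likelihood_h, likelihood_not_h):
    """One Bayesian update of belief in hypothesis H given one observation."""
    num = likelihood_h * prior
    return num / (num + likelihood_not_h * (1 - prior))

# H: "the coin is biased 75% heads" vs. not-H: "the coin is fair".
belief = 0.5  # subjective starting prior
for flip in ["H", "H", "T", "H", "H"]:
    p_h = 0.75 if flip == "H" else 0.25   # P(flip | biased coin)
    p_fair = 0.5                          # P(flip | fair coin)
    belief = update(belief, p_h, p_fair)  # refined belief becomes new prior
    print(f"after {flip}: belief coin is biased = {belief:.3f}")
```

After five flips the belief has drifted from 0.5 to about 0.72, and any single surprising flip (the lone tail) pulls it back toward the prior rather than overturning it.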

. . .

At times Mr. Chivers, a London-based science journalist who now writes for Semafor, seems overwhelmed by an admittedly complex subject, and his presentation lacks the clarity of Sharon Bertsch McGrayne’s “The Theory That Would Not Die” (2011). Yet he is onto something, since Bayes’ moment has clearly arrived. He notes that Bayesian reasoning is popular among “people who come from the new schools of data science—machine learning, Silicon Valley tech folks.” The mathematician Aubrey Clayton tells him that, in the cutting-edge realms of software engineering, “Bayesian methods are what you’d use.”

. . .

It’s notoriously difficult for most people to grasp problems in a structured Bayesian fashion. Suppose there is a test for a rare disease that is 99% accurate. You’d think that, if you tested positive, you’d probably have the disease. But when you figure in the prior—the fact that, for the average person (without specific risk factors), the chance of having a rare disease is incredibly low—then even a positive test means you’re still unlikely to have it. When quizzed by researchers, doctors consistently fail to consider prevalence—the relevant prior—in their interpretation of test results. Even so, Mr. Chivers insists, “our instinctive decision-making, from a Bayesian perspective, isn’t that bad.” And indeed, in practice, doctors quickly learn to favor common diagnoses over exotic possibilities.
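The test arithmetic above can be made concrete with a short sketch. The 99% accuracy figure is from the passage; the 1-in-1,000 prevalence is an assumed prior chosen only to illustrate the base-rate effect:

```python
# Illustrative Bayes calculation for the rare-disease example.
# Sensitivity and specificity of 99% are from the passage; the
# 1-in-1,000 prevalence is an assumption made for illustration.

def posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = sensitivity * prior
    false_pos = (1 - specificity) * (1 - prior)
    return true_pos / (true_pos + false_pos)

p = posterior(prior=0.001, sensitivity=0.99, specificity=0.99)
print(f"{p:.1%}")  # roughly 9% -- a positive result still leaves disease unlikely
```

With a prior that low, false positives among the healthy 99.9% of the population swamp the true positives, which is exactly the prevalence effect the quizzed doctors overlook.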

. . .

Our brains work by making models of the world, Mr. Chivers reminds us, assessing how our expectations match what we learn from our senses, and then updating our perceptions accordingly. Deep down, it seems, we are all Bayesians.

For the full review, see:

David A. Shaywitz. “Thinking Prior to Thought.” The Wall Street Journal (Thursday, May 15, 2024): A13.

(Note: the online version of the review has the date May 14, 2024, and has the title “‘Everything Is Predictable’ Review: The Secret of Bayes.” In the last quoted sentence I have replaced the word “earn” that appears in both the online and print versions, with the word “learn.”)

The book under review is:

Chivers, Tom. Everything Is Predictable: How Bayesian Statistics Explain Our World. New York: Atria/One Signal Publishers, 2024.

Patient-Reported Health Information Deserves Respect

Patients may know their own health more accurately than doctors’ blood tests reveal it, as reported in the study summarized below. The credibility of patient self-knowledge provides an added reason, besides respect for freedom, why government should not mandate an individual’s food and drug decisions.

(p. D4) . . . a . . . study . . . suggests that how patients say they feel may be a better predictor of health than objective measures like a blood test. The study, published in Psychoneuroendocrinology, used data from 1,500 people who took part in the Texas City Stress and Health Study, which tracked the stress and health levels of people living near Houston.

. . .

The study found that when people said they felt poorly, they had high virus and inflammation levels. People who reported feeling well had low virus and inflammation levels.

“I think the take-home message is that self-reported health matters,” said Christopher P. Fagundes, an assistant psychology professor at Rice University and a co-author of the study. “Physicians should pay close attention to their patients. There are likely biological mechanisms underlying why they feel their health is poor.”

For the full story see:

Tara Parker-Pope. “Doctors, Listen to Patients.” The New York Times (Tuesday, July 19, 2016 [sic]): D4.

(Note: ellipses added.)

(Note: the online version of the story has the date July 15, 2016 [sic], and has the title “Doctors Should Listen to Patient Instincts.”)

The academic paper co-authored by Fagundes and mentioned above is:

Murdock, Kyle W., Christopher P. Fagundes, M. Kristen Peek, Vansh Vohra, and Raymond P. Stowe. “The Effect of Self-Reported Health on Latent Herpesvirus Reactivation and Inflammation in an Ethnically Diverse Sample.” Psychoneuroendocrinology 72 (Oct. 2016): 113-18.

Ending the “License Raj” in India Allowed Economic Growth and the Creation of Earned Entrepreneurial Wealth

(p. A8) The younger son of Mukesh Ambani, India’s richest man, is set to wed his fiancée in Mumbai on Friday, the finale of a monthslong extravaganza that signaled the arrival of the unapologetic Indian billionaire on the global stage — and introduced the world to the country’s Gilded Age.

. . .

Kavil Ramachandran, a professor of entrepreneurship at the Indian School of Business, said there were more billionaires with fatter wallets because India has sustained a high growth rate for more than two decades. That’s created a deep domestic market for goods and services, and pushed Indian companies to pursue new businesses, pairing opportunity with ambition.

“It’s a consequence of rapid growth and entrepreneurialism,” Mr. Ramachandran said.

. . .

India has come a long way from its socialist origins. Until 1990, the country operated under strict government supervision and protectionist policies. Companies could operate only after procuring multiple permits and licenses from the government, leading to the name “License Raj” — a play on the term British Raj, which referred to colonial rule.

Once India opened up its economy after a series of reforms, some domestic companies embraced the logic of free markets while remaining family-run and tightly controlled, diversifying into new businesses.

. . .

Many Indians see in Mr. Ambani’s staggering rise in stature and wealth a version of the India they want: a country that doesn’t make a play for attention but demands it. Some even feel pride that his son’s wedding has attracted such global attention. To them, India’s poverty is a predictable fact; such opulence is not.

“Based on the level of the Ambanis’ wealth, the wedding is perfect,” said Mani Mohan Parmar, a 64-year-old resident of Mumbai.

“Even the common man here in India spends more than his capacity on a wedding,” Ms. Parmar said. “So it’s nothing too much if we talk about Ambani. He has so much money due to God’s grace, so why shouldn’t he spend it by his choice?”

For the full story see:

Anupreeta Das. “India’s New Gilded Age on Display at a Wedding.” The New York Times (Monday, June 15, 2024): A8.

(Note: ellipses added.)

(Note: the online version of the story has the date July 12, 2023, and has the title “A Wedding Puts India’s Gilded Age on Lavish Display.”)

Following Salt Consumption Guidelines Increases Risk of Death

Official experts often turn out to be wrong, as in the salt consumption guidelines discussed below. The fallibility of expert knowledge provides an added reason, besides respect for freedom, why government should not mandate an individual’s food and drug decisions.

(p. D4) People with high blood pressure are often told to eat a low-sodium diet. But a diet that’s too low in sodium may actually increase the risk for cardiovascular disease, a review of studies has found.

Current guidelines recommend a maximum of 2.3 grams of sodium a day — the amount found in a teaspoon of salt — for most people, and less for the elderly or people with hypertension.

Researchers reviewed four observational studies that included 133,118 people who were followed for an average of four years. The scientists took blood pressure readings, and estimated sodium consumption by urinalysis. The review is in The Lancet.

Among 69,559 people without hypertension, consuming more than seven grams of sodium daily did not increase the risk for disease or death, but those who ate less than three grams had a 26 percent increased risk for death or for cardiovascular events like heart disease and stroke, compared with those who consumed four to five grams a day.

In people with high blood pressure, consuming more than seven grams a day increased the risk by 23 percent, but consuming less than three grams increased the risk by 34 percent, compared with those who ate four to five grams a day.

For the full story see:

Nicholas Bakalar. “Low-Salt Diet as a Heart Risk.” The New York Times (Tuesday, Oct. 11, 2016 [sic]): D4.

(Note: ellipses added.)

(Note: the online version of the story has the date May 25, 2016 [sic], and has the title “A Low-Salt Diet May Be Bad for the Heart.”)

The academic paper reporting the results summarized above is:

Mente, Andrew, Martin O’Donnell, Sumathy Rangarajan, Gilles Dagenais, Scott Lear, Matthew McQueen, Rafael Diaz, Alvaro Avezum, Patricio Lopez-Jaramillo, Fernando Lanas, Wei Li, Yin Lu, Sun Yi, Lei Rensheng, Romaina Iqbal, Prem Mony, Rita Yusuf, Khalid Yusoff, Andrzej Szuba, Aytekin Oguz, Annika Rosengren, Ahmad Bahonar, Afzalhussein Yusufali, Aletta Elisabeth Schutte, Jephat Chifamba, Johannes F. E. Mann, Sonia S. Anand, Koon Teo, and S. Yusuf. “Associations of Urinary Sodium Excretion with Cardiovascular Events in Individuals with and without Hypertension: A Pooled Analysis of Data from Four Studies.” The Lancet 388, no. 10043 (2016): 465-75.

Techno-Optimist Claims AI Tools “Will Help Scientists Design Therapies Faster and Better”

(p. A13) It is said that triumphant Roman generals, to ensure that the rapture of victory didn’t go to their heads, would require a companion to whisper in their ear: “Remember, you are only a man.” Jamie Metzl worries that we may have learned all too well such lessons in humility. Given remarkable recent advances in technology—and the promise of more to come—we need to lean into our emerging godlike powers, he believes, and embrace the opportunity to shape the world into a better place. In “Superconvergence,” he sets out to show us how, after first helping us overcome our hesitations.

. . .

. . . the big advances will be in medicine—and indeed are already in evidence. Mr. Metzl points to the blisteringly fast development of the Covid-19 mRNA vaccine, from digital file to widespread immunization in less than a year; and to gene-editing technologies like Crispr. He cites the experience of Victoria Gray, a young woman from Mississippi who was suffering from sickle-cell disease until, in 2019, researchers in Nashville, Tenn., reinfused her with her own cells, which had been Crispr-edited; the treatment worked, liberating her from the disease’s tormenting pain and crippling fatigue. For Mr. Metzl, these are just the first intimations of a revolution to come. AI tools like DeepMind’s AlphaFold, he says, will help scientists design therapies faster and better.

To get smarter about human health, though, AI will need more information, and here Mr. Metzl’s ebullience edges toward the willful suspension of disbelief. His imagined future of healthcare will require “collecting huge amounts of genetic and systems biology data in massive and searchable databases.” The details will include not only medical records and the results of laboratory tests but data from the sensors he anticipates will be everywhere—“bathrooms, bedrooms, and offices”—as information is hoovered up from “toilets, mirrors, computers, phones and other devices without the people even noticing.” While acknowledging that such a scenario sounds like “an authoritarian’s dream and a free person’s nightmare,” he suggests that the chance to catch disease early may offset the risks. This trade-off promises to be a tough sell.

More than many techno-optimists, Mr. Metzl seems to grasp the intricacy of biological systems; he notes that they are, for now, beyond our full understanding. Even so, he predicts, a time will come when “the sophistication of our tools and understanding meets and then exceeds the complexity of biology.”

For the full review, see:

David A. Shaywitz. “Getting Better, Faster.” The Wall Street Journal (Thursday, July 11, 2024): A13.

(Note: the online version of the review has the date July 10, 2024, and has the title “‘Superconvergence’ Review: Getting Better, Faster.”)

The book under review is:

Metzl, Jamie. Superconvergence: How the Genetics, Biotech, and AI Revolutions Will Transform Our Lives, Work, and World. New York: Timber Press, 2024.

Copper Hospital Fixtures Would Reduce Bacterial Infections

If healthcare were unregulated, nimble entrepreneurs could make quick use of the findings summarized below. In our sclerotic hyper-regulated healthcare system, healthcare workers have neither the incentives nor the decision rights to make use of them.

(p. D6) Researchers equipped nine rooms in a small rural hospital with copper faucet handles, toilet flush levers, door handles, light switches and other commonly touched equipment. They left nine other rooms with traditional plastic, porcelain and metal surfaces.

. . .

. . . on average, fixtures in copper-equipped rooms had concentrations of bacteria about 98 percent lower than in rooms furnished with traditional equipment, whether the rooms were occupied or not. On half of the copper components, the researchers were unable to grow any bacteria at all.

“Copper in hospital rooms is not yet common,” said the lead author, Shannon M. Hinsa-Leasure, an associate professor of biology at Grinnell College in Iowa.

For the full story see:

Nicholas Bakalar. “Copper May Stem Infections.” The New York Times (Tuesday, Oct. 11, 2016 [sic]): D6.

(Note: ellipses added.)

(Note: the online version of the story has the date Oct. 4, 2016 [sic], and has the title “Copper Sinks and Faucets May Stem Hospital Infections.”)

The academic paper reporting the results summarized above is:

Hinsa-Leasure, Shannon M., Queenster Nartey, Justin Vaverka, and Michael G. Schmidt. “Copper Alloy Surfaces Sustain Terminal Cleaning Levels in a Rural Hospital.” American Journal of Infection Control 44, no. 11 (Nov. 2016): e195-e203.

Human Ancestor 1.45-Million Years Ago Was a Victim of Cannibalism

Modern capitalism is sometimes criticized as inferior to a long-ago golden age. A past golden age is a myth. Human ancestors suffered from cannibalism and other violence.

(p. D3) In today’s scholar-eat-scholar world of paleoanthropology, claims of cannibalism are held to exacting standards of evidence. Which is why more than a few eyebrows were raised earlier this week over a study in Scientific Reports asserting that a 1.45-million-year-old fragment of shin bone — found 53 years ago in northern Kenya, and sparsely documented — was an indication that our human ancestors not only butchered their own kind, but were probably, as an accompanying news release put it, “chowing down” on them, too.

The news release described the finding as the “oldest decisive evidence” of such behavior. “The information we have tells us that hominids were likely eating other hominids at least 1.45 million years ago,” Briana Pobiner, a paleoanthropologist at the Smithsonian’s National Museum of Natural History and first author of the paper, said in the news release.

. . .

Dr. Pobiner, an authority on cut marks, had spied the half-tibia fossil six summers ago while examining hominid bones housed in a Nairobi museum vault. She was inspecting the fossil for bite marks when she noticed 11 thin slashes, all angled in the same direction and clustered around a spot where a calf muscle would have attached to the bone — the meatiest chunk of the lower leg, Dr. Pobiner said in an interview.

She sent molds of the scars to Michael Pante, a paleoanthropologist at Colorado State University and an author on the study, who made 3-D scans and compared the shape of the incisions with a database of 898 tooth, trample and butchery marks. The analysis indicated that nine of the markings were consistent with the kind of damage made by stone tools. Dr. Pobiner said that the placement and orientation of the cuts implied that flesh had been stripped from the bone. From those observations she extrapolated her cannibalism thesis.

“From what we can tell, this hominin leg bone is being treated like other animals, which we presume are being eaten based on lots of butchery marks on them,” Dr. Pobiner said. “It makes the most sense to presume that this butchery was also done for the purpose of eating.”

. . .

. . ., clear proof of systematic cannibalism among hominids has emerged in the fossil record. The earliest confirmation was uncovered in 1994 in the Gran Dolina cave site of Spain’s Atapuerca Mountains. The remains of 11 individuals who lived some 800,000 years ago displayed distinctive signs of having been eaten: cuts, fractures where bones had been cracked open to expose the marrow, and human tooth marks.

Among our other evolutionary cousins now confirmed to have practiced cannibalism are Neanderthals, with whom humans overlapped, and mated, for thousands of years. A study published in 2016 reported that Neanderthal bones found in a cave in Goyet, Belgium, and dated to roughly 40,000 B.C. show signs of being butchered, split and used to sharpen the edges of stone tools. Patterns of bone-breakage in Homo antecessor, considered the last common ancestor of Neanderthals and Homo sapiens, suggest that cannibalism goes back a half-million years or more.

For the full story see:

Franz Lidz. “For Paleoanthropology, Cannibalism Can Be Clickbait.” The New York Times (Tuesday, June 11, 2024): D3.

(Note: ellipses added.)

(Note: the online version of the story was updated July 3, 2023, and has the title “Cannibalism, or ‘Clickbait’ for Paleoanthropology?”)

The study in Scientific Reports mentioned above is:

Pobiner, Briana, Michael Pante, and Trevor Keevil. “Early Pleistocene Cut Marked Hominin Fossil from Koobi Fora, Kenya.” Scientific Reports 13, no. 1 (2023): 9896.

Kahneman’s “Adversarial Collaboration” Might Bring Us More Joy and Better Science

(p. A19) Professor Kahneman, who died . . . at the age of 90, is best known for his pathbreaking explorations of human judgment and decision making and of how people deviate from perfect rationality. He should also be remembered for a living and working philosophy that has never been more relevant: his enthusiasm for collaborating with his intellectual adversaries. This enthusiasm was deeply personal. He experienced real joy working with others to discover the truth, even if he learned that he was wrong (something that often delighted him).

. . .

Professor Kahneman saw . . . “angry science,” which he described as a “nasty world of critiques, replies and rejoinders” and “as a contest, where the aim is to embarrass.” As Professor Kahneman put it, those who live in that nasty world offer “a summary caricature of the target position, refute the weakest argument in that caricature and declare the total destruction of the adversary’s position.” In his account, angry science is “a demeaning experience.”

. . .

Professor Kahneman meant both to encourage better science and to strengthen the better angels of our nature.

For the full commentary see:

Cass R. Sunstein. “The Value of Collaborating With Adversaries.” The New York Times (Wednesday, April 3, 2024): A19.

(Note: ellipses added.)

(Note: the online version of the commentary has the date April 1, 2024, and has the title “The Nobel Winner Who Liked to Collaborate With His Adversaries.”)

“Righteous Rage” Against “The Absence of a Cure”

The commentary quoted below advocates a productive rage, a sense of urgency, against “the dying of the light” (Dylan Thomas).

(p. D6) Dr. Sacks, who recorded his heavenly highs and hellish lows in “A Leg to Stand On,” believes that those with a disability often oscillate between grateful rejoicing and bitter denouncing of their circumstances. The same dynamic may hold true for cancer patients. So how does one sustain the joy while avoiding the rancor?

There can be no simple answer, but I seek clues in works of art created by terminal cancer patients. Take, for example, the paintings of Hollis Sigler, which have been shown in hospitals across the country and collected in the volume “Hollis Sigler’s Breast Cancer Journal.”

. . .  She depicts a shocking lack of control in a painting with food and silverware unexpectedly flying from a table in a tornado of debris. The image reminds me of Dr. Benedict B. Benigno’s perspective on cancer: “If life is a banquet, then cancer takes away the knife and fork and pulls the chair out from under us.”

To document the ravages of metastatic breast cancer, Ms. Sigler, who died of the disease in 2001, used spacers between frames for prose on the dire statistics and facts she had learned. On the edges of the paintings, she recorded additional words from her journals and those of the poet and breast cancer activist Audre Lorde. Bitterness and rancor certainly get expressed in these testimonies, but righteous rage is channeled toward the real enemies: the absence of a cure, the lack of preventive measures, inadequate detection tools, degrading and injurious treatments, miserable mortality rates, contaminants in the environment.

. . .

Not repressing but directing her anger, Ms. Sigler managed, through her dazzling artistry, to contest and revise the poet W. H. Auden’s advice: “Let your last thinks all be thanks.”

For the full commentary see:

Susan Gubar. “Living With Cancer: Curses and Blessings.” The New York Times (Tuesday, July 21, 2015 [sic]): D6.

(Note: ellipses added.)

(Note: the online version of the commentary has the date July 16, 2015 [sic], and has the same title as the print version.)

The book of paintings by Hollis Sigler, mentioned above, is:

Sigler, Hollis, Susan M. Love, and James Yood. Hollis Sigler’s Breast Cancer Journal. New York: Hudson Hills Press, 1999.

Perfusion Eases the Scarcity of Organs for Transplantation

(p. A1) Surgeons are experimenting with organs from genetically modified animals, hinting at a future when they could be a source for transplants. But the field is already undergoing a paradigm shift, driven by technologies in widespread use that allow clinicians to temporarily store organs outside the body.

Perfusion, as it’s called, is changing every aspect of the organ transplant process, from the way surgeons operate, to the types of patients who can donate organs, to the outcomes for recipients.

. . .

(p. A12) Scientists have long experimented with techniques for keeping organs in more dynamic conditions, at a warmer temperature and perfused with blood or another oxygenated solution. After years of development, the first device for preserving lungs via perfusion won approval from the Food and Drug Administration in 2019. Devices for perfusing hearts and livers were approved in late 2021.

. . .

Now surgical teams can recover an organ, perfuse it overnight while they sleep and complete the transplant in the morning without fear that the delay will have damaged the organ.

Perhaps most important, perfusion has further opened the door to organ donation by comatose patients whose families have withdrawn life support, allowing their hearts to eventually stop. Each year, tens of thousands of people die this way, after the cessation of circulation, but they were rarely donor candidates because the dying process deprived their organs of oxygen.

. . .

By tapping this new cadre of donors, transplant centers said they could find organs more quickly for the excess of patients in urgent need. Dr. Shimul Shah said the organ transplant program he directs at the University of Cincinnati had essentially wiped out its waiting list for livers. “I never thought, in my career, I would ever say that,” he said.

. . .

Dr. Shaf Keshavjee, a surgeon at the University of Toronto whose lab was at the forefront of developing technologies to preserve lungs outside the body, said the devices could eventually allow doctors to remove, repair and return lungs to sick patients rather than replace them. “I think we can make organs that will outlive the recipient you put them in,” he said.

Dr. Ashish Shah, the chairman of cardiac surgery at Vanderbilt University, one of the busiest heart transplant programs in the country, agreed, calling that “the holy grail.”

“Your heart sucks,” he said. “I take it out. I put it on my apparatus. While you don’t have a heart, I can support you with an artificial heart for a little while. I then take your heart and fix it — cells, mitochondria, gene therapy, whatever — and then I sew it back in. Your own heart. That’s what we’re really working for.”

For the full story see:

Ted Alcorn. “Keeping Organs For Transplants Alive for Longer.” The New York Times (Wednesday, April 3, 2024): A1 & A12.

(Note: ellipses added.)

(Note: the online version of the story has the date April 2, 2024, and has the title “The Organ Is Still Working. But It’s Not in a Body Anymore.”)

Since Wood Tools Are Rarely Preserved, “Preservation Bias Distorts Our View of Antiquity”

(p. D3) In 1836, Christian Jürgensen Thomsen, a Danish antiquarian, brought the first semblance of order to prehistory, suggesting that the early hominids of Europe had gone through three stages of technological development that were reflected in the production of tools. The basic chronology — Stone Age to Bronze Age to Iron Age — now underpins the archaeology of most of the Old World (and cartoons like “The Flintstones” and “The Croods”).

Thomsen could well have substituted Wood Age for Stone Age, according to Thomas Terberger, an archaeologist and head of research at the Department of Cultural Heritage of Lower Saxony, in Germany.

“We can probably assume that wooden tools have been around just as long as stone ones, that is, two and a half or three million years,” he said. “But since wood deteriorates and rarely survives, preservation bias distorts our view of antiquity.” Primitive stone implements have traditionally characterized the Lower Paleolithic period, which lasted from about 2.7 million years ago to 200,000 years ago. Of the thousands of archaeological sites that can be traced to the era, wood has been recovered from fewer than 10.

Dr. Terberger was team leader of a study published last month in the Proceedings of the National Academy of Sciences that provided the first comprehensive report on the wooden objects excavated from 1994 to 2008 in the peat of an open-pit coal mine near Schöningen, in northern Germany. The rich haul included two dozen complete or fragmented spears (each about as tall as an N.B.A. center) and double-pointed throwing sticks (half the length of a pool cue) but no hominid bones. The objects date from the end of a warm interglacial period 300,000 years ago, about when early Neanderthals were supplanting Homo heidelbergensis, their immediate predecessors in Europe. The projectiles unearthed at the Schöningen site, known as Spear Horizon, are considered the oldest preserved hunting weapons.

For the full story see:

Franz Lidz. “In the Stone Age, Wood Was Pivotal, a Study Says.” The New York Times (Tuesday, May 7, 2024): D3.

(Note: the online version of the story was updated May 6, 2024, and has the title “Was the Stone Age Actually the Wood Age?”)

Terberger’s co-authored academic paper mentioned above is:

Leder, Dirk, Jens Lehmann, Annemieke Milks, Tim Koddenberg, Michael Sietz, Matthias Vogel, Utz Böhner, and Thomas Terberger. “The Wooden Artifacts from Schöningen’s Spear Horizon and Their Place in Human Evolution.” Proceedings of the National Academy of Sciences 121, no. 15 (2024): e2320484121.