Tom Watson, Jr. Managed IBM’s Rare and Successful Self-Disruption by “Transitioning the Firm to Electronic Computing”

(p. 9) Thomas J. Watson Jr. seemed, from a young age, to be destined for failure.

. . .

“He played with fire, shot animals in the nearby swamps and pilfered things from neighbors’ houses,” Ralph Watson McElvenny and Marc Wortman write in “The Greatest Capitalist Who Ever Lived,” a compelling new biography of Watson Jr.

. . .

This is far from the first book about IBM.

. . .

But this is probably the most theatrical book about IBM ever published. McElvenny, who happens to be Watson Jr.’s eldest grandson, is privy to “personal and corporate papers” and, as the endnotes mysteriously specify, many “family sources.”

. . .

“The Greatest Capitalist Who Ever Lived” is about the challenges of corporate and family succession, an essential topic given that IBM itself was the father figure to most of the computing and tech industry. Watson Sr., “the old man,” was a type familiar to our times: the tech titan who runs a large company as an extension of himself. (The IBM machine that beat the “Jeopardy!” champion Ken Jennings bears his name.) For four decades, IBM was Watson Sr.’s fief. The company “was run entirely out of one man’s breast pocket,” McElvenny and Wortman write. Watson Sr. “made all strategic decisions and most minor ones” and “delegated almost no authority.”

To his lasting credit, he did truly take care of his employees and their families in a manner that bred a strong loyalty. That said, Watson Sr. demanded conformity and could be erratic and cruel.

. . .

IBM faced a classic version of what the Harvard Business School professor Clayton Christensen has termed the “innovator’s dilemma” and what the Nobel Prize-winning economist Kenneth Arrow described as a monopoly’s disinclination to innovate. IBM was making plenty of profit on punched cards and accounting machines, its customers were happy, so why rock the boat?

Watson Jr.’s intense antipathy toward his father ended up saving IBM. Just before the United States entered World War II, Junior gained self-confidence the old-fashioned way: by joining the Army Air Corps and flying a B-24. When he eventually returned to IBM (pushed to do so by his commanding officer, Maj. Gen. Follett Bradley, who thought Watson would be wasted as an airline pilot), he became the internal champion of transitioning the firm to electronic computing. He was perhaps the only person who could oppose his father in a company built on yes men.

While the book’s title calls him “the greatest capitalist,” it might more accurately, if less ringingly, call him “the greatest manager,” for Watson Jr. was much better at delegating and using his employees’ talents.

For the full review, see:

Tim Wu. “Next-Gen.” The New York Times Book Review (Sunday, December 17, 2023): 9.

(Note: ellipses added.)

(Note: the online version of the review was updated Dec. 15, 2023, and has the title “The Father-Son Struggle That Helped Ensure IBM’s Success.”)

The book under review is:

McElvenny, Ralph Watson, and Marc Wortman. The Greatest Capitalist Who Ever Lived: Tom Watson Jr. and the Epic Story of How IBM Created the Digital Age. New York: PublicAffairs, 2023.

Charlie Munger Had “Epistemic Humility,” Endorsing Confucius’s Claim “That Real Knowledge Is Knowing the Extent of One’s Ignorance”

Epistemic humility is honest and useful, but is often punished. We often admire the confident, whether their confidence is justified or not. But I do not agree with Confucius: we can have real knowledge beyond knowing we are very ignorant.

(p. B1) I had the extraordinary good luck to get to know Charlie Munger in the past two decades.

. . .

More than almost anyone I’ve ever known, Munger also possessed what philosophers call epistemic humility: a profound sense of how little anyone can know and how important it is to open and change your mind.

. . .

(p. B4) Munger—who graduated magna cum laude from Harvard Law School without ever earning a college degree—knew perfectly well how smart he was. And it is an understatement to say he didn’t suffer fools gladly. In an interview with The Wall Street Journal in 2019, he used the phrase “massively stupid” at least seven times to describe other people and even entire professions.

So was he a cocky, cranky old man yelling at the clouds?

No. If there was one thing Munger knew, it was himself. As he told me in 2014, “Confucius said that real knowledge is knowing the extent of one’s ignorance . . . .  Knowing what you don’t know is more useful than being brilliant.”

For the full commentary, see:

Jason Zweig. “THE INTELLIGENT INVESTOR; Charlie Munger’s Reflections on His Life, Luck and Success.” The Wall Street Journal (Saturday, Dec. 2, 2023): B1 & B4.

(Note: ellipses between paragraphs added; ellipsis internal to the penultimate quoted paragraph in original.)

(Note: the online version of the commentary has the date November 29, 2023, and has the title “THE INTELLIGENT INVESTOR; Charlie Munger’s Life Was About Way More Than Money.”)

“Adding Manpower to a Late Software Project Makes It Later”

(p. 24) Dr. Brooks had a wide-ranging career that included creating the computer science department at the University of North Carolina and leading influential research in computer graphics and virtual reality.

But he is best known for being one of the technical leaders of IBM’s 360 computer project in the 1960s.

. . .

Until the 360, each model of computer had its own bespoke hardware design. That required engineers to overhaul their software programs to run on every new machine that was introduced.

But IBM promised to eliminate that costly, repetitive labor with an approach championed by Dr. Brooks, a young engineering star at the company, and a few colleagues. In April 1964, IBM announced the 360 as a family of six compatible computers. Programs written for one 360 model could run on the others, without the need to rewrite software, as customers moved from smaller to larger computers.

. . .

The hard-earned lessons he learned from grappling with the OS/360 software became grist for his book “The Mythical Man-Month: Essays on Software Engineering.” First published in 1975, it became recognized as a quirky classic, selling briskly year after year and routinely cited as gospel by computer scientists.

The tone is witty and self-deprecating, with pithy quotes from Shakespeare and Sophocles and chapter titles like “Ten Pounds in a Five-Pound Sack” and “Hatching a Catastrophe.” There are practical tips along the way. For example: Organize engineers on big software projects into small groups, which Dr. Brooks called “surgical teams.”

The most well known of his principles was what he called Brooks’s law: “Adding manpower to a late software project makes it later.”

Dr. Brooks himself acknowledged that with the “law” he was “oversimplifying outrageously.” But he was exaggerating to make a point: It is often smarter to rethink things, he suggested, than to add more people. And in software engineering, a profession with elements of artistry and creativity, workers are not interchangeable units of labor.

In the internet era, some software developers have suggested that Brooks’s law no longer applies. Large open-source software projects — so named because the underlying “source” code is open for all to see — have armies of internet-connected engineers to spot flaws in code and recommend fixes. Still, even open-source projects are typically governed by a small group of individuals, more surgical team than the wisdom of the crowd.
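
Brooks grounded his "law" partly in arithmetic: with n people on a team, the number of pairwise communication paths grows as n(n-1)/2, and each newcomer also consumes veterans' time in ramp-up before contributing. The following toy model is my own illustration of that reasoning, with made-up overhead and ramp-up numbers; it is not taken from the obituary or from Brooks's book:

# A toy model of Brooks's law with made-up numbers, for illustration only.
# Brooks counts n*(n-1)/2 pairwise communication paths for a team of n;
# newcomers also cost veterans ramp-up time before they contribute.

def months_to_finish(remaining_person_months, veterans, new_hires=0,
                     rampup_cost=1.5, overhead_per_path=0.12):
    """Estimate calendar months left under assumed coordination and ramp-up costs."""
    n = veterans + new_hires
    paths = n * (n - 1) / 2
    effective_rate = n - overhead_per_path * paths  # person-months of progress per month
    if effective_rate <= 0:
        return float("inf")  # the team spends all its time coordinating
    total_work = remaining_person_months + rampup_cost * new_hires
    return total_work / effective_rate

print(round(months_to_finish(20, veterans=5), 1))               # 5.3 months
print(round(months_to_finish(20, veterans=5, new_hires=5), 1))  # 6.0 months

With these assumed numbers, doubling a five-person team that has 20 person-months of work left pushes the estimated finish from about 5.3 to about 6.0 calendar months, which is the point of the law.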

For the full obituary, see:

Steve Lohr. “Frederick P. Brooks Jr., an Innovator of Computer Design, Dies at 91.” The New York Times, First Section (Sunday, November 27, 2022): 24.

(Note: ellipses added.)

(Note: the online version of the obituary was updated Nov. 25, 2022, and has the title “Frederick P. Brooks Jr., Computer Design Innovator, Dies at 91.”)

The Brooks’s book mentioned above is:

Brooks, Frederick P., Jr. The Mythical Man-Month: Essays on Software Engineering. 2nd ed. Boston, MA: Addison-Wesley, 1995.

P&G CEO Defended Using Harsh Criticism of Workers

Deirdre McCloskey frequently says we should use more “sweet talk.” Edwin Artzt defended using harsh talk. Is there room for both?

(p. A8) Edwin Artzt, who expanded Procter & Gamble Co.’s global reach in the 1980s and then, as chief executive officer in the early 1990s, rattled the company’s managers with cost-cutting drives and harsh criticism of their work, died at the age of 92, the Cincinnati-based company said.

As CEO from 1990 until 1995, Mr. Artzt was known for berating managers and using words including “stupid” and “imbecilic” to describe some of their proposals, as recounted in “Soap Opera: The Inside Story of Procter & Gamble,” a 1993 book by Alecia Swasy, a former Wall Street Journal reporter. He didn’t sugarcoat his desire to eliminate weak brands and underperforming employees.

Mr. Artzt, who died on April 6, was sometimes called “The Prince of Darkness.” Some colleagues said the nickname reflected a hot temper. He said it came from his habit of working late.

“I certainly don’t want to have a short trigger with people and not give them a chance,” he told The Wall Street Journal in 1991. “But sure I’ve cleared out deadwood. Probably some of it was still breathing when it was cleared out.”

Two years later, he said: “Terrifying people is not my intention…People come to me years later and say, ‘Remember that meeting 10 years ago? You laid it on me, but I sure remember that lesson.’”

For the full obituary, see:

James R. Hagerty. “P&G CEO’s Harsh Talk Rattled a Bureaucracy.” The Wall Street Journal (Saturday, April 15, 2023): A10.

(Note: the online version of the obituary was updated April 12, 2023, and has the title “Edwin L. Artzt, P&G CEO Known for His Tough Talk, Dies at 92.”)

The book on Procter & Gamble mentioned above is:

Swasy, Alecia. Soap Opera: The Inside Story of Procter & Gamble. New York: Crown Publishing, 1993.

“You Will Do Your Best Creative Work by Yourself”

(p. A12) The value of gathering to swap loosely formed thoughts is highly suspect, despite being a major reason many companies want workers back in offices.

“You do not get your best ideas out of these freewheeling brainstorming sessions,” says Sheena Iyengar, a professor at Columbia Business School. “You will do your best creative work by yourself.”

Iyengar has compiled academic research on idea generation, including a decade of her own interviews with more than a thousand people, into a book called “Think Bigger.” It concludes that group brainstorming is usually a waste of time.

Pitfalls include blabbermouths with mediocre suggestions and introverts with brilliant ones that they keep to themselves.

. . .

Plenty of people have always bemoaned brainstorming. Longtime Wall Street Journal readers may recall a 2006 “Cubicle Culture” column that skewered the popular practice, and Harvard Business Review published a research-based case against the usefulness of brainstorming in 2015.

. . .

Sometimes leaders bring employees together to create the illusion of wide-open input, says Erika Hall, co-founder of Mule Design Studio, a management consulting firm in San Francisco. In-person brainstorming is part of the back-to-office rationale for many of her clients, and she generally advises the ones that truly want to improve collaboration to first carve out some alone time for their workers.

When Hall needs inspiration, she goes for a run.

“It’s freaky,” she says. “I will go run on a problem, and things will happen in my head that do not happen under any other circumstance.”

Others might find “Aha!” moments in the shower or while listening to music. Leaving breakthroughs to private serendipity can feel, to bosses, like losing control, she acknowledges, but it might be more effective than trying to schedule magic in a conference room.

For the full commentary, see:

Callum Borchers. “ON THE CLOCK; Switch Off Brainstorming If You Want Brighter Ideas.” The Wall Street Journal (Thursday, May 18, 2023): A12.

(Note: ellipses added.)

(Note: the online version of the commentary was updated May 18, 2023, and has the title “ON THE CLOCK; Office Brainstorms Are a Waste of Time.”)

The book by Iyengar mentioned above is:

Iyengar, Sheena. Think Bigger: How to Innovate. New York: Columbia Business School Publishing, 2023.

Experienced Nurses Can Be Disciplined If They Use Hunches from Clinical Observations to Override AI Protocols

(p. A1) Melissa Beebe, an oncology nurse, relies on her observation skills to make life-or-death decisions. A sleepy patient with dilated pupils could have had a hemorrhagic stroke. An elderly patient with foul-smelling breath could have an abdominal obstruction.

So when an alert said her patient in the oncology unit of UC Davis Medical Center had sepsis, she was sure it was wrong. “I’ve been working with cancer patients for 15 years so I know a septic patient when I see one,” she said. “I knew this patient wasn’t septic.”

The alert correlates elevated white blood cell count with septic infection. It wouldn’t take into account that this particular patient had leukemia, which can cause similar blood counts. The algorithm, which was based on artificial intelligence, triggers the alert when it detects patterns that match previous patients with sepsis. The algorithm didn’t explain (p. A9) its decision.

Hospital rules require nurses to follow protocols when a patient is flagged for sepsis. While Beebe can override the AI model if she gets doctor approval, she said she faces disciplinary action if she’s wrong. So she followed orders and drew blood from the patient, even though that could expose him to infection and run up his bill. “When an algorithm says, ‘Your patient looks septic,’ I can’t know why. I just have to do it,” said Beebe, who is a representative of the California Nurses Association union at the hospital.

As she suspected, the algorithm was wrong. “I’m not demonizing technology,” she said. “But I feel moral distress when I know the right thing to do and I can’t do it.”
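
The article describes the alert's logic only in outline: it pattern-matches on inputs such as an elevated white blood cell count without conditioning on diagnoses, like leukemia, that produce similar counts. To see how that failure mode arises, here is a deliberately simplified, hypothetical screening rule of my own, loosely modeled on familiar SIRS-style criteria; it is not UC Davis's actual model, which the article says is not explained to nurses:

# Hypothetical, oversimplified sepsis flag for illustration only; the
# hospital's real AI model and thresholds are not disclosed in the article.

def sepsis_alert(wbc_per_uL, temperature_c, heart_rate_bpm):
    """Flag possible sepsis when at least two toy criteria are met."""
    criteria = [
        wbc_per_uL > 12_000 or wbc_per_uL < 4_000,     # abnormal white count
        temperature_c > 38.0 or temperature_c < 36.0,  # fever or hypothermia
        heart_rate_bpm > 90,                           # tachycardia
    ]
    return sum(criteria) >= 2

# A leukemia patient can have a very high white count and an elevated heart
# rate for reasons unrelated to infection, so the rule fires anyway:
print(sepsis_alert(wbc_per_uL=40_000, temperature_c=37.0, heart_rate_bpm=95))  # True

A rule like this sees the pattern but not the diagnosis that explains it, which is exactly the gap Beebe's clinical judgment was filling.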

. . .

In a survey of 1,042 registered nurses published this month by National Nurses United, a union, 24% of respondents said they had been prompted by a clinical algorithm to make choices they believed “were not in the best interest of patients based on their clinical judgment and scope of practice” about issues such as patient care and staffing. Of those, 17% said they were permitted to override the decision, while 31% weren’t allowed and 34% said they needed doctor or supervisor’s permission.

. . .

Jeff Breslin, a registered nurse at Sparrow Hospital in Lansing, Mich., has been working at the Level 1 trauma center since 1995. He helps train new nurses and students on what signs to look for to assess and treat a critically ill or severely injured patient quickly.

“You get to a point in the profession where you can walk into a patient’s room, look at them and know this patient is in trouble,” he said. While their vital signs might be normal, “there are thousands of things we need to take into account,” he said. “Does he exhibit signs of confusion, difficulty breathing, a feeling of impending doom, or that something isn’t right?”

. . .

Nurses often describe their ability to sense a patient’s deterioration in emotional terms. “Nurses call it a ‘hunch,’ ” said Cato, the University of Pennsylvania professor who is also a data scientist and former nurse. “It’s something that causes them to increase surveillance of the patient.”

. . .

At UC Davis earlier this spring, Beebe, the oncology nurse, was treating a patient suffering from a bone cancer called myeloid leukemia. The condition fills the bones with cancer cells, “they’re almost swelling with cancer,” she said, causing excruciating pain. Seeing the patient wince, Beebe called his doctor to lobby for a stronger, longer-lasting pain killer. He agreed and prescribed one, which was scheduled to begin five hours later.

To bridge the gap, Beebe wanted to give the patient oxycodone. “I tell them, ‘Anytime you’re in pain, don’t keep quiet. I want to know.’ There’s a trust that builds,” she said.

When she started in oncology, nurses could give patients pain medication at their discretion, based on patient symptoms, within a doctor’s parameters. They gave up authority when the hospital changed its policies and adopted a tool that automated medication administration with bar-code scanners a few years ago.

In its statement, UC Davis said the medication tool exists as a second-check to help prevent human error. “Any nurse who doesn’t believe they are acting in the patient’s best interests…has an ethical and professional obligation to escalate those concerns immediately,” the hospital said.

Before giving the oxycodone, Beebe scanned the bar code. The system denied permission, adhering to the doctor’s earlier instructions to begin the longer-acting pain meds five hours later. “The computer doesn’t know the patient is in out-of-control pain,” she said.

Still, she didn’t act. “I know if I give the medication, I’m technically giving medication without an order and I can be disciplined,” she said. She watched her patient grimace in pain while she held the pain pill in her hand.

For the full story, see:

Lisa Bannon. “Nurses Clash With AI Over Patient Care.” The Wall Street Journal (Friday, June 16, 2023): A1 & A9.

(Note: ellipses added.)

(Note: the online version of the story has the date June 15, 2023, and has the title “When AI Overrules the Nurses Caring for You.”)

Did Theranos Fail Because It Had a Flat Structure or Because It Had a Hierarchy with Holmes at the Top (Or Simply Because It Attempted Something Very Hard)?

André Spicer, citing Elizabeth Holmes’s own description of the company, infers that Theranos failed due to its flat structure. But weren’t there some employees, such as George Shultz’s grandson, whose efforts to identify the problems that led to failure and fraud were suppressed by Elizabeth Holmes? If so, can’t you say that the failure was due to Holmes’s power at the top of the firm, that is, to a kind of hierarchy rather than to flatness? (I remain unclear and conflicted on whether and when flatness or hierarchy is better.)

(p. B2) . . . do flat structures work? André Spicer, a professor of organizational behavior at the Bayes Business School in London, said that, while the “cultural zeitgeist when I was growing up was that hierarchies are bad,” there’s been an increasing recognition of both the need for them and the fact that they often reappear in businesses that, at least theoretically, reject them.

. . .

Mr. Spicer is particularly critical of start-ups that have attempted, or claimed to attempt, flat structures, suggesting that failures — and at least one major scandal — have emerged from these workplaces. He pointed to Elizabeth Holmes and Theranos, her health care technology start-up. In a 2015 interview, Ms. Holmes said that Theranos was “a very flat organization and if I have learned anything, we are only as good as the worst people on our team.”

“The claim that companies like Theranos had a flat structure meant the company fitted into a well-recognized type of agile tech firms,” Mr. Spicer said. In addition to attracting investors and employees, the myth “meant that these companies don’t have to do the difficult and tedious process of putting into place all the systems and controls you would normally find.”

He added that he believed those systems “would have likely stopped much of the wrongdoing.” Ms. Holmes and Ramesh Balwani, the former chief operating officer of Theranos, were each recently sentenced to prison time for defrauding investors and patients.

The notion that start-ups in particular are ill suited to a flat structure was supported in a 2021 study by Professor Lee of Wharton. A flat structure “can result in haphazard execution and commercial failure by overwhelming managers with the burden of direction and causing subordinates to drift into power struggles and aimless idea explorations,” he wrote.

For the full story, see:

Charlie Brinkhurst-Cuff. “‘Flat’ Company Structures Sound Appealing. But Do They Work?” The New York Times (Wednesday, July 5, 2023): B2.

(Note: ellipses added.)

(Note: the online version of the story has the same date as the print version, and has the title “In Business, ‘Flat’ Structures Rarely Work. Is There a Solution?” Where there are minor differences in wording between the versions, the passages quoted above follow the online version.)

The academic paper by Lee mentioned in the passage quoted above is:

Lee, Saerom. “The Myth of the Flat Start-Up: Reconsidering the Organizational Structure of Start-Ups.” Strategic Management Journal 43, no. 1 (Jan. 2022): 58-92.

For Musk “Hard Core” Means “Long Hours at High Intensity”

(p. A24) Have you ever gotten an email at midnight from the boss with ​an ominous subject line like “a fork in the road”? Granted, email etiquette today says we’re not supposed to get midnight emails from bosses at all. But Elon Musk is no ordinary boss, and it’s safe to assume he didn’t get the memo on empathetic leadership. So, true to form, as chief executive of Twitter, after laying off nearly half of his staff, bringing a sink to work and proclaiming he would be sleeping at the office “until the org is fixed,” Mr. Musk recently issued this late-night ultimatum to his remaining employees: From this point forward, Twitter was going to be “extremely hard core.” Were they ready to be hard core? They could select “yes” — or opt for three months of severance pay.

To Mr. Musk, “hard core” meant “long hours at high intensity,” a workplace where only the most “exceptional performance” would be accepted and a culture in which midnight emails would be just fine. I’d wager that more than a few workaholics, bosses or otherwise, weren’t entirely turned off by the philosophy behind that statement, and yet it immediately conjured images of sweaty Wall Street bankers collapsing at their desks, Silicon Valley wunderkinds sleeping under theirs and the high-intensity, bro-boss cultures of companies like Uber and WeWork, with their accompanying slogans about doing what you love and sleeping when you’re dead.

For the full commentary, see:

Jessica Bennett. “Elon, the Mosh Pit Called. It Wants ‘Hard Core’ Back.” The New York Times (Friday, November 25, 2022): A24.

(Note: the online version of the commentary has the date Nov. 23, 2022, and has the title “The Worst Midnight Email From the Boss, Ever.”)

Some High Performers Find Ways to Avoid Accumulating Microstresses

(p. C5) Have you had days that exhaust you extraordinarily without any particular reason why?

. . .

There’s a common but little-understood reason for that exhaustion. We call it “microstress”—brief, frequent moments of everyday tension that accumulate and impede us even though we don’t register them.

. . .

One study published in the journal Biological Psychology in 2015 found that exposure to social stress within two hours of a meal leads your body to metabolize the food in a way that adds 104 calories on average. “If this happens daily, that’s 11 pounds gained per year,” noted Lisa Feldman Barrett, a psychology professor at Northeastern University and author of “Seven and a Half Lessons About the Brain.”
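
The “11 pounds gained per year” figure follows from simple arithmetic if one assumes the common rule of thumb of roughly 3,500 excess calories per pound of body weight; that conversion factor is my assumption, since the essay does not show the calculation:

# Checking the quoted arithmetic; the 3,500 kcal-per-pound figure is an
# assumed rule of thumb, not stated in the essay.
extra_calories_per_day = 104
calories_per_pound = 3_500

pounds_per_year = extra_calories_per_day * 365 / calories_per_pound
print(round(pounds_per_year, 1))  # about 10.8, i.e. roughly 11 pounds per year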

. . .

In our research, we observed that some of the high performers—a small subset that we came to call the “Ten Percenters”—were much better at coping with microstress than the rest of those we studied, and perhaps than the rest of us, too. What do they do differently?

. . .

. . ., they’re better at removing themselves from interactions that generate microstress in their lives, whether or not they realize the dynamic. Ten Percenters are more likely to shape these interactions by dealing with simmering disagreements head-on or by limiting such contacts.

. . .

Our Ten Percenters were also thoughtful about not creating the kinds of conditions that cause microstress for others. Think about what happens—to both of you—when you push your child too hard on their grades and it comes back in the form of a rebellious attitude. Or the stress you may create as a manager by unnecessarily shifting expectations. Stopping this cycle helps to prevent microstress from boomeranging back on us.

For the full essay, see:

Rob Cross and Karen Dillon. “Combating the ‘Microstress’ That Causes Burnout.” The Wall Street Journal (Saturday, April 22, 2023): C5.

(Note: ellipses added.)

(Note: the online version of the essay has the date April 21, 2023, and has the same title as the print version.)

The essay quoted above is adapted from Cross and Dillon’s book:

Cross, Rob, and Karen Dillon. The Microstress Effect: How Little Things Pile Up and Create Big Problems—and What to Do About It. Boston, MA: Harvard Business Review Press, 2023.

Tim Cook’s Apple Is Silent on Communist China’s Suppression of Human Rights

(p. A19) Apple CEO Tim Cook has been taking a beating over his company’s coziness with Beijing. It comes amid protests across China against the government’s strict Covid-19 lockdowns, including at a factory in Zhengzhou where most of the world’s iPhones are made. Hillary Vaughn of Fox News perfectly captured Mr. Cook’s embarrassment on Capitol Hill Thursday [Dec. 1, 2022] when she peppered him with questions:

“Do you support the Chinese people’s right to protest? Do you have any reaction to the factory workers that were beaten and detained for protesting Covid lockdowns? Do you regret restricting AirDrop access that protesters used to evade surveillance from the Chinese government? Do you think it’s problematic to do business with the Communist Chinese Party when they suppress human rights?”

A stone-faced Mr. Cook responded with silence.

. . .

CEOs can always justify their operations by pointing to the economic benefits their companies bring to the communities in which they operate. Or CEOs can go the progressive route, presenting their companies as moral paragons. But they can’t have it both ways: holding themselves up as courageous in places where the risk from speaking out is low while keeping quiet about real oppression in places where speaking out can really hurt the bottom line.

For the full commentary, see:

William McGurn. “MAIN STREET; Tim Cook’s Bad Day on China.” The Wall Street Journal (Tuesday, Dec. 6, 2022): A19.

(Note: ellipsis, and bracketed date, added.)

(Note: the online version of the commentary has the date December 5, 2022, and has the same title as the print version.)

The Role Disney “Fans Play in Creating the Disney Magic”

(p. 10) On Nov. 20, [2022], I was relieved to hear the news that Disney’s chief executive, Bob Chapek, had been fired and replaced with the former chief executive Robert Iger. The news was also met with near-unanimous celebration among my community of super fans.

While his ouster shocked investors and Hollywood, many in our community had been actively campaigning for Mr. Chapek’s firing for the past two years. A Change.org petition to fire Mr. Chapek that started in 2020 garnered over 117,000 signatures. (It now reads “Victory.”) Online forums teemed with complaints about Mr. Chapek’s management style and strategy.

. . .

We also pushed to have Mr. Chapek fired because he didn’t believe in Disney magic. Disney is so much more than just another big business. Understanding that is crucial to its success.

When Walt Disney opened Disneyland, he referred to his theme park customers as “guests,” an understanding that is explicitly reinforced in Disney employee training to this day, and by which Disney’s theme park community refers to itself.

. . .

What Mr. Chapek doesn’t understand is the role we fans play in creating the Disney magic. It is our Instagram accounts, our blogs and our websites that those out-of-towners refer to in order to prepare for that revenue-generating Disneyland trip. I get paid to do it, but many others do this work just because they love it. Mr. Chapek disregarded us.

Worse was the way Mr. Chapek treated “cast members,” as Disney’s park employees are known. The people who greet you at the park entrance, serve you food and get you safely on and off the rides have an enormous influence on the quality of your visit. I’ve talked to many cast members, from young people to older adults, about why they’re willing to wear polyester costumes in Florida’s summer heat for relatively low wages. To a person, they say something like, “I want to make people happy, and Disney is the best place to do that.”

So it was disheartening when, in September 2020, Mr. Chapek announced that the company was laying off 28,000 workers, most of them cast members. While many other businesses were laying off workers during that time, Mr. Chapek was also committing Disney to spending billions to ramp up content production for its Disney+ streaming service. As we saw it, Mr. Chapek viewed the incomes and health care of thousands of people — the people who make the magic — as less important than another season of “The Mandalorian.” Many cast members decided not to return to Disney’s parks when they reopened.

For the full commentary, see:

Len Testa. “Bob Chapek Didn’t Believe in Disney Magic.” The New York Times, Sunday Opinion Section (Sunday, December 4, 2022): 10.

(Note: ellipses, and bracketed year, added.)

(Note: the online version of the commentary has the date Nov. 29, 2022, and has the same title as the print version. Where there is a slight difference in wording between versions, the passages quoted above follow the online version.)