Naps Aid Immunity, Energy, Alertness, Memory, and Mood

(p. D4) Sara E. Alger, a sleep scientist at the Walter Reed Army Institute of Research in Silver Spring, Md., has been a public advocate for naps, particularly in the workplace, except in cases of insomnia. Along the way, she has had to fight anti-nap prejudice.

“Naps in general have a stigma attached to them as something you only do when you’re lazy or when you’re sick,” Dr. Alger said.

Wrapped inside nap phobia in the United States is often a message reminding us to be productive during what we now think of as normal working hours, although that concept is relatively new.

Modern attitudes about napping go back to the Industrial Revolution, according to Matthew J. Wolf-Meyer, an anthropologist at Binghamton University in New York and the author of “The Slumbering Masses: Sleep, Medicine, and Modern American Life.”

“For a long time, people had flexible sleep schedules,” Dr. Wolf-Meyer said. Farmers and tradespeople had some autonomy over their time. They could choose to rest in the hottest part of the day, and might take up simple tasks during a wakeful period in the middle of the night, between two distinct bouts of sleep.

As the 1800s went on, more and more Americans worked in factories on set shifts that were supervised by a foreman. “They work for a total stranger, and a nap becomes totally nonnegotiable,” he said.

Staying awake all day and getting one’s sleep in a single long stretch at night came to be seen as normal. With that came a strong societal expectation that we ought to use our daylight hours productively.

. . .

Although there are no hard data so far on whether naps have been on the rise during 2020, sleep scientists like Dr. Alger think it’s likely. The many people who now work remotely no longer need to worry about the disapproving eyes of their colleagues if they want a brief, discreet period of horizontality in the afternoons.

If most offices reopen next year, as now seems possible, perhaps greater tolerance toward the adult nap will be one of the things salvaged from the smoking wreckage of the working-from-home era. (In a tweet last week, Dr. Wolf-Meyer called the pandemic “the largest (accidental) experiment with human #sleep ever conducted.”) . . .

Experts say that people who get seven to nine hours of sleep a day are less prone to catching infectious diseases, and better at fighting off any they do catch. Afternoon sleep counts toward your daily total, according to Dr. Alger.

This immunity boost, she said, is in addition to other well-known dividends of a good nap, like added energy, increased alertness, improved mood and better emotional regulation.

Included under the last rubric is a skill that seems especially useful for dealing with families, even if you never get closer to your relatives this year than a “Hollywood Squares”-style video grid: “Napping helps you be more sensitive to receiving other people’s moods,” Dr. Alger said. “So you’re not perceiving other people as being more negative than they are.”

Napping also helps you remember facts you learned right before nodding off. Given the way things have been going lately, of course, you may not see this as a plus. You could look at it from the reverse angle, though: Every hour before Jan. 1 that you spend napping is another hour of 2020 you won’t remember.

For the full commentary, see:

Pete Wells. “This Thanksgiving, Nap Without Guilt.” The New York Times (Wednesday, November 25, 2020): D1 & D4.

(Note: ellipses added.)

(Note: the online version of the commentary has the date Nov. 24, 2020, and has the title “This Thanksgiving, It’s Time to Stop Nap-Shaming.”)

The book by Wolf-Meyer, mentioned above, is:

Wolf-Meyer, Matthew J. The Slumbering Masses: Sleep, Medicine, and Modern American Life. Minneapolis, MN: University of Minnesota Press, 2012.

New York City’s Resilient Dynamism

(p. C10) Do you worry that New York won’t fully return to what it was before the pandemic?

LEBOWITZ I have lived in New York long enough to know that it will not stay the way it is now. There is not a square foot of New York City, a square foot, that’s the same as it was when I came here in 1970. That’s what a city is, even without a plague. But I’d like to point out, there were many things wrong with it before. After the big protests in SoHo, I saw a reporter interviewing a woman who was a manager of one of the fancy stores there. The reporter said to her, “What are you going to do?” And she said, “There’s nothing we can do until the tourists come back.” I yelled at the TV and I said, “Really? You can’t think what to do with SoHo without tourists? I can! Let me give you some ideas.” Because I remember it without tourists. How about, artists could live there? How about, let’s not have rent that’s $190,000 a month? How about that? Let’s try that.

For the full interview, see:

Dave Itzkoff, interviewer. “More of Her Metropolitan Life.” The New York Times (Friday, January 8, 2021): C1 & C10.

(Note: the online version of the interview has the date Jan. 7, 2021, and has the title “Fran Lebowitz and Martin Scorsese Seek a Missing New York in ‘Pretend It’s a City’.” In the online and print versions the question by Itzkoff, and Lebowitz’s name before her answer, were in bold.)

Abramson “Was Too Busy Surfing” to Patent Wireless Networking

I argue that patents enable funding for poor inventors or inventors who aspire to big expensive breakthroughs. If you have independent means (like a professorship in Hawaii) and mainly aspire to surf, you can afford to ignore patents.

(p. B11) Professor Abramson has been called the father of wireless networking. But it was a shared paternity. The project included graduate students and several faculty members, notably Frank Kuo, a former Bell Labs scientist who came to the University of Hawaii in 1966, the same year Professor Abramson arrived.

His deepest expertise was in communication theory, the subject of his Ph.D. thesis at Stanford University. The fundamental design ideas behind ALOHAnet were his. In a 2018 oral history interview for the Computer History Museum, Professor Kuo recalled, “Norm was the theory and I was the implementer, and so we worked together pretty well.”

. . .

That the ALOHAnet technology became so widely used was partly because Professor Abramson and his team had shared it freely and welcomed other scientists to Hawaii.

“We had done no patenting, and ALOHA was published in scientific papers,” putting their work in the public domain, Professor Abramson said in the oral history, adding: “And that was fine with me. I was too busy surfing to worry about that sort of thing.”

. . .

Some of the data-networking techniques developed by Professor Abramson and his Hawaii team proved valuable not only in wireless communications but also in wired networks. One heir to his work was Robert Metcalfe, who in 1973 was a young computer scientist working at Xerox PARC, a Silicon Valley research laboratory that had become a fount of personal computer innovations.

Mr. Metcalfe was working on how to enable personal computers to share data over wired office networks. He had read a 1970 paper, written by Professor Abramson, describing ALOHAnet’s method for transmitting and resending data over a network.
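The core idea of the 1970 paper Metcalfe read — stations transmit whenever they have data and retransmit after collisions — is easiest to see in the slotted variant of ALOHA. Below is a minimal, hypothetical Python sketch (the station count and per-slot transmit probability are illustrative choices, not figures from Abramson’s paper) that estimates slotted-ALOHA throughput by counting the slots in which exactly one station transmits:

```python
import random

def simulate_slotted_aloha(n_stations=50, p_transmit=0.02,
                           n_slots=100_000, seed=0):
    """Estimate slotted-ALOHA throughput.

    Each station independently transmits in a slot with probability
    p_transmit; a slot carries a successful frame only if exactly one
    station transmits (two or more means a collision, and the colliding
    stations simply try again in later slots).
    """
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_slots):
        transmitters = sum(1 for _ in range(n_stations)
                           if rng.random() < p_transmit)
        if transmitters == 1:
            successes += 1
    return successes / n_slots

throughput = simulate_slotted_aloha()
# Theory: with offered load G = n * p = 1 attempt per slot, throughput
# peaks near 1/e (about 0.368) -- here n*p*(1-p)^(n-1) is roughly 0.37.
print(f"estimated throughput ~= {throughput:.3f}")
```

The low ceiling (at most about 1/e of slots carry a successful frame) is one reason later wired designs such as Metcalfe’s Ethernet added refinements like sensing the channel before transmitting.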

“Norm kindly invited me to spend a month with him at the University of Hawaii to study ALOHAnet,” Mr. Metcalfe recalled in an email.

Mr. Metcalfe and his colleagues at Xerox PARC adopted and tweaked the ALOHAnet technology in creating Ethernet office networking. Later, Mr. Metcalfe founded an Ethernet company, 3Com, which thrived as the personal computer industry grew.

“Norm, thank you,” Mr. Metcalfe concluded in his email. “Aloha!”

For the full obituary, see:

Steve Lohr. “Norman Abramson, a Pioneer Behind Wireless Networking, Is Dead at 88.” The New York Times (Saturday, December 12, 2020): B11.

(Note: ellipses added.)

(Note: the online version of the obituary has the date Dec. 11, 2020, and has the title “Norman Abramson, Pioneer Behind Wireless Networks, Dies at 88.”)

Early Animation “Followed Only One Rule”: “Anything Goes”

(p. C5) The story of Disney Studios is a central strand in Mitenbuler’s narrative; Disney became the formidable force that the other animation studios would look toward, compete with and rail against. Max Fleischer, whose studio was responsible for the likes of Popeye and Betty Boop, groused that Disney’s “Snow White,” released in 1937, was “too arty.”  . . .  The wife of one of the Fleischer brothers, though, said they had better watch out: “Disney is doing art, and you guys are still slapping characters on the butt with sticks!”

But what if those slapped butts were part of what had made animation so revolutionary in the first place? Mitenbuler suggests as much, beginning “Wild Minds” with the early days of animation, in the first decades of the 20th century, when the technology of moving pictures was still in its infancy. Like the movie business in general, the field of animation contained few barriers to entry, and a number of Jewish immigrants shut out from other careers found they could make a decent living working for a studio or opening up their own. Even Disney, who grew up in the Midwest, was an outsider without any connections.

The work created in those early decades was often gleefully contemptuous of anything that aspired to good taste. Until the movie studios started self-censoring in the early ’30s, in a bid to avoid government regulation, animators typically followed only one rule to the letter: Anything goes.

For the full review, see:

Jennifer Szalai. “BOOKS OF THE TIMES: Ehh, What’s Animation, Doc?” The New York Times (Thursday, December 17, 2020): C5.

(Note: ellipsis added.)

(Note: the online version of the review has the date Dec. 16, 2020, and has the title “BOOKS OF THE TIMES: ‘Fantasia,’ ‘Snow White,’ Betty Boop, Popeye and the First Golden Age of Animation.”)

The book under review is:

Mitenbuler, Reid. Wild Minds: The Artists and Rivalries That Inspired the Golden Age of Animation. New York: Atlantic Monthly Press, 2020.

Communist Dictator Kim Jong-un Is “Really Sorry” for North Koreans’ Economic Suffering

(p. A10) “Our five-year economic development plan has fallen greatly short of its goals in almost all sectors,” Mr. Kim said in his opening speech to the ruling Workers’ Party’s eighth congress that began in Pyongyang, the capital, on Tuesday [Jan. 5, 2021].

. . .

. . . he had little to show on the economic front. He apologized to his people for failing to live up to their expectations. “I am really sorry for that,” he said, appearing to fight back tears. “My efforts and sincerity have not been sufficient enough to rid our people of the difficulties in their life.”

For the full story, see:

Choe Sang-Hun. “Kim Admits That He’s Failed In His 5-Year-Plan to Rebuild North Korea’s Feeble Economy.” The New York Times (Thursday, January 7, 2021): A10.

(Note: ellipses, and bracketed date, added.)

(Note: the online version of the story was updated Jan. 15, 2021, and has the title “North Korea Party Congress Opens With Kim Jong-un Admitting Failures.”)

Wheaton Economist Seth Norton Reviews Openness to Creative Destruction

Seth Norton wrote a thorough, gracious, and enthusiastic review of my book Openness to Creative Destruction: Sustaining…

Posted by Arthur Diamond on Sunday, February 7, 2021

My book is:

Diamond, Arthur M., Jr. Openness to Creative Destruction: Sustaining Innovative Dynamism. New York: Oxford University Press, 2019.

“Increasing Minimum Wages Can Cause Some Job Loss”

(p. B6) Increasing the minimum wage could lead employers to lay off some workers in order to pay others more, said David Neumark, an economics professor at the University of California, Irvine.

“There’s a ton of research that says increasing minimum wages can cause some job loss,” he said. “Plenty of workers are helped, but some are hurt.”

A 2019 Congressional Budget Office study found that a $15 federal minimum wage would increase pay for 17 million workers who earned less than that and potentially another 10 million workers who earned slightly more. According to the study’s median estimate, it would cause 1.3 million other workers to lose their jobs.

For the full story, see:

Gillian Friedman. “Base Wage Of $15 Gains In Popularity Across U.S.” The New York Times (Friday, January 1, 2021): B1 & B6.

(Note: the online version of the story has the date Dec. 31, 2020, and has the title “Once a Fringe Idea, the $15 Minimum Wage Is Making Big Gains.”)

“Hillbilly Elegy” Book (but Not the Movie) Suggests a “Culture of Poverty”

(p. C3) “Hillbilly Elegy,” published in June of 2016, attracted an extra measure of attention (and controversy) after Donald Trump’s election. It seemed to offer a firsthand report, both personal and analytical, on the condition of the white American working class.

And while the book didn’t really explain the election — Vance is reticent about his family’s voting habits and ideological tendencies — it did venture a hypothesis about how that family and others like it encountered such persistent household dysfunction and economic distress. His answer wasn’t political or economic, but cultural.

He suggests that the same traits that make his people distinctive — suspicion of outsiders, resistance to authority, devotion to kin, eagerness to fight — make it hard for them to thrive in modern American society. Essentially, “Hillbilly Elegy” updates the old “culture of poverty” thesis associated with the anthropologist Oscar Lewis’s research on Mexican peasants (and later with Daniel Patrick Moynihan’s ideas about Black Americans) and applies it to disadvantaged white communities.

Howard and Taylor mostly sidestep this argument, which has been widely criticized. They focus on the characters and their predicaments, and on themes that are likely to be familiar and accessible to a broad range of viewers. The film is a chronicle of addiction entwined with a bootstrapper’s tale — Bev’s story and J.D.’s, with Mamaw as the link between them.

But it sacrifices the intimacy, and the specificity, of those stories by pretending to link them to something bigger without providing a coherent sense of what that something might be. The Vances are presented as a representative family, but what exactly do they represent? A class? A culture? A place? A history? The louder they yell, the less you understand — about them or the world they inhabit.

For the full movie review, see:

A.O. Scott. “I Remember Bev and Mamaw.” The New York Times (Friday, November 27, 2020): C3.

(Note: the online version of the review has the date Nov. 23, 2020, and has the title “‘Hillbilly Elegy’ Review: I Remember Mamaw.”)

J.D. Vance’s book is:

Vance, J. D. Hillbilly Elegy: A Memoir of a Family and Culture in Crisis. New York: HarperCollins Publishers, 2016.

Communist Dictatorship Was Not Inevitable in Russia in 1917

(p. 14) A professor at Bard College, McMeekin argues that one of the seminal events of modern history was largely a matter of chance. Well-written, with new details from archival research used for vivid descriptions of key events, “The Russian Revolution” comes nearly three decades after Richard Pipes’s masterpiece of the same name.

. . .

Far from the hopeless backwater depicted in most histories, McMeekin argues, Russia’s economy was surging before the war, with a growth rate of 10 percent a year — like China in the early 21st century. “The salient fact about Russia in 1917,” he writes, “is that it was a country at war,” yet he adds that the Russian military acquitted itself well on the battlefield after terrible setbacks in 1915, with morale high in early 1917. “Knowing how the story of the czars turns out, many historians have suggested that the Russian colossus must always have had feet of clay,” he writes. “But surely this is hindsight. Despite growing pains, uneven economic development and stirrings of revolutionary fervor, imperial Russia in 1900 was a going concern, its very size and power a source of pride to most if not all of the czar’s subjects.”

Nicholas II — rightly characterized as an incompetent reactionary in most histories — is partly rehabilitated here. His fundamental mistake, McMeekin says, was to trust his liberal advisers, who urged him to go to war, then conspired to remove him from power after protests over bread rations led to a military mutiny. Even the royal family’s trusted faith healer Rasputin, the ogre of conventional wisdom, largely gets a pass for sagely advising the czar that war would prompt his downfall.

Although McMeekin agrees the real villains are the ruthless Bolsheviks, he reserves most criticism for the hapless liberals.

. . .

Having taken power, the Bolsheviks turned on the unwitting soldiers and peasants who were among their most fervent supporters, unleashing a violent terror campaign that appropriated land and grain, and that turned into a permanent class war targeting ever-larger categories of “enemies of the people.” Unconcerned about Russia’s ultimate fate, they were pursuing their greater goal of world revolution.

For the full review, see:

Gregory Feifer. “The Best-Laid Plans.” The New York Times Book Review (Sunday, June 11, 2017): 14-15.

(Note: ellipses added.)

(Note: the online version of the review has the date June 6, 2017, and has the title “A New History Recalibrates the Villains of the Russian Revolution.”)

The book under review is:

McMeekin, Sean. The Russian Revolution: A New History. New York: Basic Books, 2017.

“Celebrities Have Access to Better Care Than Ordinary People”

As the passages quoted below suggest, Trump’s friends may have had access to drugs that not everyone had access to. But it also should be acknowledged that Trump was pushing for Covid-19 drugs to be available sooner and with fewer restrictions.

(p. A25) Both the Regeneron and Eli Lilly therapies are meant for people who are at risk of getting sick enough with Covid to be hospitalized, not those who are hospitalized already. The emergency use authorization for the Regeneron treatment specifically says that it is “not authorized” for “adults or pediatric patients who are hospitalized due to Covid-19.”

A physician with experience administering the new monoclonal antibodies, who didn’t want to use his name because he’s not authorized by his hospital to speak publicly, said giving them to Giuliani “appears to be an inappropriate use outside the guidelines of the E.U.A. for a very scarce resource.” Very scarce indeed: According to the Department of Health and Human Services, as of Wednesday the entire country had about 77,000 total doses of the Regeneron cocktail and almost 260,000 doses of Eli Lilly’s monoclonal antibody treatment. That’s less than you’d need to treat everyone who’d tested positive in just the previous two days.
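The scarcity claim in the paragraph above is a back-of-envelope calculation, sketched below. The dose figures come from the quoted passage; the daily-case figure is an assumption added here for illustration (U.S. new positives were running on the order of 200,000-plus per day in early December 2020):

```python
# Dose totals quoted from the Department of Health and Human Services figures
# in the passage above.
regeneron_doses = 77_000
lilly_doses = 260_000
total_doses = regeneron_doses + lilly_doses        # 337,000 doses nationwide

# Assumed daily-positive count (illustrative, not from the article).
assumed_daily_positives = 220_000
two_day_positives = 2 * assumed_daily_positives    # 440,000 in two days

# Under this assumption, total supply falls short of two days of new cases.
print(total_doses, two_day_positives, total_doses < two_day_positives)
```

Under any plausible early-December 2020 case count, the 337,000 available doses indeed cover less than two days of new positives, which is the rationing context for the V.I.P. access the commentary describes.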

Right now, the criteria for distributing these drugs can be murky. Robert Klitzman, co-founder of the Center for Bioethics at Columbia, said that the federal government allocates doses to states, states allocate them to hospitals and hospitals then decide which patients among those most at risk will get treated. Some states have developed guidelines for monoclonal antibody treatment, “but my understanding is that most states have not yet done that,” Klitzman said.

Hospitals try to come up with ethical triage frameworks, but Klitzman told me there are often workarounds for V.I.P.s. He said it helps to know someone on the hospital’s board. Such bodies typically include wealthy philanthropists. Often, he said, when these millionaires and billionaires ask hospital administrators for special treatment for a friend, “hospitals do it.”

Why? “Hospitals have huge financial problems, especially at the moment with Covid,” he said. They’ve had to shut down profitable elective surgeries and treat many people without insurance. More than ever, he said, they “need money that is given philanthropically from potential donors.”

In other words, Giuliani was right: Celebrities have access to better care than ordinary people. “When someone is in the public eye, or if someone is a potential donor, or has already been a donor to a hospital, then there’s folks in the hospital hierarchy, in the administration, who are keenly aware if they’re coming in, if they’re present, if they need something,” said Shoa Clarke, a cardiologist and professor at Stanford University School of Medicine. Covid, which is leading to rationing of medical resources, only magnifies this longstanding inequality.

For the full commentary, see:

Michelle Goldberg. “Why Trump Cronies Get Covid Meds.” The New York Times (Saturday, December 12, 2020): A25.

(Note: the online version of the commentary has the date Dec. 10, 2020, and has the title “Covid Meds Are Scarce, but Not for Trump Cronies.” The passage quoted above includes several sentences, and a couple of words, that appear in the online, but not in the print, version of the commentary.)

Jobs Told Benioff to Build an “Application Ecosystem”

(p. B1) I first met Steve Jobs in 1984 when Apple Inc. hired me as a summer intern.

. . .

Even once my internship ended, we stayed in touch, and as my career progressed he became a mentor of sorts. Which is why, one memorable day in 2003, I found myself pacing anxiously in the reception area of Apple’s headquarters.

. . .

(p. B2) As Steve’s staff ushered me into Apple’s boardroom that day, I felt a rush of excitement coursing through my jangling nerves.

. . .

“Marc,” he said. “If you want to be a great CEO, be mindful and project the future.”

I nodded, perhaps a bit disappointed. He’d given me similar advice before, but he wasn’t finished.

Steve then told me we needed to land a big account, and to grow “10 times in 24 months or you’ll be dead.” I gulped. Then he said something less alarming, but more puzzling: We needed an “application ecosystem.”

. . .

One evening, over dinner in San Francisco, I was struck by an irresistibly simple idea. What if any developer from anywhere in the world could create their own application for the Salesforce platform? And what if we offered to store these apps in an online directory that allowed any Salesforce user to download them? I wouldn’t say this idea felt entirely comfortable. I’d grown up with the old view of innovation as something that should happen within the four walls of our offices. Opening our products to outside tinkering was akin to giving our intellectual property away. Yet, at that moment, I knew in my gut that if Salesforce was to become the new kind of company I wanted it to be, we would need to seek innovation everywhere.

. . .

Building an ecosystem is about acknowledging that the next game-changing innovation may come from a brilliant technologist and mentor based in Silicon Valley, or it may come from a novice programmer based halfway around the world. A company seeking to achieve true scale needs to seek innovation beyond its own four walls and tap into the entire universe of knowledge and creativity out there.

For the full commentary, see:

Marc Benioff. “What I Learned from Steve Jobs.” The Wall Street Journal (Saturday, October 12, 2019): B1-B2.

(Note: ellipses added.)

(Note: the online version of the commentary has the date October 11, 2019, and has the title “The Lesson I Learned from Steve Jobs.”)

Marc Benioff’s commentary is adapted from his co-authored book:

Benioff, Marc, and Monica Langley. Trailblazer: The Power of Business as the Greatest Platform for Change. New York: Currency, 2019.