Naps Aid Immunity, Energy, Alertness, Memory, and Mood

(p. D4) Sara E. Alger, a sleep scientist at the Walter Reed Army Institute of Research in Silver Spring, Md., has been a public advocate for naps, particularly in the workplace, except in cases of insomnia. Along the way, she has had to fight anti-nap prejudice.

“Naps in general have a stigma attached to them as something you only do when you’re lazy or when you’re sick,” Dr. Alger said.

Wrapped inside nap phobia in the United States is often a message reminding us to be productive during what we now think of as normal working hours, although that concept is relatively new.

Modern attitudes about napping go back to the Industrial Revolution, according to Matthew J. Wolf-Meyer, an anthropologist at Binghamton University in New York and the author of “The Slumbering Masses: Sleep, Medicine, and Modern American Life.”

“For a long time, people had flexible sleep schedules,” Dr. Wolf-Meyer said. Farmers and tradespeople had some autonomy over their time. They could choose to rest in the hottest part of the day, and might take up simple tasks during a wakeful period in the middle of the night, between two distinct bouts of sleep.

As the 1800s went on, more and more Americans worked in factories on set shifts that were supervised by a foreman. “They work for a total stranger, and a nap becomes totally nonnegotiable,” he said.

Staying awake all day and getting one’s sleep in a single long stretch at night came to be seen as normal. With that came a strong societal expectation that we ought to use our daylight hours productively.

. . .

Although there are no hard data so far on whether naps have been on the rise during 2020, sleep scientists like Dr. Alger think it’s likely. The many people who now work remotely no longer need to worry about the disapproving eyes of their colleagues if they want a brief, discreet period of horizontality in the afternoons.

If most offices reopen next year, as now seems possible, perhaps greater tolerance toward the adult nap will be one of the things salvaged from the smoking wreckage of the working-from-home era. (In a tweet last week, Dr. Wolf-Meyer called the pandemic “the largest (accidental) experiment with human #sleep ever conducted.”) . . .

Experts say that people who get seven to nine hours of sleep a day are less prone to catching infectious diseases, and better at fighting off any they do catch. Afternoon sleep counts toward your daily total, according to Dr. Alger.

This immunity boost, she said, is in addition to other well-known dividends of a good nap, like added energy, increased alertness, improved mood and better emotional regulation.

Included under the last rubric is a skill that seems especially useful for dealing with families, even if you never get closer to your relatives this year than a “Hollywood Squares”-style video grid: “Napping helps you be more sensitive to receiving other people’s moods,” Dr. Alger said. “So you’re not perceiving other people as being more negative than they are.”

Napping also helps you remember facts you learned right before nodding off. Given the way things have been going lately, of course, you may not see this as a plus. You could look at it from the reverse angle, though: Every hour before Jan. 1 that you spend napping is another hour of 2020 you won’t remember.

For the full commentary, see:

Pete Wells. “This Thanksgiving, Nap Without Guilt.” The New York Times (Wednesday, November 25, 2020): D1 & D4.

(Note: ellipses added.)

(Note: the online version of the commentary has the date Nov. 24, 2020, and has the title “This Thanksgiving, It’s Time to Stop Nap-Shaming.”)

The book by Wolf-Meyer, mentioned above, is:

Wolf-Meyer, Matthew J. The Slumbering Masses: Sleep, Medicine, and Modern American Life. Minneapolis, MN: University of Minnesota Press, 2012.

First Amendment Protection of Free Speech Blocks Government from Punishing False Statements

The commentary quoted below defines “deepfakes” as “apparently real images or videos that show people doing or saying things they never did or said.” To punish false statements, the government would first have to establish which statements are true and which are false. The Supreme Court has ruled that in doing so, the government would violate the freedom of speech protected by the First Amendment of the Constitution. Cass Sunstein, who wrote the commentary below, is a well-respected legal scholar who served as Administrator of the White House Office of Information and Regulatory Affairs in the Obama administration.

(p. C3) Can deepfakes, as such, be prohibited under American law? Almost certainly not. In U.S. v. Alvarez, decided in 2012, a badly divided Supreme Court held that the First Amendment prohibits the government from regulating speech simply because it is a lie. . . . The plurality opinion declared that “permitting the government to decree this speech to be a criminal offense…would endorse government authority to compile a list of subjects about which false statements are punishable. That governmental power has no clear limiting principle…. Were this law to be sustained, there could be an endless list of subjects the National Government or the States could single out.”

For the full commentary, see:

Cass R. Sunstein. “Can the Government Regulate Deepfakes?” The Wall Street Journal (Saturday, Jan. 9, 2021): C3.

(Note: the first ellipsis is added; the second and third are in the original.)

(Note: the online version of the commentary has the date January 7, 2021, and has the same title as the print version.)

Cass Sunstein’s commentary is adapted from his book:

Sunstein, Cass R. Liars: Falsehoods and Free Speech in an Age of Deception. New York: Oxford University Press, 2021.

An Octopus “Is a Being With Multiple Selves”

(p. 11) What makes this book shimmer and shine is Godfrey-Smith’s exploration of marine life (drawing on his vast and extensive diving knowledge and field experience) to illuminate the ways in which the animal mind works — and the thoughts and experiences that give it shape.

. . .

Godfrey-Smith has an elegant and exacting way of urging along our curiosity by sharing his own questions about animal cognizance and the ability of some animals, like rats and cuttlefish, to “meander, drift off and dream.” But perhaps the most enthralling part of this book is the author’s experiences diving at famous sites now affectionately called Octopolis and Octlantis, just off the coast of eastern Australia where several octopuses live, hunt, fight and make more octopuses.

It’s an experience that demands we consider the very real possibility that an octopus, an animal already regarded as one of the most complex in the animal kingdom, is a being with multiple selves. A breathtaking explanation follows, and it’s one that makes even a cephalopod fan like me swoon over the myriad possibilities for rethinking the mind as a sort of hidden realm for sentience.

Godfrey-Smith declares, “The world is fuller, more replete with experience than many people have countenanced,” . . .

For the full review, see:

Aimee Nezhukumatathil. “Deep Dive.” The New York Times Book Review (Sunday, December 27, 2020): 11.

(Note: ellipses added; italics in original.)

(Note: the online version of the review has the date Nov. 12 [sic], 2020, and has the title “Where Does Our Consciousness Overlap With an Octopus’s?”)

The book under review is:

Godfrey-Smith, Peter. Metazoa: Animal Life and the Birth of the Mind. New York: Farrar, Straus and Giroux, 2020.

Early Animation “Followed Only One Rule”: “Anything Goes”

(p. C5) The story of Disney Studios is a central strand in Mitenbuler’s narrative; Disney became the formidable force that the other animation studios would look toward, compete with and rail against. Max Fleischer, whose studio was responsible for the likes of Popeye and Betty Boop, groused that Disney’s “Snow White,” released in 1937, was “too arty.” . . . The wife of one of the Fleischer brothers, though, said they had better watch out: “Disney is doing art, and you guys are still slapping characters on the butt with sticks!”

But what if those slapped butts were part of what had made animation so revolutionary in the first place? Mitenbuler suggests as much, beginning “Wild Minds” with the early days of animation, in the first decades of the 20th century, when the technology of moving pictures was still in its infancy. Like the movie business in general, the field of animation contained few barriers to entry, and a number of Jewish immigrants shut out from other careers found they could make a decent living working for a studio or opening up their own. Even Disney, who grew up in the Midwest, was an outsider without any connections.

The work created in those early decades was often gleefully contemptuous of anything that aspired to good taste. Until the movie studios started self-censoring in the early ’30s, in a bid to avoid government regulation, animators typically followed only one rule to the letter: Anything goes.

For the full review, see:

Jennifer Szalai. “BOOKS OF THE TIMES: Ehh, What’s Animation, Doc?” The New York Times (Thursday, December 17, 2020): C5.

(Note: ellipsis added.)

(Note: the online version of the review has the date Dec. 16, 2020, and has the title “BOOKS OF THE TIMES: ‘Fantasia,’ ‘Snow White,’ Betty Boop, Popeye and the First Golden Age of Animation.”)

The book under review is:

Mitenbuler, Reid. Wild Minds: The Artists and Rivalries That Inspired the Golden Age of Animation. New York: Atlantic Monthly Press, 2020.

Wheaton Economist Seth Norton Reviews Openness to Creative Destruction

Seth Norton wrote a thorough, gracious, and enthusiastic review of my book Openness to Creative Destruction: Sustaining…

Posted by Arthur Diamond on Sunday, February 7, 2021

My book is:

Diamond, Arthur M., Jr. Openness to Creative Destruction: Sustaining Innovative Dynamism. New York: Oxford University Press, 2019.

“Hillbilly Elegy” Book (but Not the Movie) Suggests a “Culture of Poverty”

(p. C3) “Hillbilly Elegy,” published in June of 2016, attracted an extra measure of attention (and controversy) after Donald Trump’s election. It seemed to offer a firsthand report, both personal and analytical, on the condition of the white American working class.

And while the book didn’t really explain the election — Vance is reticent about his family’s voting habits and ideological tendencies — it did venture a hypothesis about how that family and others like it encountered such persistent household dysfunction and economic distress. His answer wasn’t political or economic, but cultural.

He suggests that the same traits that make his people distinctive — suspicion of outsiders, resistance to authority, devotion to kin, eagerness to fight — make it hard for them to thrive in modern American society. Essentially, “Hillbilly Elegy” updates the old “culture of poverty” thesis associated with the anthropologist Oscar Lewis’s research on Mexican peasants (and later with Daniel Patrick Moynihan’s ideas about Black Americans) and applies it to disadvantaged white communities.

Howard and Taylor mostly sidestep this argument, which has been widely criticized. They focus on the characters and their predicaments, and on themes that are likely to be familiar and accessible to a broad range of viewers. The film is a chronicle of addiction entwined with a bootstrapper’s tale — Bev’s story and J.D.’s, with Mamaw as the link between them.

But it sacrifices the intimacy, and the specificity, of those stories by pretending to link them to something bigger without providing a coherent sense of what that something might be. The Vances are presented as a representative family, but what exactly do they represent? A class? A culture? A place? A history? The louder they yell, the less you understand — about them or the world they inhabit.

For the full movie review, see:

A.O. Scott. “I Remember Bev and Mamaw.” The New York Times (Friday, November 27, 2020): C3.

(Note: the online version of the review has the date Nov. 23, 2020, and has the title “‘Hillbilly Elegy’ Review: I Remember Mamaw.”)

J.D. Vance’s book is:

Vance, J. D. Hillbilly Elegy: A Memoir of a Family and Culture in Crisis. New York: HarperCollins Publishers, 2016.

After 19 Rejections in Britain, Walsh Self-Published “Knowledge of Angels”

(p. B11) Jill Paton Walsh was greeted with acclaim in the 1960s when she began writing young-adult books that challenged her readers in both plotting and messaging.

. . .

But in 1994 Ms. Paton Walsh achieved a whole different level of acclaim, by an unlikely route, with a book for adults, “Knowledge of Angels,” a genre-defying medieval fable about an atheist and a girl raised by wolves. Here she delved into themes of faith and reason and more.

Yet despite her success with books for young readers, “Knowledge of Angels” struggled to assert itself: No one in her native England would publish it.

. . .

And so, in a move that was rare for the time, she published it herself — and had the last laugh. The book was shortlisted for the Booker Prize, one of the top literary awards in the world, and is said to be the first self-published book to make that elite list.

Peter Lewis of The Daily Mail had a crisp rebuke for all those publishers — 19 was the final count — who had said no to the book. “To open it and start reading,” he wrote, “is to be appalled by their lack of judgment.”

. . .

. . . when she shopped the ambitious “Knowledge of Angels,” there were no takers in her home country — though Houghton Mifflin had already published the book in the United States. The Guardian would describe it as “a compelling medieval fable centered on the conflict between belief and tolerance, and veined with a complex philosophical argument about the existence of God.”

. . ., Ms. Paton Walsh self-published the book in England, and though it did not win the Booker Prize, its nomination drew considerable attention.

After the nomination, Ms. Paton Walsh chided the British publishers, telling The Times, “They’re all afraid of their jobs, and they make their decisions by committee.”

For the full obituary, see:

Neil Genzlinger. “Jill Paton Walsh, 83, Author Who Scoffed at 19 Rejections.” The New York Times (Monday, November 23, 2020): D7.

(Note: ellipses added.)

(Note: the online version of the obituary was updated Nov. 19, 2020, and has the title “Jill Paton Walsh, Multigenerational Writer, Dies at 83.”)

A later edition of Walsh’s successful self-published book is:

Walsh, Jill Paton. Knowledge of Angels. Reprint pb ed. London: Transworld Publishers Ltd., 1998.

Communist Dictatorship Was Not Inevitable in Russia in 1917

(p. 14) A professor at Bard College, McMeekin argues that one of the seminal events of modern history was largely a matter of chance. Well-written, with new details from archival research used for vivid descriptions of key events, “The Russian Revolution” comes nearly three decades after Richard Pipes’s masterpiece of the same name.

. . .

Far from the hopeless backwater depicted in most histories, McMeekin argues, Russia’s economy was surging before the war, with a growth rate of 10 percent a year — like China in the early 21st century. “The salient fact about Russia in 1917,” he writes, “is that it was a country at war,” yet he adds that the Russian military acquitted itself well on the battlefield after terrible setbacks in 1915, with morale high in early 1917. “Knowing how the story of the czars turns out, many historians have suggested that the Russian colossus must always have had feet of clay,” he writes. “But surely this is hindsight. Despite growing pains, uneven economic development and stirrings of revolutionary fervor, imperial Russia in 1900 was a going concern, its very size and power a source of pride to most if not all of the czar’s subjects.”

Nicholas II — rightly characterized as an incompetent reactionary in most histories — is partly rehabilitated here. His fundamental mistake, McMeekin says, was to trust his liberal advisers, who urged him to go to war, then conspired to remove him from power after protests over bread rations led to a military mutiny. Even the royal family’s trusted faith healer Rasputin, the ogre of conventional wisdom, largely gets a pass for sagely advising the czar that war would prompt his downfall.

Although McMeekin agrees the real villains are the ruthless Bolsheviks, he reserves most criticism for the hapless liberals.

. . .

Having taken power, the Bolsheviks turned on the unwitting soldiers and peasants who were among their most fervent supporters, unleashing a violent terror campaign that appropriated land and grain, and that turned into a permanent class war targeting ever-larger categories of “enemies of the people.” Unconcerned about Russia’s ultimate fate, they were pursuing their greater goal of world revolution.

For the full review, see:

Gregory Feifer. “The Best-Laid Plans.” The New York Times Book Review (Sunday, June 11, 2017): 14-15.

(Note: ellipses added.)

(Note: the online version of the review has the date June 6, 2017, and has the title “A New History Recalibrates the Villains of the Russian Revolution.”)

The book under review is:

McMeekin, Sean. The Russian Revolution: A New History. New York: Basic Books, 2017.

Lenin, Not Stalin, Started “Severe Censorship” and “Terror Against Political Enemies”

(p. 15) With all the inevitable attention on the Bolshevik takeover in October 1917, when Lenin and Leon Trotsky seized power from the ill-fated provisional government, the extraordinary events of February and March should not be forgotten. It was then that unexpected riots over lack of food and fuel by thousands of people in the imperial capital of Petrograd and the ensuing mutiny by garrison troops compelled Czar Nicholas II to abdicate, ending 300 years of Romanov rule and handing political authority to a group of high-minded liberal figures. “Russia became the freest country in the world,” Merridale writes, “as the new government granted an amnesty for political prisoners, abolished the death penalty and dissolved what was left of the detested secret police.” (It also abolished the infamous Pale of Settlement, which had required the czar’s Jewish subjects to live within a defined area of the country; they were now made equal before the law.)

The provisional government inherited power from a discredited autocracy that had resisted any sensible move to establish a constitutional monarchy. Leaders like Alexander Kerensky, Paul Miliukov and Georgy Lvov tried in vain to establish a stable government and withstand the appeal of extreme forces. But the Romanov collapse was so sudden and so thorough that it left no credible institutions capable of governing effectively, let alone in the midst of widespread social turmoil, an imploding economy and the devastations of World War I.

. . .

. . . it was Lenin himself who made it clear that the Bolsheviks would reject democratic values. He “had not traveled back to join a coalition,” Merridale writes, but to undermine the provisional government and establish a dictatorship in the name of the proletariat. It was Lenin who instituted severe censorship, established one-party rule and resorted to terror against his political enemies. Stalin took these measures to further extremes for his own sinister purposes. Merridale is right to recall Winston Churchill’s famous observation about Lenin’s return. The Germans, Churchill wrote, “turned upon Russia the most grisly of all weapons. They transported Lenin in a sealed truck like a plague bacillus from Switzerland to Russia.”

For the full review, see:

Joshua Rubenstein. “Fast-Tracking the Revolution.” The New York Times Book Review (Sunday, June 11, 2017): 15.

(Note: ellipses added.)

(Note: the online version of the review has the date June 9, 2017, and has the title “Lenin’s Return From Exile Put Russia on the Fast Track to Revolution.”)

The book under review is:

Merridale, Catherine. Lenin on the Train. New York: Metropolitan Books, 2017.

Jobs Told Benioff to Build an “Application Ecosystem”

(p. B1) I first met Steve Jobs in 1984 when Apple Inc. hired me as a summer intern.

. . .

Even once my internship ended, we stayed in touch, and as my career progressed he became a mentor of sorts. Which is why, one memorable day in 2003, I found myself pacing anxiously in the reception area of Apple’s headquarters.

. . .

(p. B2) As Steve’s staff ushered me into Apple’s boardroom that day, I felt a rush of excitement coursing through my jangling nerves.

. . .

“Marc,” he said. “If you want to be a great CEO, be mindful and project the future.”

I nodded, perhaps a bit disappointed. He’d given me similar advice before, but he wasn’t finished.

Steve then told me we needed to land a big account, and to grow “10 times in 24 months or you’ll be dead.” I gulped. Then he said something less alarming, but more puzzling: We needed an “application ecosystem.”

. . .

One evening, over dinner in San Francisco, I was struck by an irresistibly simple idea. What if any developer from anywhere in the world could create their own application for the Salesforce platform? And what if we offered to store these apps in an online directory that allowed any Salesforce user to download them? I wouldn’t say this idea felt entirely comfortable. I’d grown up with the old view of innovation as something that should happen within the four walls of our offices. Opening our products to outside tinkering was akin to giving our intellectual property away. Yet, at that moment, I knew in my gut that if Salesforce was to become the new kind of company I wanted it to be, we would need to seek innovation everywhere.

. . .

Building an ecosystem is about acknowledging that the next game-changing innovation may come from a brilliant technologist and mentor based in Silicon Valley, or it may come from a novice programmer based halfway around the world. A company seeking to achieve true scale needs to seek innovation beyond its own four walls and tap into the entire universe of knowledge and creativity out there.

For the full commentary, see:

Marc Benioff. “What I Learned from Steve Jobs.” The Wall Street Journal (Saturday, October 12, 2019): B1-B2.

(Note: ellipses added.)

(Note: the online version of the commentary has the date October 11, 2019, and has the title “The Lesson I Learned from Steve Jobs.”)

Marc Benioff’s commentary is adapted from his co-authored book:

Benioff, Marc, and Monica Langley. Trailblazer: The Power of Business as the Greatest Platform for Change. New York: Currency, 2019.

“Exhilaration and Loneliness of Pioneering Thought”

(p. A15) In “The Riddle of the Rosetta: How an English Polymath and a French Polyglot Discovered the Meaning of Egyptian Hieroglyphs,” Jed Z. Buchwald and Diane Greco Josefowicz recount Thomas Young’s and Jean-François Champollion’s competing efforts toward decipherment.

. . .

The authors are chiefly concerned with Young’s and Champollion’s approaches to the hieroglyphic riddle. Rarely have I seen the false starts and blind alleys, firm beliefs and 180-degree recalibrations, exhilaration and loneliness of pioneering thought captured so well. On the other hand, not every reader will match Champollion’s stamina or persevere through the book’s densest thickets. Dramatic touches are few. Champollion probably didn’t, as commonly reported, faint at the moment of his triumph. And Young was no swashbuckler. Indiana Jones hates snakes. Young hated idioms.

If “The Riddle of the Rosetta” won’t be coming to screens anytime soon, its achievement is no less admirable. For nearly 500 pages we are invited to inhabit the minds of two of history’s finest linguists.

For the full review, see:

Maxwell Carter. “BOOKSHELF; Found In Translation.” The Wall Street Journal (Friday, September 18, 2020): A15.

(Note: ellipsis added.)

(Note: the online version of the review has the date Sep. 17, 2020, and has the title “BOOKSHELF; ‘The Riddle of the Rosetta’ Review: Found in Translation.”)

The book under review is:

Buchwald, Jed Z., and Diane Greco Josefowicz. The Riddle of the Rosetta: How an English Polymath and a French Polyglot Discovered the Meaning of Egyptian Hieroglyphs. Princeton, NJ: Princeton University Press, 2020.