Dogs Feel Guilt When They Hurt Their Humans

(p. 6) Dr. Horowitz concluded that whether dogs take on a guilty look — lowered gaze, ears pressed back, tail rapidly beating between the legs — is unrelated to whether or not they followed orders. If the owner scolds them, they look extremely guilty. If the owner doesn’t, they still sometimes look like this, but less often.

One problem, however, is that our rules are of our own making, such as “Don’t jump on that couch!” or “Keep your nails off my leather chair!” It must be as tough for our pets to grasp these prohibitions as it was for me to understand why I couldn’t chew gum in Singapore.

It would be better to test behavior that is wrong by almost any standard, including that of their own species. The Austrian ethologist Konrad Lorenz gave one of my favorite examples, about his dog, Bully, who broke the fundamental rule never to bite your superior.

Humans don’t need to teach this rule, and indeed Bully had never been punished for it. The dog bit his master’s hand when Dr. Lorenz (p. 7) tried to break up a dogfight. Even though Dr. Lorenz petted him right away, Bully suffered a complete nervous breakdown. For days, he was virtually paralyzed and ignored his food. He would lie on the rug breathing shallowly, occasionally interrupted by a deep sigh. He had violated a natural taboo, which among ancestral canines could have had the worst imaginable consequences, such as expulsion from the pack.

For the full commentary, see:

(Note: the online version of the commentary has the date March 8, 2019.)

Many Believe Women Should Have Equal Work Opportunity, but That Women Are Better Than Men at Child-Rearing

(p. B1) A new study, based on national survey data from 1977 to 2016, helps explain why the path to equality seems in some ways to have stalled — despite the significant increases in women’s educational and professional opportunities during that period.
Two-thirds of Americans and three-quarters of millennials say they believe that men and women should be equal in both the public sphere of work and the private sphere of home. Only a small share of people, young or old, still say that men and women should be unequal in both spheres — 5 percent of millennials and 7 percent of those born from 1946 to 1980.
But the study revealed that roughly a quarter of people hold more complicated views about gender equality, views that differ between work and home. Most of these respondents say that while women should have the same opportunities as men to work or participate in politics, they should do more of the homemaking and child-rearing, found the study, which is set to be published in the journal Gender & Society.
“You can believe men and women have truly different natural tendencies and skills, that women are better nurturers and caretakers, and still believe women should have equal rights in the labor force,” said Barbara Risman, a sociology professor at the University of Illinois at Chicago and an author of the paper along with William Scarborough, a sociology doctoral candidate there, and Ray Sin, a behavioral scientist at Morningstar.

For the full commentary, see:
Miller, Claire Cain. “THE UPSHOT; Equality Valued at Work, Not Necessarily at Home.” The New York Times (Wednesday, Dec. 5, 2018): B1 & B5.
(Note: the online version of the commentary has the date Dec. 3, 2018, and has the title “THE UPSHOT; Americans Value Equality at Work More Than Equality at Home.”)

The academic paper mentioned above has been published online in advance of print publication:
Scarborough, William J., Ray Sin, and Barbara Risman. “Attitudes and the Stalled Gender Revolution: Egalitarianism, Traditionalism, and Ambivalence from 1977 through 2016.” Gender & Society (2018), https://doi.org/10.1177/0891243218809604.

Star Wars Details Allow “a Fully Believable, Escapist Experience”

(p. A15) Mr. Jameson clearly lays out the qualities that geeks appreciate in their art: realism bolstered by a deep internal history and the sort of “world-building” exemplified by Tolkien. But in Hollywood “Star Wars” changed the game thanks to its verisimilitude, “which immediately and thoroughly convinces viewers that they are watching humans and aliens skip from planet to planet in a vast, crowded other galaxy with its own detailed history.” Similarly, the biological background of the “Alien” series includes Xenomorphs “whose intricate life cycle can be described from beginning to end in grisly detail.” Books like “The Star Trek Encyclopedia,” in which the show’s designers document “all the alien planets and species that they’d invented” and present starship engineering schematics, are quintessential works of geek culture.
Detail is important to geeks, the author suggests, because they want art without “any boundaries, any limits. . . . They don’t want the artwork to ever end.” Whether it’s playing a tabletop game filled with lore about previously unknown characters from the “Star Wars” galaxy or reading a “textbook” to study the fantastic beasts of the “Harry Potter” world, geeks want to believe — at least for a bit. As Mr. Jameson says, “geeks have long thought of artworks as places where one can hang out.” That’s one reason why single films have given way to trilogies and why characters have cross-populated to create Marvel’s seemingly endless “cinematic universe.”

For the full review, see:
Brian P. Kelly. “BOOKSHELF; The Geeks Strike Back.” The Wall Street Journal (Friday, June 8, 2018): A15.
(Note: ellipsis in original.)
(Note: the online version of the review has the date June 7, 2018, and has the title “BOOKSHELF; ‘I Find Your Lack of Faith Disturbing’ Review: The Geeks Strike Back; The ‘Star Wars’ franchise and Marvel’s superhero films reign supreme in today’s Hollywood. How did that happen?”)

The book under review is:
Jameson, A. D. I Find Your Lack of Faith Disturbing: Star Wars and the Triumph of Geek Culture. New York: Farrar, Straus and Giroux, 2018.

What Wofford’s Family “Lacked in Money, They Made Up for in Expectations”

(p. A19) Growing up on Buffalo’s rough and often neglected East Side, Keith H. Wofford recalled many crisp autumn Sundays spent with his father bonding over the Bills, following the team’s losses and wins on the radio.
Tickets to football games were not in the family’s budget: His father, John Wofford, worked at the nearby Chevrolet factory for 32 years, and his mother, Ruby, picked up odd jobs in retail to bring in extra income. But what the Woffords lacked in money, they made up for in expectations for their two sons.
“They always had an incredible amount of confidence in us,” Mr. Wofford, 49, said in an interview. “They made very clear that they didn’t see any limitations.”
Mr. Wofford held tight to that ideal as he left high school as a 17-year-old junior to attend Harvard University on a scholarship. Seven years later, he graduated from Harvard Law School. Last year, Mr. Wofford earned at least $4.3 million as a partner overseeing 300 lawyers and 700 employees at the New York office of the international law firm Ropes & Gray LLP, according to financial disclosure forms.
Now he’s the Republican nominee for state attorney general in New York, vying to become one of the most powerful law enforcement officials in the country.
“How many guys who work at a white-shoe law firm had dads who had a union job?” asked C. Teo Balbach, 50, the chief executive of a software firm who grew up in Buffalo and played intramural rugby at Harvard with Mr. Wofford.
“He’s a real hard worker and grinder, and that comes from that upbringing where you come from a middle-class family in a difficult neighborhood and you don’t take anything for granted,” Mr. Balbach added.
. . .
. . . issues facing Mr. Wofford should he win are potential conflicts of interest from his law practice.
. . .
Mr. Wofford said the criticism about him is indicative of Ms. James’s “hyperpartisan” attitude, and he sought to distinguish himself from her by characterizing himself as an outsider.
“Being on the wrong side of the tracks in Buffalo,” Mr. Wofford said, “is about as far from insider as you can get.”
His success as a lawyer, however, did allow him one heartfelt opportunity: In his father’s last years, Mr. Wofford returned to Buffalo, and during football season, they would bond again over Bills games — but in person, at the stadium, as a season-ticket holder.

For the full story, see:
Jeffery C. Mays. “Can an Unknown G.O.P. Candidate Become Attorney General?” The New York Times (Saturday, Oct. 13, 2018): A19.
(Note: ellipses added.)
(Note: the online version of the story has the date Oct. 12, 2018, and has the title “Can a Black Republican Who Voted for Trump Be New York’s Next Attorney General?”)

Buddhist Monks Fear Death

(p. C4) A recent paper in the journal Cognitive Science has an unusual combination of authors. A philosopher, a scholar of Buddhism, a social psychologist and a practicing Tibetan Buddhist tried to find out whether believing in Buddhism really does change how you feel about your self — and about death.
The philosopher Shaun Nichols of the University of Arizona and his fellow authors studied Christian and nonreligious Americans, Hindus and both everyday Tibetan Buddhists and Tibetan Buddhist monks.
. . .
The results were very surprising. Most participants reported about the same degree of fear, whether or not they believed in an afterlife. But the monks reported much more fear of death than any other group did.
Why would this be? The Buddhist scholars themselves say that merely knowing there is no self isn’t enough to get rid of the feeling that the self is there. Neuroscience supports this idea.
. . .
Another factor in explaining why these monks were more afraid of death might be that they were trained to think constantly about mortality. The Buddha, perhaps apocryphally, once said that his followers should think about death with every breath. Maybe just ignoring death is a better strategy.

For the full commentary, see:
Alison Gopnik. “Who’s Most Afraid to Die? A Surprise.” The Wall Street Journal (Saturday, June 9, 2018): C4.
(Note: ellipses added.)
(Note: the online version of the commentary has the date June 6, 2018.)

The print version of the Cognitive Science article discussed above is:
Nichols, Shaun, Nina Strohminger, Arun Rai, and Jay Garfield. “Death and the Self.” Cognitive Science 42, no. S1 (May 2018): 314-32.

Anthony Bourdain “Let the Locals Shine”

(p. A15) People are mourning celebrity chef Anthony Bourdain all over the world — from Kurdistan to South Africa, from Gaza to Mexico. That may surprise American social-justice warriors who have turned food into a battlefield for what they call “cultural appropriation.”
“When you’re cooking a country’s dish for other people,” an Oberlin College student wrote last year, “you’re also representing the meaning of the dish as well as its culture. So if people not from that heritage take food, modify it and serve it as ‘authentic,’ it is appropriative.” This was prompted by a dining-hall menu that included sushi and banh mi. Celebrity alumna Lena Dunham weighed in on the side of the social-justice warriors.
. . .
Bourdain was a frequent target of similar criticism. When he declared Filipino food the next big thing, a writer for London’s Independent newspaper complained that his “well-meaning” comments were “the latest from a Western (usually white) celebrity chef or food critic to take a once scoffed at cuisine, legitimize it and call it a trend.”
Bourdain took it in stride. Asked on his CNN show, “Anthony Bourdain: Parts Unknown,” what he thought about culinary cultural appropriation, he said: “Look, the story of food is the story of appropriation, of invasion and mixed marriages and war and, you know . . . it constantly changes. You know, what’s authentic anyway?”
. . .
When Bourdain took us to places like Libya and Venezuela and West Virginia, he let the locals shine. His vocation was about more than food. It was about people — understanding their cultures and their lives, lifting them up and making their dishes.

For the full commentary, see:
Elisha Maldonado. “Bourdain vs. the Social-Justice Warriors; The celebrity chef scoffed at the notion of opposing ‘cultural appropriation.'” The Wall Street Journal (Tuesday, June 12, 2018): A15.
(Note: ellipses added.)
(Note: the online version of the commentary has the date June 11, 2018.)

“Books Were Systematically Burned”

(p. 12) Vandalizing the Parthenon temple in Athens has been a tenacious tradition. Most famously, Lord Elgin appropriated the “Elgin marbles” in 1801-5. But that was hardly the first example. In the Byzantine era, when the temple had been turned into a church, two bishops — Marinos and Theodosios — carved their names on its monumental columns. The Ottomans used the Parthenon as a gunpowder magazine, hence its pockmarked masonry — the result of an attack by Venetian forces in the 17th century. Now Catherine Nixey, a classics teacher turned writer and journalist, takes us back to earlier desecrations, the destruction of the premier artworks of antiquity by Christian zealots (from the Greek zelos — ardor, eager rivalry) in what she calls “The Darkening Age.”
. . .
Debate — philosophically and physiologically — makes us human, whereas dogma cauterizes our potential as a species. Through the sharing of new ideas the ancients identified the atom, measured the circumference of the earth, grasped the environmental benefits of vegetarianism.
To be sure, Christians would not have a monopoly on orthodoxy, or indeed on suppression: The history of the ancient world typically makes for stomach-churning reading. Pagan philosophers, too, who flew in the face of religious consensus risked persecution; Socrates, we must not forget, was condemned to death on a religious charge.
But Christians did fetishize dogma. In A.D. 386 a law was passed declaring that those “who contend about religion … shall pay with their lives and blood.” Books were systematically burned.
. . .
. . . she opens her book with a potent description of black-robed zealots from 16 centuries ago taking iron bars to the beautiful statue of Athena in the sanctuary of Palmyra, located in modern-day Syria. Intellectuals in Antioch (in ancient Syria) were tortured and beheaded, as were the statues around them.
. . .
Nixey closes her book with the description of another Athena, in the city of her name, being decapitated around A.D. 529, her defiled body used as a steppingstone into what was once a world-renowned school of philosophy. Athena was the deity of wisdom. The words “wisdom” and “historian” have a common ancestor, a proto-Indo-European word meaning to see things clearly. Nixey delivers this ballista-bolt of a book with her eyes wide open and in an attempt to bring light as well as heat to the sad story of intellectual monoculture and religious intolerance. Her sympathy, coruscatingly, compellingly, is with the Roman orator Symmachus: “We see the same stars, the sky is shared by all, the same world surrounds us. What does it matter what wisdom a person uses to seek for the truth?”

For the full review, see:
Bettany Hughes. “How the Ancient World Was Destroyed.” The New York Times Book Review (Sunday, June 10, 2018): 12.
(Note: ellipses between, and at the start of, paragraphs, added; ellipsis internal to paragraph, in original.)
(Note: the online version of the review has the date June 8, 2018, and has the title “How Christians Destroyed the Ancient World.”)

The book under review is:
Nixey, Catherine. The Darkening Age: The Christian Destruction of the Classical World. Boston: Houghton Mifflin Harcourt, 2018.