Hunter-Gatherers Use Division of Labor

(p. D4) The division of labor in hunter-gatherer communities is complex and sophisticated, and crucial to their economic success, researchers report.
A paper in the journal Philosophical Transactions B looks at two hunter-gatherer groups: the Tsimane game hunters of lowland Bolivia, and the Jenu Kuruba honey collectors of South India.
“In contrast to the simple cave man view of a hunter-gatherer, we found that it requires a tremendous amount of skill, knowledge and training,” said Paul Hooper, an anthropologist at Emory University and one of the study’s authors.
. . .
When Jenu Kuruba men go in search of honey, Dr. Hooper said, “there’s one man who specializes in making smoke to subdue the bees, another that climbs the trees, and others that act as support staff to lower combs.”

For the full story, see:
SINDYA N. BHANOO. “Observatory; Nothing Simple About Hunter-Gatherer Societies.” The New York Times (Tues., OCT. 27, 2015): D4.
(Note: ellipsis added.)
(Note: the online version of the story has the date OCT. 26, 2015.)

The academic article mentioned in the passage quoted above is:
Hooper, Paul L., Kathryn Demps, Michael Gurven, Drew Gerkey, and Hillard S. Kaplan. “Skills, Division of Labour and Economies of Scale among Amazonian Hunters and South Indian Honey Collectors.” Philosophical Transactions of the Royal Society of London B: Biological Sciences 370, no. 1683 (Oct. 2015), DOI: 10.1098/rstb.2015.0008.

“Racist” Woodrow Wilson Adopted “White Supremacy as Government Policy”

(p. A25) In 1882, soon after graduating from high school, the young John Davis secured a job at the Government Printing Office.

Over a long career, he rose through the ranks from laborer to a position in midlevel management. He supervised an office in which many of his employees were white men. He had a farm in Virginia and a home in Washington. By 1908, he was earning the considerable salary — for an African-American — of $1,400 per year.
But only months after Woodrow Wilson was sworn in as president in 1913, my grandfather was demoted. He was shuttled from department to department in various menial jobs, and eventually became a messenger in the War Department, where he made only $720 a year.
By April 1914, the family farm was auctioned off. John Davis, a self-made black man of achievement and stature in his community at the turn of the 20th century, was, by the end of Wilson’s first term, a broken man. He died in 1928.
Many black men and women suffered similar fates under Wilson. As the historian Eric S. Yellin of the University of Richmond documents in his powerful book “Racism in the Nation’s Service,” my grandfather’s demotion was part of a systematic purge of the federal government; with Wilson’s approval, in a few short years virtually all blacks had been removed from management responsibilities, moved to menial jobs or simply dismissed.
My grandfather died before I was born, but I have learned much about his struggle — and that of other black civil servants in the federal government — from his personnel file.
. . .
Consider a letter he wrote on May 16, 1913, barely a month after his demotion. “The reputation which I have been able to acquire and maintain at considerable sacrifice,” he wrote, “is to me (foolish as it may appear to those in higher stations of life) a source of personal pride, a possession of which I am very jealous and which is possessed at a value in my estimation ranking above the loss of salary — though the last, to a man having a family of small children to rear, is serious enough.”
And the reply he received? His supervisor said, simply, that my grandfather was unable to “properly perform the duties required (he is too slow).” Yet there had never been any indication of this in his personnel file.
Wilson was not just a racist. He believed in white supremacy as government policy, so much so that he reversed decades of racial progress. But we would be wrong to see this as a mere policy change; in doing so, he ruined the lives of countless talented African-Americans and their families.

For the full commentary, see:
GORDON J. DAVIS. “Wilson, Princeton and Race.” The New York Times (Tues., NOV. 24, 2015): A25.
(Note: ellipsis added.)
(Note: the online version of the commentary has the title “What Woodrow Wilson Cost My Grandfather.”)

The Yellin book praised in the passage quoted above is:
Yellin, Eric S. Racism in the Nation’s Service: Government Workers and the Color Line in Woodrow Wilson’s America. Chapel Hill, NC: The University of North Carolina Press, 2013.

See also:
Patler, Nicholas. Jim Crow and the Wilson Administration: Protesting Federal Segregation in the Early Twentieth Century. Boulder, CO: University Press of Colorado, 2004.

While Woodrow Wilson Was President of Princeton, “No Blacks Were Admitted”

(p. A1) PRINCETON, N.J. — Few figures loom as large in the life of an Ivy League university as Woodrow Wilson does at Princeton.

. . .
But until posters started appearing around campus in September, one aspect of Wilson’s legacy was seldom discussed: his racist views, and the ways he acted on them as president of the United States.
The posters, put up by a year-old student group called the Black Justice League, featured some of Wilson’s more offensive quotes, including his comment to an African-American leader that “segregation is not humiliating, but a benefit, and ought to be so regarded by you,” and led to a remarkable two days at this genteel (p. A17) campus last week.
. . .
Perhaps best known for leading the United States during World War I and for trying to start the League of Nations, Wilson as president rolled back gains blacks had made since Reconstruction, removing black officials from the federal government and overseeing the segregation of rank-and-file workers.
Raised in the South, he wrote of “a great Ku Klux Klan” that rose up to rid whites of “the intolerable burden of governments sustained by the votes of ignorant Negroes.”
During Wilson’s tenure as president of Princeton, no blacks were admitted — “The whole temper and tradition of the place are such that no Negro has ever applied,” he wrote — though Harvard and Yale had admitted blacks decades earlier. Princeton admitted its first black student in the 1940s.

For the full story, see:
ANDY NEWMAN. “At Princeton, Woodrow Wilson, a Heralded Alum, Is Recast as an Intolerant One.” The New York Times (Mon., NOV. 23, 2015): A1 & A17.
(Note: ellipses added.)
(Note: the online version of the story has the date NOV. 22, 2015.)

Bike Helmet Regulations Hurt Health

(p. D1) . . . many cycling advocates have taken a surprising position: They are pushing back against mandatory bike-helmet laws in the U.S. and elsewhere. They say mandatory helmet laws, particularly for adults, make cycling less convenient and seem less safe, thus hindering the larger public-health gains of more people riding bikes.
All-ages helmet laws might actually make cycling more dangerous, some cyclists say, by decreasing ridership. Research shows that the more cyclists there are on the road, the fewer crashes there are. Academics theorize that as drivers become used to seeing bikes on a street, they watch more closely for them.
. . .
Piet de Jong, a professor in the department of applied finance and actuarial studies at Sydney’s Macquarie University, actually calculated the trade-off of mandatory helmet laws. In a 2012 paper in the journal Risk Analysis, he weighed the reduction of head injuries against increased morbidity due to foregone exercise from reduced cycling.
Dr. de Jong concluded that mandatory bike-helmet laws “have a net negative health impact.” That is in part because many people cycle to work or for errands, experts say. People tend to replace that type of cycling not with another physical activity such as a trip to the gym, but with a ride in a car.
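
A minimal, hypothetical sketch of the kind of trade-off de Jong describes: weigh the head injuries a helmet law averts among people who keep cycling against the exercise lost by people whom the law discourages from cycling at all. The function and every number below are illustrative placeholders, not values or methods from the 2012 Risk Analysis paper.

```python
# Hypothetical sketch of a helmet-law trade-off (not de Jong's actual model).
# All parameter values are illustrative placeholders.

def net_health_impact(cyclists_before, ridership_drop_share,
                      injury_harm_averted_per_rider,
                      exercise_harm_per_lapsed_rider):
    """Net health gain (positive) or loss (negative), in arbitrary health units
    such as disability-adjusted life years, from a mandatory helmet law."""
    still_riding = cyclists_before * (1 - ridership_drop_share)
    stopped_riding = cyclists_before * ridership_drop_share
    benefit = still_riding * injury_harm_averted_per_rider   # injuries averted
    cost = stopped_riding * exercise_harm_per_lapsed_rider   # exercise forgone
    return benefit - cost

# Even a modest drop in ridership can outweigh the injury benefit when regular
# cycling is worth a lot to each rider's health.
print(net_health_impact(1_000_000, 0.10, 0.0005, 0.01))  # negative => net harm
```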

For the full story, see:
RACHEL BACHMAN. “The Helmet-Law Backlash.” The Wall Street Journal (Tues., Oct. 13, 2015): D1 & D4.
(Note: ellipses added.)
(Note: the online version of the article was dated Oct. 12, 2015, and has the title “Do Bike Helmet Laws Do More Harm Than Good?”)

Humans Suffered from Plague by at Least 5,000 Years Ago

(p. D4) Historians and microbiologists alike have searched for decades for the origins of plague. Until now, the first clear evidence of Yersinia pestis infection was the Plague of Justinian in the 6th century, which severely weakened the Byzantine Empire.
But in a new study, published on Thursday [Oct. 22, 2015] in the journal Cell, researchers report that the bacterium was infecting people as long as 5,000 years ago.

For the full story, see:
“Archaeology: Plagues Said to Have Hit During Bronze Age.” The New York Times (Tues., OCT. 27, 2015): D4.
(Note: bracketed date added.)
(Note: the much shorter online version of the story has the date OCT. 22 (sic), 2015, and has the title “In Ancient DNA, Evidence of Plague Much Earlier Than Previously Known.” The passage quoted above is from the online version.)

The academic article mentioned in the passages quoted above is:
Rasmussen, Simon, Morten Erik Allentoft, Kasper Nielsen, Ludovic Orlando, Martin Sikora, Karl-Göran Sjögren, Anders Gorm Pedersen, Mikkel Schubert, Alex Van Dam, Christian Moliin Outzen Kapel, Henrik Bjørn Nielsen, Søren Brunak, Pavel Avetisyan, Andrey Epimakhov, Mikhail Viktorovich Khalyapin, Artak Gnuni, Aivar Kriiska, Irena Lasak, Mait Metspalu, Vyacheslav Moiseyev, Andrei Gromov, Dalia Pokutta, Lehti Saag, Liivi Varul, Levon Yepiskoposyan, Thomas Sicheritz-Pontén, Robert A. Foley, Marta Mirazón Lahr, Rasmus Nielsen, Kristian Kristiansen, and Eske Willerslev. “Early Divergent Strains of Yersinia pestis in Eurasia 5,000 Years Ago.” Cell 163, no. 3 (Oct. 2015): 571-82.

Only 5% of Gender Pay Differential Is Likely Due to Discrimination

(p. A17) Full-time employment is technically defined as more than 35 hours. This raises an obvious problem: A simple side-by-side comparison of all men and all women includes people who work 35 hours a week, and others who work 45. Men are significantly more likely than women to work longer hours, according to the BLS. And if we compare only people who work 40 hours a week, BLS data show that women then earn on average 90 cents for every dollar earned by men.
Career choice is another factor. Research in 2013 by Anthony Carnevale, a Georgetown University economist, shows that women flock to college majors that lead to lower-paying careers. Of the 10 lowest-paying majors–such as “drama and theater arts” and “counseling psychology”–only one, “theology and religious vocations,” is majority male.
Conversely, of the 10 highest-paying majors–including “mathematics and computer science” and “petroleum engineering”–only one, “pharmacy sciences and administration,” is majority female. Eight of the remaining nine are more than 70% male.
Other factors that account for earnings differences include marriage and children, both of which cause many women to leave the workforce for years. June O’Neill, former director of the Congressional Budget Office, concluded in a 2005 study that “there is no gender gap in wages among men and women with similar family roles.”
. . .
Ms. O’Neill and her husband concluded in their 2012 book, “The Declining Importance of Race and Gender in the Labor Market,” that once all these factors are taken into account, very little of the pay differential between men and women is due to actual discrimination, which is “unlikely to account for a differential of more than 5 percent but may not be present at all.”
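
The passage’s logic can be illustrated with a small, hypothetical decomposition: start from a raw pay ratio and subtract the shares of the gap explained by hours worked, field of study, and family roles, leaving a residual that could reflect discrimination. The shares below are made-up placeholders for illustration, not estimates from the O’Neills’ book.

```python
# Hypothetical decomposition of a raw gender pay gap (placeholder numbers only,
# not estimates from O'Neill and O'Neill, 2012).

raw_gap = 1 - 0.77  # e.g., a raw comparison of 77 cents on the dollar
explained = {
    "hours worked": 0.07,
    "college major and occupation": 0.06,
    "marriage, children, career interruptions": 0.06,
}
residual = raw_gap - sum(explained.values())
print(f"raw gap: {raw_gap:.0%}; unexplained residual: {residual:.0%}")
# With these placeholders the residual is about 4%, in the spirit of the quoted
# conclusion that discrimination is "unlikely to account for a differential of
# more than 5 percent."
```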

For the full commentary, see:
SARAH KETTERER. “The ‘Wage Gap’ Myth That Won’t Die; You have to ignore many variables to think women are paid less than men. California is happy to try.” The Wall Street Journal (Thurs., Oct. 1, 2015): A17.
(Note: ellipsis added.)
(Note: the online version of the commentary was updated on Sept. 30, 2015.)

The O’Neill book mentioned above is:
O’Neill, June E., and Dave M. O’Neill. The Declining Importance of Race and Gender in the Labor Market: The Role of Employment Discrimination Policies. Washington, D.C.: AEI Press, 2012.

Sense of Purpose, Not Greed, Is Reason Multimillionaires Keep Working

(p. 10) I’ve often wondered why the so-called Masters of the Universe, those C.E.O.s with multimillion-dollar monthly paychecks, keep working. Why, once they have earned enough money to live comfortably forever, do they still drag themselves to the office? The easy answer, the one I had always settled on, was greed.
But as I watched the hours slowly drip by in my cubicle, an alternative reason came into view. Without a sense of purpose beyond the rent money, malaise sets in almost immediately. We all need a reason to get up in the morning, preferably one to which we can attach some meaning. It is why people flock to the scene of a natural disaster to rescue and rebuild, why people devote themselves to a cause, no matter how doomed it may be. In the end, it’s the process as much as the reward that nourishes us.

For the full commentary, see:
TED GELTNER. “ON WORK; Bored to Tears by a Do-Nothing Dream Job.” The New York Times, SundayBusiness Section (Sun., NOV. 22, 2015): 10.
(Note: the online version of the commentary was updated on NOV. 21, 2015.)

For Movies, Film Option Survives Digital Advance

(p. B1) Faced with the possible extinction of the material that made Hollywood famous, a coalition of studios is close to a deal to keep Eastman Kodak Co. in the business of producing movie film.
The negotiations–secret until now–are expected to result in an arrangement where studios promise to buy a set quantity of film for the next several years, even though most movies and television shows these days are shot on digital video.
Kodak’s new chief executive, Jeff Clarke, said the pact will allow his company to forestall the closure of its Rochester, N.Y., film manufacturing plant, a move that had been under serious consideration. Kodak’s motion-picture film sales have plummeted 96% since 2006, from 12.4 billion linear feet to an estimated 449 million this year. With the exit of competitor Fujifilm Corp. last year, Kodak is the only major company left producing motion-picture film.
. . .
Film and digital video both “are valid choices, but it would be a tragedy if suddenly directors didn’t have the opportunity to shoot on film,” said Mr. Apatow, director of comedies including “Knocked Up” and “The 40-Year-Old Virgin,” speaking from the New York set of his coming movie “Trainwreck,” which he is shooting on film. “There’s a magic to the grain and the color quality that you get with film.”

For the full story, see:
BEN FRITZ. “Movie Film, at Death’s Door, Gets a Reprieve.” The Wall Street Journal (Weds., July 30, 2014): B1 & B8.
(Note: ellipsis added.)
(Note: the online version of the article was dated July 29, 2014.)

Price Theory Paradox When Gas Prices Fall

(p. A3) When gas prices fall, Americans reliably do two things that don’t make much sense.
They spend more of the windfall on gasoline than they would if the money came from somewhere else.
And they don’t just buy more gasoline. They switch from regular gas to high-octane.
A new report by the JPMorgan Chase Institute, looking at the impact of lower gas prices on consumer spending, finds the same pattern as earlier studies. The average American would have saved about $41 a month last winter by buying the same gallons and grades. Instead, Americans took home roughly $22 a month. People, in other words, used almost half of the windfall to buy more and fancier gas.
. . .
Professors Hastings and Shapiro showed that households adjusted their gas consumption much more sharply in response to changes in gas prices than in response to equivalent changes in overall income. In the fall of 2008, for example, as gas prices fell amid a broad economic collapse, consumers responded as if the decline of gas prices were the more important event, significantly increasing purchases of premium gas.
Moreover, this behavior was prevalent: 61 percent of the households made at least one irrational gas purchase. People “treat changes in gasoline prices as equivalent to very large changes in income when deciding which grade of gasoline to purchase,” they wrote.
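
The arithmetic behind the quoted JPMorgan Chase Institute figures is simple: of a roughly $41 potential monthly saving, households kept about $22, so close to half the windfall went to extra gallons and higher-octane grades. A quick check, using only the two figures quoted above:

```python
# Check of the windfall split reported by the JPMorgan Chase Institute study.
potential_saving = 41.0  # $/month if the same gallons and grades had been bought
actual_saving = 22.0     # $/month households actually kept

spent_on_extra_gas = potential_saving - actual_saving
share_of_windfall = spent_on_extra_gas / potential_saving
print(f"${spent_on_extra_gas:.0f} per month, {share_of_windfall:.0%} of the windfall")
# -> $19 per month, 46% of the windfall: "almost half," as the passage says.
```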

For the full commentary, see:
Binyamin Appelbaum. “When Gas Becomes Cheaper, Americans Buy Fancier Gas.” The New York Times (Tues., OCT. 20, 2015): A3.
(Note: ellipsis added.)
(Note: the online version of the commentary was updated on OCT. 19, 2015, and has the title “When Gas Becomes Cheaper, Americans Buy More Expensive Gas.”)

The Hastings and Shapiro article mentioned above is:
Hastings, Justine S., and Jesse M. Shapiro. “Fungibility and Consumer Choice: Evidence from Commodity Price Shocks.” Quarterly Journal of Economics 128, no. 4 (Nov. 2013): 1449-98.

What If Steve Jobs Ran the I.C.U.?

We’d like to think that real-world medicine matches the intensity and competence of television shows like ER and House. But too often it is like the horrifying, surreal story told below. What if we deregulated medicine and opened it to the product and process innovations of intense, innovative entrepreneurs like Steve Jobs, Jeff Bezos, and Sam Walton?

(p. 7) Omaha — I’ve been watching the monitor for hours. Natalie’s asleep now and I’m worried about her pulse. It’s edging above 140 beats per minute again and her blood oxygen saturation is becoming dangerously low. I’m convinced that she’s slipping into shock. She needs more fluids. I ring for the nurse.

I know about stuff like septic shock because for more than 20 years I was a transplant surgeon, and some of our patients got incredibly sick after surgery. So when I’m sitting in an I.C.U. in Omaha terrified that Natalie, my 17-year-old daughter, might die, I know what I’m talking about. I tell the nurse that Natalie needs to get another slug of intravenous fluids, and fast.
The nurse says she’ll call the doctor. Fifteen minutes later I find her in the lounge at a computer, and over her shoulder I see a screen full of makeup products. When I ask if we can get that fluid going, I startle her. She says she called the resident and told him the vital signs, but that he thought things were stable.
“He said to hold off for now,” she says.
“Get me two bags of saline. Now,” I tell her.
She says, “I’m calling my supervisor,” and she runs out of the lounge.
. . .
I know I shouldn’t be my daughter’s doctor. They taught us the problems with that during my first week in medical school.
. . .
But right now, I don’t care about any of that. I’m the one with experience taking care of really sick patients, and if I know she needs more fluids, she’s going to get them.
I break into the crash cart, a box on wheels full of stuff they use to resuscitate patients. I pull out two liters of saline solution and run both into Natalie’s IV in less than 20 minutes. Natalie’s pulse slows and her blood pressure rises. An hour later, after the nursing supervisor and on-call resident finally arrive, I’ve finished infusing a third liter. Natalie finally looks better.
This wasn’t the first time during Natalie’s illness eight years ago that I broke my promise to just be her dad. It started a week earlier when she came into the den and showed me the blood she’d coughed up. I suspect a father without my experience might have chalked it up to flu. Maybe because I was a transplant surgeon, and always considered the worst possible cause whenever a patient had a hiccup, I took her to the hospital. I was worried the blood meant she had a bacterial pneumonia, a bad one. And it did.
On the way to the hospital, Natalie took a deep breath and looked at me. “Am I going to die?” she asked. I’m convinced that she would have been dead before morning had I not been a doctor, and one who could recognize septic shock when it affected a normal teenager.

For the full commentary, see:
BUD SHAW. “A Doctor at His Daughter’s Hospital Bed.” The New York Times, SundayReview Section (Sun., SEPT. 6, 2015): 7.
(Note: ellipses added.)
(Note: the online version of the commentary has the date SEPT. 5, 2015.)

The commentary quoted above is adapted from the book:
Shaw, Bud. Last Night in the OR: A Transplant Surgeon’s Odyssey. New York: Plume, 2015.

Professors Oppose Diversity by Discriminating Against Conservatives

(p. A23) One of the great intellectual and moral epiphanies of our time is the realization that human diversity is a blessing. It has become conventional wisdom that being around those unlike ourselves makes us better people — and more productive to boot.
Scholarly studies have piled up showing that race and gender diversity in the workplace can increase creative thinking and improve performance. Meanwhile, excessive homogeneity can lead to stagnation and poor problem-solving.
Unfortunately, new research also shows that academia has itself stopped short in both the understanding and practice of true diversity — the diversity of ideas — and that the problem is taking a toll on the quality and accuracy of scholarly work. This year, a team of scholars from six universities studying ideological diversity in the behavioral sciences published a paper in the journal Behavioral and Brain Sciences that details a shocking level of political groupthink in academia. The authors show that for every politically conservative social psychologist in academia there are about 14 liberal social psychologists.
Why the imbalance? The researchers found evidence of discrimination and hostility within academia toward conservative researchers and their viewpoints. In one survey cited, 79 percent of social psychologists admitted they would be less likely to support hiring a conservative colleague than a liberal scholar with equivalent qualifications.

For the full commentary, see:
Arthur C. Brooks. “Academia’s Rejection of Diversity.” The New York Times (Sat., OCT. 31, 2015): A23.
(Note: the online version of the commentary has the date OCT. 30, 2015.)

The Behavioral and Brain Sciences article mentioned above is:
Duarte, José L., Jarret T. Crawford, Charlotta Stern, Jonathan Haidt, Lee Jussim, and Philip E. Tetlock. “Political Diversity Will Improve Social Psychological Science.” Behavioral and Brain Sciences 38 (Jan. 2015), DOI: 10.1017/S0140525X14000041.