Nonpartisan CBO Estimates $15 Minimum Wage Would Cause Loss of 1.4 Million Jobs

(p. B5) WASHINGTON — Raising the federal minimum wage to $15 an hour — a proposal included in the package of relief measures being pushed by President Biden — would add $54 billion to the budget deficit over the next decade, the Congressional Budget Office concluded on Monday [Feb. 8, 2021].

. . .

Critics of the plan noted a different element of the report: its forecast that raising the minimum wage to $15 would eliminate 1.4 million jobs by the time the increase takes full effect.

“Conservatives have been saying for a while that a recession is absolutely the wrong time to increase the minimum wage, even if it’s slowly phased in,” said Brian Riedl, a senior fellow at the Manhattan Institute. “The economy’s just too fragile.”

For the full story, see:

Jason DeParle. “$15 Minimum Wage Would Cut Poverty And 1.4 Million Jobs.” The New York Times (Tuesday, February 9, 2021): B5.

(Note: ellipsis, and bracketed date, added.)

(Note: the online version of the story has the date Feb. 8, 2021, and has the title “Minimum Wage Hike Would Help Poverty but Cost Jobs, Budget Office Says.”)

The nonpartisan Congressional Budget Office report mentioned above is:

Congressional Budget Office. “The Budgetary Effects of the Raise the Wage Act of 2021.” Feb. 2021.

Walter Williams Wrote That a Minimum Wage “Encourages Racial Discrimination”

(p. 26) Walter E. Williams, a prominent conservative economist, author and political commentator who expressed profoundly skeptical views of government efforts to aid his fellow African-Americans and other minority groups, died on Tuesday [Dec. 1, 2020] on the campus of George Mason University in Virginia, where he had taught for 40 years. He was 84.

His daughter, Devon Williams, said he died suddenly in his car after he had finished teaching a class.

. . .

In the 1970s, during a yearlong stint at the conservative-leaning Hoover Institution on War, Revolution and Peace at Stanford University, Mr. Williams was commissioned by the Joint Economic Committee of Congress to study the ramifications of a minimum wage and of the Davis-Bacon Act, which mandated that laborers in federal construction projects be paid no less than the locally prevailing wages for corresponding work on similar projects in the area.

He outlined his findings in a 1977 report: A minimum wage causes high rates of teenage unemployment, especially among minority workers, and actually “encourages racial discrimination.”

He concluded, he recalled in an interview with The New York Times for this obituary in 2017, that the Davis-Bacon Act had “explicit racist motivations.”

Suppose, he said, that there are 10 secretaries, five of them white and five of them Black — all equally qualified — who are applying for a job. “If by law you must pay them all the same wage,” he said, “it doesn’t cost anything to discriminate against the Black secretaries.” Without such a mandate, he suggested, the Black secretaries would have a better chance at being gainfully employed, even if at lower pay.
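Williams's point is an instance of the economics of discrimination: a wage floor eliminates the price an employer would otherwise pay to indulge a hiring preference. A stylized numeric sketch (all wage figures are hypothetical illustrations, not from the obituary):

```python
# Stylized illustration of Williams's argument: with a legally mandated
# uniform wage, passing over equally qualified Black applicants costs the
# employer nothing; without the mandate, it costs the wage difference.

MANDATED_WAGE = 15.0   # hypothetical uniform wage required by law ($/hour)
WHITE_ASK = 15.0       # hypothetical wage asked by white applicants ($/hour)
BLACK_ASK = 13.0       # hypothetical wage asked by Black applicants ($/hour)

def cost_of_discriminating(wage_floor: bool) -> float:
    """Extra hourly cost per hire of choosing a white applicant over an
    equally qualified Black applicant."""
    if wage_floor:
        # Every hire must be paid the mandated wage, so the choice is free.
        return 0.0
    # Without the floor, hiring the higher-priced applicant costs the gap.
    return WHITE_ASK - BLACK_ASK

print(cost_of_discriminating(wage_floor=True))   # 0.0
print(cost_of_discriminating(wage_floor=False))  # 2.0
```

On these assumed numbers, the wage mandate reduces the employer's cost of discriminating from $2.00 an hour to zero, which is the mechanism behind Williams's claim that the minimum wage "encourages racial discrimination."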

In his book “The State Against Blacks” (1982), Mr. Williams was similarly critical of a host of government measures involving labor — from taxicab regulations to occupational licensing — that in his view wound up disproportionately harming Black people in the name of preventing discrimination.

For the full obituary, see:

Robert D. Hershey Jr. “Walter E. Williams, Conservative Economist on Black Issues, Is Dead at 84.” The New York Times, First Section (Sunday, December 6, 2020): 26.

(Note: ellipsis, and bracketed date, added.)

(Note: the online version of the obituary was updated December 7, 2020, and has the title “Walter E. Williams, 84, Dies; Conservative Economist on Black Issues.”)

Williams’s book, mentioned above, is:

Williams, Walter E. The State Against Blacks. New York: McGraw-Hill, 1982.

Evolution of 5G Will Likely Not Favor China

In the commentary quoted below, “RAN equipment” stands for “radio access network equipment,” which is key hardware in the latest 5G broadband technology.

(p. C3) Huawei’s first generation of 5G RAN base stations is a modified version of the older 4G infrastructure that yields faster speeds. The ultimate promise of 5G is a ubiquitous network customized to user needs. Trillions of devices and applications—known as the Internet of Things—using 5G technology will offer new solutions for everything from autonomous vehicles to industrial production management to remote surgery. But the drivers of 5G’s evolution will be semiconductors, software systems and cloud computing—areas in which the U.S., not Huawei or any other Chinese company, is the world leader.

Instead of being intimidated by Huawei, U.S. foreign policy makers should recognize the Chinese company’s situation, which is akin to the dominance that IBM enjoyed during the age of mainframe computing. IBM’s massive scale and proprietary standards and software made it hard for competitors to match its offerings. Only in the 1970s and ’80s, when Japan massively subsidized new competitors like NEC, did IBM falter. But the true decline of IBM and its Japanese competitors came with the rise of the internet. The web’s transparent standards enabled many new firms to “plug and play.” Semiconductors, software and desktop computing eventually led to the apps on your smartphone at a fraction of the cost of such functions 30 years ago.

Today, 5G is at a similar moment. A new generation of technological standards for 5G would allow specialist suppliers—like the Microsofts and Intels of the internet era—to compete against Huawei, Ericsson, Nokia and Samsung. Control via the old RAN infrastructure will be diminished by control via cloud computing and software, which plays to a key U.S. strength. Introducing these standards will take concerted action from U.S. firms, along with targeted U.S. government support, such as the adoption of procurement requirements to embody these new rules.

For the full commentary, see:

Peter Cowhey and Susan Shirk. “The Danger of Exaggerating China’s Technological Prowess.” The Wall Street Journal (Saturday, January 9, 2021): C3.

(Note: the online version of the commentary has the date January 8, 2021, and has the same title as the print version.)

The commentary quoted above is related to the report:

Cowhey, Peter, Chair. “Meeting the China Challenge: A New American Strategy for Technology Competition.” San Diego, CA: UC San Diego School of Global Policy and Strategy, Nov. 16, 2020.

Naps Aid Immunity, Energy, Alertness, Memory, and Mood

(p. D4) Sara E. Alger, a sleep scientist at the Walter Reed Army Institute of Research in Silver Spring, Md., has been a public advocate for naps, particularly in the workplace, except in cases of insomnia. Along the way, she has had to fight anti-nap prejudice.

“Naps in general have a stigma attached to them as something you only do when you’re lazy or when you’re sick,” Dr. Alger said.

Wrapped inside nap phobia in the United States is often a message reminding us to be productive during what we now think of as normal working hours, although that concept is relatively new.

Modern attitudes about napping go back to the Industrial Revolution, according to Matthew J. Wolf-Meyer, an anthropologist at Binghamton University in New York and the author of “The Slumbering Masses: Sleep, Medicine, and Modern American Life.”

“For a long time, people had flexible sleep schedules,” Dr. Wolf-Meyer said. Farmers and tradespeople had some autonomy over their time. They could choose to rest in the hottest part of the day, and might take up simple tasks during a wakeful period in the middle of the night, between two distinct bouts of sleep.

As the 1800s went on, more and more Americans worked in factories on set shifts that were supervised by a foreman. “They work for a total stranger, and a nap becomes totally nonnegotiable,” he said.

Staying awake all day and getting one’s sleep in a single long stretch at night came to be seen as normal. With that came a strong societal expectation that we ought to use our daylight hours productively.

. . .

Although there are no hard data so far on whether naps have been on the rise during 2020, sleep scientists like Dr. Alger think it’s likely. The many people who now work remotely no longer need to worry about the disapproving eyes of their colleagues if they want a brief, discreet period of horizontality in the afternoons.

If most offices reopen next year, as now seems possible, perhaps greater tolerance toward the adult nap will be one of the things salvaged from the smoking wreckage of the working-from-home era. (In a tweet last week, Dr. Wolf-Meyer called the pandemic “the largest (accidental) experiment with human #sleep ever conducted.”) . . .

Experts say that people who get seven to nine hours of sleep a day are less prone to catching infectious diseases, and better at fighting off any they do catch. Afternoon sleep counts toward your daily total, according to Dr. Alger.

This immunity boost, she said, is in addition to other well-known dividends of a good nap, like added energy, increased alertness, improved mood and better emotional regulation.

Included under the last rubric is a skill that seems especially useful for dealing with families, even if you never get closer to your relatives this year than a “Hollywood Squares”-style video grid: “Napping helps you be more sensitive to receiving other people’s moods,” Dr. Alger said. “So you’re not perceiving other people as being more negative than they are.”

Napping also helps you remember facts you learned right before nodding off. Given the way things have been going lately, of course, you may not see this as a plus. You could look at it from the reverse angle, though: Every hour before Jan. 1 that you spend napping is another hour of 2020 you won’t remember.

For the full commentary, see:

Pete Wells. “This Thanksgiving, Nap Without Guilt.” The New York Times (Wednesday, November 25, 2020): D1 & D4.

(Note: ellipses added.)

(Note: the online version of the commentary has the date Nov. 24, 2020, and has the title “This Thanksgiving, It’s Time to Stop Nap-Shaming.”)

The book by Wolf-Meyer, mentioned above, is:

Wolf-Meyer, Matthew J. The Slumbering Masses: Sleep, Medicine, and Modern American Life. Minneapolis, MN: University of Minnesota Press, 2012.

New York City’s Resilient Dynamism

(p. C10) Do you worry that New York won’t fully return to what it was before the pandemic?

LEBOWITZ I have lived in New York long enough to know that it will not stay the way it is now. There is not a square foot of New York City, a square foot, that’s the same as it was when I came here in 1970. That’s what a city is, even without a plague. But I’d like to point out, there were many things wrong with it before. After the big protests in SoHo, I saw a reporter interviewing a woman who was a manager of one of the fancy stores there. The reporter said to her, “What are you going to do?” And she said, “There’s nothing we can do until the tourists come back.” I yelled at the TV and I said, “Really? You can’t think what to do with SoHo without tourists? I can! Let me give you some ideas.” Because I remember it without tourists. How about, artists could live there? How about, let’s not have rent that’s $190,000 a month? How about that? Let’s try that.

For the full interview, see:

Dave Itzkoff, interviewer. “More of Her Metropolitan Life.” The New York Times (Friday, January 8, 2021): C1 & C10.

(Note: the online version of the interview has the date Jan. 7, 2021, and has the title “Fran Lebowitz and Martin Scorsese Seek a Missing New York in ‘Pretend It’s a City’.” In the online and print versions the question by Itzkoff, and Lebowitz’s name before her answer, were in bold.)

Abramson “Was Too Busy Surfing” to Patent Wireless Networking

I argue that patents enable funding for poor inventors, and for inventors who aspire to big, expensive breakthroughs. If you have independent means (like a professorship in Hawaii) and mainly aspire to surf, you can afford to ignore patents.

(p. B11) Professor Abramson has been called the father of wireless networking. But it was a shared paternity. The project included graduate students and several faculty members, notably Frank Kuo, a former Bell Labs scientist who came to the University of Hawaii in 1966, the same year Professor Abramson arrived.

His deepest expertise was in communication theory, the subject of his Ph.D. thesis at Stanford University. The fundamental design ideas behind ALOHAnet were his. In a 2018 oral history interview for the Computer History Museum, Professor Kuo recalled, “Norm was the theory and I was the implementer, and so we worked together pretty well.”

. . .

That the ALOHAnet technology became so widely used was partly because Professor Abramson and his team had shared it freely and welcomed other scientists to Hawaii.

“We had done no patenting, and ALOHA was published in scientific papers,” putting their work in the public domain, Professor Abramson said in the oral history, adding: “And that was fine with me. I was too busy surfing to worry about that sort of thing.”

. . .

Some of the data-networking techniques developed by Professor Abramson and his Hawaii team proved valuable not only in wireless communications but also in wired networks. One heir to his work was Robert Metcalfe, who in 1973 was a young computer scientist working at Xerox PARC, a Silicon Valley research laboratory that had become a fount of personal computer innovations.

Mr. Metcalfe was working on how to enable personal computers to share data over wired office networks. He had read a 1970 paper, written by Professor Abramson, describing ALOHAnet’s method for transmitting and resending data over a network.

“Norm kindly invited me to spend a month with him at the University of Hawaii to study ALOHAnet,” Mr. Metcalfe recalled in an email.

Mr. Metcalfe and his colleagues at Xerox PARC adopted and tweaked the ALOHAnet technology in creating Ethernet office networking. Later, Mr. Metcalfe founded an Ethernet company, 3Com, which thrived as the personal computer industry grew.

“Norm, thank you,” Mr. Metcalfe concluded in his email. “Aloha!”

For the full obituary, see:

Steve Lohr. “Norman Abramson, a Pioneer Behind Wireless Networking, Is Dead at 88.” The New York Times (Saturday, December 12, 2020): B11.

(Note: ellipses added.)

(Note: the online version of the obituary has the date Dec. 11, 2020, and has the title “Norman Abramson, Pioneer Behind Wireless Networks, Dies at 88.”)

Early Animation “Followed Only One Rule”: “Anything Goes”

(p. C5) The story of Disney Studios is a central strand in Mitenbuler’s narrative; Disney became the formidable force that the other animation studios would look toward, compete with and rail against. Max Fleischer, whose studio was responsible for the likes of Popeye and Betty Boop, groused that Disney’s “Snow White,” released in 1937, was “too arty.” . . . The wife of one of the Fleischer brothers, though, said they had better watch out: “Disney is doing art, and you guys are still slapping characters on the butt with sticks!”

But what if those slapped butts were part of what had made animation so revolutionary in the first place? Mitenbuler suggests as much, beginning “Wild Minds” with the early days of animation, in the first decades of the 20th century, when the technology of moving pictures was still in its infancy. Like the movie business in general, the field of animation contained few barriers to entry, and a number of Jewish immigrants shut out from other careers found they could make a decent living working for a studio or opening up their own. Even Disney, who grew up in the Midwest, was an outsider without any connections.

The work created in those early decades was often gleefully contemptuous of anything that aspired to good taste. Until the movie studios started self-censoring in the early ’30s, in a bid to avoid government regulation, animators typically followed only one rule to the letter: Anything goes.

For the full review, see:

Jennifer Szalai. “BOOKS OF THE TIMES: Ehh, What’s Animation, Doc?” The New York Times (Thursday, December 17, 2020): C5.

(Note: ellipsis added.)

(Note: the online version of the review has the date Dec. 16, 2020, and has the title “BOOKS OF THE TIMES: ‘Fantasia,’ ‘Snow White,’ Betty Boop, Popeye and the First Golden Age of Animation.”)

The book under review is:

Mitenbuler, Reid. Wild Minds: The Artists and Rivalries That Inspired the Golden Age of Animation. New York: Atlantic Monthly Press, 2020.

Communist Dictator Kim Jong-un Is “Really Sorry” for North Koreans’ Economic Suffering

(p. A10) “Our five-year economic development plan has fallen greatly short of its goals in almost all sectors,” Mr. Kim said in his opening speech to the ruling Workers’ Party’s eighth congress that began in Pyongyang, the capital, on Tuesday [Jan. 5, 2021].

. . .

. . . he had little to show on the economic front. He apologized to his people for failing to live up to their expectations. “I am really sorry for that,” he said, appearing to fight back tears. “My efforts and sincerity have not been sufficient enough to rid our people of the difficulties in their life.”

For the full story, see:

Choe Sang-Hun. “Kim Admits That He’s Failed In His 5-Year-Plan to Rebuild North Korea’s Feeble Economy.” The New York Times (Thursday, January 7, 2021): A10.

(Note: ellipses, and bracketed date, added.)

(Note: the online version of the story was updated Jan. 15, 2021, and has the title “North Korea Party Congress Opens With Kim Jong-un Admitting Failures.”)

Wheaton Economist Seth Norton Reviews Openness to Creative Destruction

Seth Norton wrote a thorough, gracious, and enthusiastic review of my book Openness to Creative Destruction: Sustaining…

Posted by Arthur Diamond on Sunday, February 7, 2021

My book is:

Diamond, Arthur M., Jr. Openness to Creative Destruction: Sustaining Innovative Dynamism. New York: Oxford University Press, 2019.

“Increasing Minimum Wages Can Cause Some Job Loss”

(p. B6) Increasing the minimum wage could lead employers to lay off some workers in order to pay others more, said David Neumark, an economics professor at the University of California, Irvine.

“There’s a ton of research that says increasing minimum wages can cause some job loss,” he said. “Plenty workers are helped, but some are hurt.”

A 2019 Congressional Budget Office study found that a $15 federal minimum wage would increase pay for 17 million workers who earned less than that and potentially another 10 million workers who earned slightly more. According to the study’s median estimate, it would cause 1.3 million other workers to lose their jobs.

For the full story, see:

Gillian Friedman. “Base Wage Of $15 Gains In Popularity Across U.S.” The New York Times (Friday, January 1, 2021): B1 & B6.

(Note: the online version of the story has the date Dec. 31, 2020, and has the title “Once a Fringe Idea, the $15 Minimum Wage Is Making Big Gains.”)

“Hillbilly Elegy” Book (but Not the Movie) Suggests a “Culture of Poverty”

(p. C3) “Hillbilly Elegy,” published in June of 2016, attracted an extra measure of attention (and controversy) after Donald Trump’s election. It seemed to offer a firsthand report, both personal and analytical, on the condition of the white American working class.

And while the book didn’t really explain the election — Vance is reticent about his family’s voting habits and ideological tendencies — it did venture a hypothesis about how that family and others like it encountered such persistent household dysfunction and economic distress. His answer wasn’t political or economic, but cultural.

He suggests that the same traits that make his people distinctive — suspicion of outsiders, resistance to authority, devotion to kin, eagerness to fight — make it hard for them to thrive in modern American society. Essentially, “Hillbilly Elegy” updates the old “culture of poverty” thesis associated with the anthropologist Oscar Lewis’s research on Mexican peasants (and later with Daniel Patrick Moynihan’s ideas about Black Americans) and applies it to disadvantaged white communities.

Howard and Taylor mostly sidestep this argument, which has been widely criticized. They focus on the characters and their predicaments, and on themes that are likely to be familiar and accessible to a broad range of viewers. The film is a chronicle of addiction entwined with a bootstrapper’s tale — Bev’s story and J.D.’s, with Mamaw as the link between them.

But it sacrifices the intimacy, and the specificity, of those stories by pretending to link them to something bigger without providing a coherent sense of what that something might be. The Vances are presented as a representative family, but what exactly do they represent? A class? A culture? A place? A history? The louder they yell, the less you understand — about them or the world they inhabit.

For the full movie review, see:

A.O. Scott. “I Remember Bev and Mamaw.” The New York Times (Friday, November 27, 2020): C3.

(Note: the online version of the review has the date Nov. 23, 2020, and has the title “‘Hillbilly Elegy’ Review: I Remember Mamaw.”)

J.D. Vance’s book is:

Vance, J. D. Hillbilly Elegy: A Memoir of a Family and Culture in Crisis. New York: HarperCollins Publishers, 2016.