FDR’s Taxes Deepened the Great Depression

Professor Ohanian is a UCLA economist well-known for his research on the Great Depression. Below I quote a few of his recent observations (with co-author Cooley):

(p. A17) In 1937, after several years of partial recovery from the Great Depression, the U.S. economy fell into a sharp recession. The episode has become a lightning rod in the ongoing debate about whether the economy needs further increases in government spending to keep employment from declining even more.
. . .
The economy did not tank in 1937 because government spending declined. Increases in tax rates, particularly capital income tax rates, and the expansion of unions, were most likely responsible. Unfortunately, these same factors pose a similar threat today.
. . .
. . . in 1936, the Roosevelt administration pushed through a tax on corporate profits that were not distributed to shareholders. The sliding scale tax began at 7% if a company retained 1% of its net income, and went to 27% if a company retained 70% of net income. This tax significantly raised the cost of investment, as most investment is financed with a corporation’s own retained earnings.
The tax rate on dividends also rose to 15.98% in 1932 from 10.14% in 1929, and then doubled again by 1936. Research conducted last year by Ellen McGrattan of the Federal Reserve Bank of Minneapolis suggests that these increases in capital income taxation can account for much of the 26% decline in business fixed investment that occurred in 1937-1938.
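
To make the quoted schedule concrete, below is a minimal sketch, in Python, of how such a sliding-scale surtax raises the cost of investment financed out of retained earnings. Only the two endpoints (7% at 1% retention, 27% at 70% retention) and the dividend-tax figures come from the passage above; the linear interpolation between the endpoints is my own illustrative assumption, since the actual 1936 statute's stepped brackets are not quoted.

```python
# Hypothetical sketch of the sliding-scale undistributed-profits surtax
# described in the quoted passage. Only the two endpoints come from the
# article; the linear interpolation between them is an assumption for
# illustration, not the actual 1936 bracket schedule.

def undistributed_profits_rate(retained_share):
    """Approximate surtax rate given the share of net income retained."""
    lo_share, lo_rate = 0.01, 0.07  # quoted endpoint: 7% at 1% retained
    hi_share, hi_rate = 0.70, 0.27  # quoted endpoint: 27% at 70% retained
    if retained_share <= lo_share:
        return lo_rate
    if retained_share >= hi_share:
        return hi_rate
    frac = (retained_share - lo_share) / (hi_share - lo_share)
    return lo_rate + frac * (hi_rate - lo_rate)

# A firm financing investment by retaining half of its net income:
print(f"surtax at 50% retention: {undistributed_profits_rate(0.50):.1%}")
# -> roughly 21%, a large new wedge on internally financed investment

# The quoted dividend-tax path: 10.14% (1929) -> 15.98% (1932),
# then "doubled again by 1936," i.e., roughly:
print(f"implied 1936 dividend tax rate: {2 * 15.98:.2f}%")  # 31.96%
```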

For the full commentary, see:
THOMAS F. COOLEY AND LEE E. OHANIAN. "Gates and Buffett Take the Pledge; Wealthy businessmen often feel obligated to 'give back.' Who says they've taken anything?" The Wall Street Journal (Fri., AUGUST 20, 2010): A17.
(Note: ellipses added.)

That McGrattan paper is:
McGrattan, Ellen R. “Capital Taxation During the U.S. Great Depression.” Working Paper 670, Federal Reserve Bank of Minneapolis, April 2009.

Twitter CEO Returned to Nebraska to Found First Company

Evan Williams, Twitter CEO. Source of photo: online version of the NYT article quoted and cited below.

(p. 9) I GREW up on a farm in Nebraska, where we grew mostly corn and soybeans. During the summers I was responsible for making sure the crops were irrigated.

After high school, I enrolled at the University of Nebraska at Lincoln, but I stayed only a year and a half. I felt college was a waste of time; I wanted to start working. I moved to Florida, where I did some freelance copywriting. After that I moved to Texas and stayed with my older sister while I figured out what to do next. In 1994, I returned to Nebraska and started my first company with my dad.
We didn’t know anything about the Internet, but I thought it was going to be a big deal. We produced CD-ROMs and a video on how to use the Internet, and we did some Web hosting. I recruited some friends and we tossed around some ideas, but none of us knew how to write software and we didn’t have much money. We watched what entrepreneurs in California were doing and tried to play along.
. . .
My life has been a series of well-orchestrated accidents; I’ve always suffered from hallucinogenic optimism. I was broke for more than 10 years. I remember staying up all night one night at my first company and looking in couch cushions the next morning for some change to buy coffee. I’ve been able to pay my father back, which is nice, and my mother doesn’t worry about me as much since I got married a year and a half ago.

For the full story, see:
EVAN WILLIAMS. “The Boss; For Twitter C.E.O., Well-Orchestrated Accidents.” The New York Times, SundayBusiness Section (Sun., March 8, 2009): 9.
(Note: the online version of the story is dated March 7, 2009.)

Cultures that Excel at the Practical Often Also Excel at the Sublime

According to the reasoning of the following passages, the same Cro-Magnons who created the wonderful cave paintings at Lascaux were also the ones who created the highly effective laurel leaf projectile points.
It is often believed that the practical is in conflict with the sublime. The Solutreans may be one more example, in addition to that of entrepreneurial capitalism, showing that cultures that excel at the practical also excel at the sublime.
[The passages I quote are somewhat disjointed, so let me sketch how they fit together. The first sentence asserts that the Lascaux cave paintings are the prehistoric equal of the Sistine Chapel. The second passage describes the Solutreans' highly practical laurel leaf projectile points. The final sentence asserts that the same Solutrean culture that invented the practical points also painted the sublime cave at Lascaux.]

(p. 219) Lascaux had been sealed since the late Ice Age, so what the Abbé Henri Breuil soon called "the Sistine Chapel of Prehistory" was intact.
. . .
(p. 221) . . . The seasonal killing at Solutré resumed, but now the prey was reindeer rather than horses. This time, too, the hunters used not only bone-pointed spears but also weapons bearing what French archaeologists rather elegantly call feuilles de laurier, "laurel leaves" . . . . These beautifully made stone projectile points do indeed look like idealized laurel leaves and stand out as exotic in otherwise unchanging tool kits of bone artifacts, burins, and scrapers. Those skilled enough to fabricate them had mastered a new (p. 222) stoneworking technology, which involved using an antler billet to squeeze off shallow flakes by applying sharp pressure along the edges of a blade. This technique–pressure flaking–produced thin, beautifully shaped yet functional spear points that were both lethal and lovely to look upon. Sometimes, the stoneworkers made what one might call rudimentary versions of the points using pressure flaking on but one side of the tool. On occasion, too, they made spearheads with a shoulder that served as the mount for the shaft. But the ultimate was the classic laurel leaf, flaked on both sides, beautifully regular and thin. Feuilles de laurier were never common, and indeed, some researchers wonder if they were, in fact, ceremonial tools and never used in the field. This seems unlikely, for they would have made tough, effective weapons for killing prey like reindeer.
. . .
If the Lascaux chronology is to be believed–and remember that the radiocarbon dates come from artifacts in the cave, not actual paintings–then Solutreans were the artists who painted there, . . .

Source:
Fagan, Brian. Cro-Magnon: How the Ice Age Gave Birth to the First Modern Humans. New York: Bloomsbury Press, 2010.
(Note: ellipses added; italics in original.)

“A Very Clear-Thinking Heretic” Doubted Big Bang Theory

"Geoffrey Burbidge's work in astronomy changed the field." Source of caption and photo: online version of the NYT obituary quoted and cited below.

(p. 26) Geoffrey Burbidge, an English physicist who became a towering figure in astronomy by helping to explain how people and everything else are made of stardust, died on Jan. 26 in San Diego. He was 84.
. . .
Dr. Burbidge’s skepticism extended to cosmology. In 1990, he and four other astronomers, including Drs. Arp and Hoyle, published a broadside in the journal Nature listing arguments against the Big Bang.
Dr. Burbidge preferred instead a version of Dr. Hoyle’s Steady State theory of an eternal universe. In the new version, small, local big bangs originating in the nuclei of galaxies every 20 billion years or so kept the universe boiling. To his annoyance, most other astronomers ignored this view.
In a memoir in 2007, Dr. Burbidge wrote that this quasi-steady state theory was probably closer to the truth than the Big Bang. But he added that “there is such a heavy bias against any minority point of view in cosmology that it may take a very long time for this to occur.”
Despite his contrarian ways, Dr. Burbidge maintained his credibility in the astronomical establishment, serving as director of Kitt Peak from 1978 to 1984 and editing the prestigious Annual Review of Astronomy and Astrophysics for more than 30 years. He was “a very clear-thinking heretic,” Dr. Strittmatter said.

For the full obituary, see:
DENNIS OVERBYE. "Geoffrey Burbidge, Who Traced Life to Stardust, Is Dead at 84." The New York Times, First Section (Sun., February 7, 2010): 26.
(Note: ellipsis added.)
(Note: the online version of the obituary is dated February 6, 2010.)

Successful Entrepreneurs Do Not Need to Give Back to Society—They Already Gave at the Office

(p. A15) Successful entrepreneurs-turned-philanthropists typically say they feel a responsibility to “give back” to society. But “giving back” implies they have taken something. What, exactly, have they taken? Yes, they have amassed great sums of wealth. But that wealth is the reward they have earned for investing their time and talent in creating products and services that others value. They haven’t taken from society, but rather enriched us in ways that were previously unimaginable.
. . .
Let’s hope the philanthropy of those who . . . sign the Giving Pledge achieves great things. But let’s not fool ourselves into thinking that businessmen are likely to achieve more by giving their money away than they have by making it in the first place.

For the full commentary, see:
Kimberly O. Dennis. “Gates and Buffett Take the Pledge; Wealthy businessmen often feel obligated to ‘give back.’ Who says they’ve taken anything?” The Wall Street Journal (Fri., AUGUST 20, 2010): A15.
(Note: ellipses added.)

French Utopian Planned Community Goes Up in Flames

"The planned neighborhood Villeneuve, in Grenoble, has slowly degraded into a poor district before it finally burst into flames three weeks ago, with a mob setting nearly 100 cars on fire." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. A7) GRENOBLE, France — A utopian dream of a new urban community, built here in the 1970s, had slowly degraded into a poor neighborhood plagued by aimless youths before it finally burst into flames three weeks ago.

After Karim Boudouda, a 27-year-old of North African descent, and some of his friends had robbed a casino, he was killed in an exchange of automatic gunfire with the police. The next night, Villeneuve, a carefully planned neighborhood of Grenoble in eastern France, exploded. A mob set nearly 100 cars on fire, wrecked a tram car and burned an annex of city hall.
. . .
Villeneuve, or “new city,” emerged directly out of the social unrest of the May 1968 student uprising.
People committed to social change, from here as well as from Paris and other cities, came to create a largely self-contained neighborhood of apartment buildings, parks, schools, and health and local services in this city of 160,000 people, at the spectacular juncture of two rivers and three mountain ranges at the foot of the French Alps.

For the full story, see:
STEVEN ERLANGER. "Grenoble Journal; Utopian Dream Becomes Battleground in France." The New York Times (Mon., August 9, 2010): A7.
(Note: ellipsis added.)
(Note: the online version of the story is dated August 8, 2010.)

Neanderthal “Innovation Was Rare”

(p. 42) Judging from slowly changing styles of stone axes, innovation was rare and technological change almost imperceptible. The rhythm of daily life varied little from one generation to the next, just as the lives of animals followed predictable and familiar paths of migration and dispersal, life and death. Humans were collaborative predators among predators, both hunters and the hunted, effective at survival thanks to their expertise with wooden spears, their stalking ability, and their painfully acquired knowledge of animals and plants. And, over two hundred millennia, they gradually evolved into the Neanderthals, the primordial Europeans encountered by the Cro-Magnons.

Source:
Fagan, Brian. Cro-Magnon: How the Ice Age Gave Birth to the First Modern Humans. New York: Bloomsbury Press, 2010.

Government Import Quotas Increase Price of Sugar in U.S.

Source of graph: online version of the WSJ article quoted and cited below.

(p. A1) The gap between what Americans and the rest of the world pay for sugar has reached its widest level in at least a decade, breathing new life into the battle over import quotas that prop up the price of the sweet stuff in the U.S.

For years, U.S. prices have been artificially inflated by import restrictions designed to protect American farmers. That has kept the price well above the global market.
But in recent days, the difference between the two has ballooned, giving new impetus to U.S. sugar processors and confectioners to step up their long campaign to pressure the government to increase import limits.
Attention to sugar prices, and the dwindling supply of sugar left in U.S. warehouses, has intensified in the lead up to April 1, after which the U.S. Department of Agriculture can review and change the import quotas, which now stand at 1.3 million metric tons.
Sugar users have long been vocal critics of the quotas but have failed to convince the government to change the limits. The quota has remained unchanged since it was first imposed in 1990, except for two temporary increases after Hurricane Katrina in 2005 and a major refinery explosion in 2008.

For the full story, see:
CAROLYN CUI. “Price Gap Puts Spice in Sugar-Quota Fight.” The Wall Street Journal (Mon., MARCH 15, 2010): A1 & A20.

Christensen’s Innovator’s Dilemma Is “Most Influential Business Book”

(p. W3) . . . in today’s world, gale-like market forces–rapid globalization, accelerating innovation, relentless competition–have intensified what economist Joseph Schumpeter called the forces of “creative destruction.”
. . .
When I asked members of The Wall Street Journal’s CEO Council, a group of chief executives who meet each year to deliberate on issues of public interest, to name the most influential business book they had read, many cited Clayton Christensen’s “The Innovator’s Dilemma.” That book documents how market-leading companies have missed game-changing transformations in industry after industry–computers (mainframes to PCs), telephony (landline to mobile), photography (film to digital), stock markets (floor to online)–not because of “bad” management, but because they followed the dictates of “good” management. They listened closely to their customers. They carefully studied market trends. They allocated capital to the innovations that promised the largest returns. And in the process, they missed disruptive innovations that opened up new customers and markets for lower-margin, blockbuster products.

For the full commentary, see:
ALAN MURRAY. "The End of Management; Corporate bureaucracy is becoming obsolete. Why managers should act like venture capitalists." The Wall Street Journal (Sat., AUGUST 21, 2010): W3.
(Note: ellipses added.)

The most complete and current account of Christensen’s views can be found in:
Christensen, Clayton M., and Michael E. Raynor. The Innovator’s Solution: Creating and Sustaining Successful Growth. Boston, MA: Harvard Business School Press, 2003.

Harry Frankfurt’s Critique of Postmodernist “Bullshit”

"Harry G. Frankfurt." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 29) Q: Your new book, “On Truth,” is a sequel to “On Bull–,” a slim philosophical tract published by Princeton University Press that became an accidental best seller last year.
What do you mean by accidental? People didn’t know they were buying it?
. . .

In your new book, you are especially critical of academics and their theories of postmodernism, which treat all truth as an artificial construction as opposed to an independent reality.
I used to teach at Yale, which was at one time a center of postmodernist literary theory. Derrida was there. Paul de Man was there. I originally wrote the bull– essay at Yale, and a physics professor told me that it was appropriate that this essay should have been written at Yale, because, after all, he said, Yale is the bull– capital of the world.
But there is probably far more bull– in politics and entertainment than in academia.
I hope so!
What about in philosophy, which you still teach?
I think there is a certain amount of bull– in philosophy — people pretending to have important ideas when they don’t and obscuring the fact by using a lot of impenetrable language.

For the full interview, see:
DEBORAH SOLOMON. "Questions for Harry G. Frankfurt; Fighting Bull." The New York Times, Magazine Section (Sun., October 22, 2006): 29.
(Note: ellipsis added; bold in original print version, to indicate questions by Deborah Solomon.)

The reference to the first book is:
Frankfurt, Harry G. On Bullshit. Princeton, NJ: Princeton University Press, 2005.

The reference to the sequel is:
Frankfurt, Harry G. On Truth. New York: Alfred A. Knopf, 2006.

Source of book image on left: http://2.bp.blogspot.com/_T_py15A4TNY/SI9o5-lJ-3I/AAAAAAAAABc/ui9BmdO4Dns/s400/On+Bullshit.jpg

Source of book image on right: http://www.coverbrowser.com/image/bestsellers-2006/509-1.jpg

Compared to the Neanderthals, the Cro-Magnons Had “an Ongoing Culture of Innovation”

In an earlier entry, Fagan discusses the eyed needle as a key technological advantage of the Cro-Magnons over the Neanderthals. In the passage quoted below, he discusses some other key differences between the two human species.

(p. 14) We know from their art that they looked at their world with more than practical eyes, through a lens of the intangible that changed constantly over the generations. It was this symbolism, these beliefs, as much as their technological innovations and layered clothing, that gave them the decisive advantage over their neighbors in the seesawlike climatic world of the late Ice Age. There were more of them living in larger groups than there were Neanderthals, too, so there were more intense social interactions, much greater food gathering activity from an early age, and an ongoing culture of innovation that came (p. 15) from a growing sophistication of language, advances in technology, and a greater life expectancy. In a world where all knowledge passed orally from one generation to the next, this enhanced cultural buffer between the moderns and the harsh climate provided an extra, albeit sometimes fragile, layer of protection during the intense cold of the so-called Last Glacial Maximum, from 21,500 to 18,000 years ago.

Source:
Fagan, Brian. Cro-Magnon: How the Ice Age Gave Birth to the First Modern Humans. New York: Bloomsbury Press, 2010.