Government Regulatory Costs Impede Energy Innovation

MetcalfeRobert_National_Medal_of_Technology.jpg

Robert Metcalfe receiving the National Medal of Technology in 2003. Source of photo: http://en.wikipedia.org/wiki/Robert_Metcalfe

The author of the commentary quoted below is famous in the history of information technology. His Harvard dissertation draft on packet switching was rejected as unrealistic. So he left the academy and became the main innovator responsible for making packet switching a reality through Ethernet.
(He is also the “Metcalfe” behind “Metcalfe’s Law,” which says that the value of a network grows faster than the network’s size.)
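A standard formalization of the law (my gloss, not Metcalfe's own wording or anything from the commentary below): a network of n compatible devices supports

$$ \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^{2}}{2} $$

possible pairwise connections, so a network's potential value grows roughly as the square of its size, while the cost of adding devices grows only linearly in n.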

(p. A15) . . . new small reactors meet important criteria for nuclear power plants. With no control rods to jam, they are far safer than the old models — you might well call them nuclear batteries. By not using weapons-grade enriched fuels, they are nonproliferating. They minimize nuclear waste. And they’re economical.
. . .
As venture capitalists, we at Polaris might have invested in one or two of these fission-energy start-ups. Alas, we had to pass. The problem with their business plans wasn’t their designs, but the high costs and astronomical risks of designing nuclear reactors for certification in Washington.
The start-ups estimate that it will cost each of them roughly $100 million and five years to get their small reactor designs certified by the Nuclear Regulatory Commission. About $50 million of each $100 million would go to the commission itself. That’s a lot of risk capital for any venture-backed start-up, especially considering that not one new commercial nuclear reactor design has been approved and built in the United States for 30 years.
. . .

As we learned by building the Internet, fiercely competitive teams of research professors, graduate students, engineers, entrepreneurs and venture capitalists are the best drivers of technological innovation — not big corporations, and certainly not government bureaucracies. So, if it’s cheap and clean energy we want, we should clear the way for fission energy start-ups. We should lower the barriers at the Nuclear Regulatory Commission for the approval of new nuclear reactors, especially the new small ones. In particular, we should drop the requirement that the commission be reimbursed for reconsidering new fission reactor designs.

For the full commentary, see:
BOB METCALFE. “The New Nuclear Revolution; Safe fission power is our future — if regulators allow it.” Wall Street Journal (Weds., JUNE 24, 2009): A15.
(Note: ellipses added.)

Free-Range Pork Carries More Disease

(p. A19) Is free-range pork better and safer to eat than conventional pork? Many consumers think so. The well-publicized horrors of intensive pig farming have fostered the widespread assumption that, as one purveyor of free-range meats put it, “the health benefits are indisputable.” However, as yet another reminder that culinary wisdom is never conventional, scientists have found that free-range pork can be more likely than caged pork to carry dangerous bacteria and parasites. It’s not only pistachios and 50-pound tubs of peanut paste that have been infected with salmonella but also 500-pound pigs allowed to root and to roam pastures happily before butting heads with a bolt gun.

The study published in the journal Foodborne Pathogens and Disease that brought these findings to light last year sampled more than 600 pigs in North Carolina, Ohio and Wisconsin. The study, financed by the National Pork Board, discovered not only higher rates of salmonella in free-range pigs (54 percent versus 39 percent) but also greater levels of the pathogen toxoplasma (6.8 percent versus 1.1 percent) and, most alarming, two free-range pigs that carried the parasite trichina (as opposed to zero for confined pigs). For many years, the pork industry has been assuring cooks that a little pink in the pork is fine. Trichinosis, which can be deadly, was assumed to be history.
. . .
Let’s not forget that animal domestication has not been only about profit. It’s also been about making meat more reliably available, safer to eat and consistently flavored. The critique of conventional animal farming that pervades food discussions today is right on the mark. But it should acknowledge that raising animals indoors, fighting their diseases with medicine and feeding them a carefully monitored diet have long been basic tenets of animal husbandry that allowed a lot more people to eat a lot more pork without getting sick.

For the full commentary, see:
JAMES E. McWILLIAMS. “Free-Range Trichinosis.” The New York Times (Fri., April 10, 2009): A19.
(Note: ellipsis added.)

The Epistemological Implications of Wikipedia

WikipediaRevolutionBK.jpg

Source of book image: online version of the WSJ review quoted and cited below.

I think the crucial feature of Wikipedia is its being quick (which is what “wiki” means in Hawaiian), rather than its current open-source model. Academic knowledge arises in a slow, vetted process. Publication depends on refereeing and revision. On Wikipedia (and the web more generally), knowledge is posted first and corrected later.
In actual fact, Wikipedia’s coverage is vast, and its accuracy is high.
I speculate that Wikipedia provides clues to developing new, faster, more efficient knowledge generating institutions.
(Chris Anderson has a nice discussion of Wikipedia in The Long Tail, starting on p. 65.)

(p. A13) Until just a couple of years ago, the largest reference work ever published was something called the Yongle Encyclopedia. A vast project consisting of thousands of volumes, it brought together the knowledge of some 2,000 scholars and was published, in China, in 1408. Roughly 600 years later, Wikipedia surpassed its size and scope with fewer than 25 employees and no official editor.

In “The Wikipedia Revolution,” Andrew Lih, a new-media academic and former Wikipedia insider, tells the story of how a free, Web-based encyclopedia — edited by its user base and overseen by a small group of dedicated volunteers — came to be so large and so popular, to the point of overshadowing the Encyclopedia Britannica and many other classic reference works. As Mr. Lih makes clear, it wasn’t Wikipedia that finished off print encyclopedias; it was the proliferation of the personal computer itself.
. . .
By 2000, both Britannica and Microsoft had subscription-based online encyclopedias. But by then Jimmy Wales, a former options trader in Chicago, was already at work on what he called “Nupedia” — an “open source, collaborative encyclopedia, using volunteers on the Internet.” Mr. Wales hoped that his project, without subscribers, would generate its revenue by selling advertising. Nupedia was not an immediate success. What turned it around was its conversion from a conventionally edited document into a wiki (Hawaiian for “fast”) — that is, a site that allowed anyone browsing it to edit its pages or contribute to its content. Wikipedia was born.
The site grew quickly. By 2003, according to Mr. Lih, “the English edition had more than 100,000 articles, putting it on par with commercial online encyclopedias. It was clear Wikipedia had joined the big leagues.” Plans to sell advertising, though, fell through: The user community — Wikipedia’s core constituency — objected to the whole idea of the site being used for commercial purposes. Thus Wikipedia came to be run as a not-for-profit foundation, funded through donations.
. . .
It is clear by the end of “The Wikipedia Revolution” that the site, for all its faults, stands as an extraordinary demonstration of the power of the open-source content model and of the supremacy of search traffic. Mr. Lih observes that when “dominant encyclopedias” were still hiding behind “paid fire walls” — and some still are — Wikipedia was freely available and thus easily crawled by search engines. Not surprisingly, more than half of Wikipedia’s traffic comes from Google.

For the full review, see:
JEREMY PHILIPS. “Business Bookshelf; Everybody Knows Everything.” Wall Street Journal (Weds., March 18, 2009): A13.
(Note: ellipses added.)

The book being reviewed is:
Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World’s Greatest Encyclopedia. New York: Hyperion, 2009.

“Nothing Will Ever Be Attempted if All Possible Objections Must Be First Overcome”

(p. 23) Mr. J. R. Simplot had entered the food processing business, without any clear notion of how to produce dried onion powder or flakes. Once again he followed his lifelong precept of entrepreneurship: “When the time is right, you got to do it.” His rationale is written more elegantly in metal on a small plaque that has stood on Simplot’s desk–and has greeted him each time he pulls up his chair–for some twenty-five years: Nothing will ever (p. 24) be attempted if all possible objections must be first overcome. The objections to signing a contract for delivery of 500,000 pounds of dried, powdered, or flaked onions–without drier, pulverizer, or flaker, or any clue of how to build them–seemed altogether prohibitive. But J. R. Simplot struck when the time was right.

Source:
Gilder, George. Recapturing the Spirit of Enterprise: Updated for the 1990s. updated ed. New York: ICS Press, 1992.

Increase in Prizes to Advance Innovation

SciencePrizes2009-06-20.jpg

Source of graphic on past prizes: online version of the WSJ article quoted and cited below.

(p. A9) Are we impatient with NASA? Google offers $30 million in prizes for a better lunar lander. Do we like solving practical puzzles? InnoCentive Inc. has posted hundreds of lucrative research contests, offering cash prizes up to $1 million for problems in industrial chemistry, remote sensing, plant genetics and dozens of other technical disciplines. Perhaps we crave guilt-free fried chicken. The People for the Ethical Treatment of Animals offers a $1 million prize for the first to create test-tube poultry tissue that can be safely served for dinner.

Call it crowd-sourcing; call it open innovation; call it behavioral economics and applied psychology; it’s a prescription for progress that is transforming philanthropy. In fields from manned spaceflight to the genetics of aging, prizes may soon rival traditional research grants as a spur to innovation. “We see a renaissance in the use of prizes to solve problems,” says Tony Goland, a partner at McKinsey & Co., which recently analyzed trends in prize philanthropy.
. . .
Since 2000, private foundations and corporations have launched more than 60 major prizes, totaling $250 million in new award money, most of it focused on science, medicine, environment and technology, the McKinsey study found.
. . .
In growing numbers, corporate sponsors are embracing the prize challenge as a safe, inexpensive way to farm out product research, at a time when tight credit and business cutbacks have slowed innovation. Venture-capital investments have dropped by almost half since last year, reaching the lowest level since 1997, the National Venture Capital Association recently reported. “Here is a mechanism for off-balance-sheet risk-taking,” says Peter Diamandis, founder of the X Prize Foundation. “A corporation can put up a prize that is bold and audacious with very little downside. You only pay the winner. It is a fixed-price innovation.”

For the full article, see:
ROBERT LEE HOTZ. “SCIENCE JOURNAL; The Science Prize: Innovation or Stealth Advertising? Rewards for Advancing Knowledge Have Blossomed Recently, but Some Say They Don’t Help Solve Big Problems.” Wall Street Journal (Fri., May 8, 2009): A9.
(Note: ellipses added.)

The McKinsey study mentioned in the quotes above was funded by the Templeton Foundation, and can be downloaded from:

McKinsey & Company. “‘And the Winner Is …’ Capturing the Promise of Philanthropic Prizes.” McKinsey & Company, 2009.

(Note: ellipsis in study title is in the original.)

The Conflict Between Science and Faith

Professor Krauss is a physicist at Arizona State University.

(p. A15) My practice as a scientist is atheistic. That is to say, when I set up an experiment I assume that no god, angel or devil is going to interfere with its course; and this assumption has been justified by such success as I have achieved in my professional career. I should therefore be intellectually dishonest if I were not also atheistic in the affairs of the world.

— J.B.S. Haldane

J.B.S. Haldane, an evolutionary biologist and a founder of population genetics, understood that science is by necessity an atheistic discipline. As Haldane so aptly described it, one cannot proceed with the process of scientific discovery if one assumes a “god, angel, or devil” will interfere with one’s experiments. God is, of necessity, irrelevant in science.
Faced with the remarkable success of science to explain the workings of the physical world, many, indeed probably most, scientists understandably react as Haldane did. Namely, they extrapolate the atheism of science to a more general atheism.
While such a leap may not be unimpeachable, it is certainly rational, as Mr. McGinn pointed out at the World Science Festival. Though the scientific process may be compatible with the vague idea of some relaxed deity who merely established the universe and let it proceed from there, it is in fact rationally incompatible with the detailed tenets of most of the world’s organized religions. As Sam Harris recently wrote in a letter responding to the Nature editorial that called him an “atheist absolutist,” a “reconciliation between science and Christianity would mean squaring physics, chemistry, biology, and a basic understanding of probabilistic reasoning with a raft of patently ridiculous, Iron Age convictions.”

For the full commentary, see:

LAWRENCE M. KRAUSS. “OPINION; God and Science Don’t Mix; A scientist can be a believer. But professionally, at least, he can’t act like one.” Wall Street Journal (Fri., JUNE 26, 2009): A15.

(Note: italics in original.)

Foreign Aid to Africa “Underwrites Brutal and Corrupt Regimes”

DeadAimBK.jpg

Source of book image: online version of the WSJ review quoted and cited below.

(p. A13) It is one of the great conundrums of the modern age: More than 300 million people living across the continent of Africa are still mired in poverty after decades of effort — by the World Bank, foreign governments and charitable organizations — to lift them out of it. While a few African countries have achieved notable rates of economic growth in recent years, per-capita income in Africa as a whole has inched up only slightly since 1960. In that year, the region’s gross domestic product was about equal to that of East Asia. By 2005, East Asia’s GDP was five times higher. The total aid package to Africa, over the past 50 years, exceeds $1 trillion. There is far too little to show for it.

Dambisa Moyo, a native of Zambia and a former World Bank consultant, believes that it is time to end the charade — to stop proceeding as if foreign aid does the good that it is supposed to do. The problem, she says in “Dead Aid,” is not that foreign money is poorly spent (though much of it is) or that development programs are badly managed (though many of them are). No, the problem is more fundamental: Aid, she writes, is “no longer part of the potential solution, it’s part of the problem — in fact, aid is the problem.”
In a tightly argued brief, Ms. Moyo spells out how attempts to help Africa actually hurt it. The aid money pouring into Africa, she says, underwrites brutal and corrupt regimes; it stifles investment; and it leads to higher rates of poverty — all of which, in turn, creates a demand for yet more aid. Africa, Ms. Moyo notes, seems hopelessly trapped in this spiral, and she wants to see it break free. Over the past 30 years, she says, the most aid-dependent countries in Africa have experienced economic contraction averaging 0.2% a year.
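As a quick compounding check (my arithmetic, not the book's or the reviewer's, and assuming the 0.2% figure is an average annual growth rate), the sketch below shows what three decades of such contraction implies cumulatively:

# Assumption: "economic contraction averaging 0.2% a year" means an
# average annual growth rate of -0.2%, compounded over 30 years.
years = 30
annual_rate = -0.002
remaining = (1 + annual_rate) ** years
print(f"After {years} years: {remaining:.3f} of the starting level, "
      f"a cumulative decline of about {(1 - remaining) * 100:.1f}%")

That works out to roughly a 5.8% cumulative fall in income over the period, during which East Asia was compounding upward.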

For the full review, see:

MATTHEW REES. “Bookshelf; When Help Does Harm.” Wall Street Journal (Tues., March 17, 2009): A13.

The reference to the book under review is:
Moyo, Dambisa. Dead Aid: Why Aid Is Not Working and How There Is a Better Way for Africa. New York: Farrar, Straus and Giroux, 2009.

Durant and Studebaker Made Transition from Carriage to Car

Christensen’s theory of disruptive innovation predicts that incumbents will seldom survive a major disruption. So it is interesting that Durant and Studebaker appear to have been exceptions, since they made the transition from producing carriages to producing cars. (Willie Durant founded General Motors in 1908.)

(p. 189) In 1900, fifty-seven surviving American automobile firms, out of hundreds of contenders, produced some 4,000 cars, three-quarters of which ran on steam or electricity. Companies famous for other products were entering the fray. Among them were the makers of the Pope bicycle, the Pierce birdcage, the Peerless wringer, the Buick bathtub, the White sewing machine, and the Briscoe garbage can. All vied for the market with stationary-engine makers, machine-tool manufacturers, and spinoffs of leading carriage firms, Durant and Studebaker. Among the less promising entrants seemed a lanky young engineer from Edison Illuminating Company named Henry Ford, whose Detroit Automobile Company produced twenty-five cars and failed in 1900.

. . .
(p. 191) Willie Durant, who knew all about production and selling from his carriage business, decided it was time to move into cars after several months of driving a prototype containing David Buick’s valve-in-head engine–the most powerful in the world for its size–through rural Michigan in 1904. Within four years, Durant was to parlay his sturdy Buick vehicle into domination of the automobile industry, with a 25 percent share of the market in 1908, the year he founded General Motors.

Source:
Gilder, George. Recapturing the Spirit of Enterprise: Updated for the 1990s. updated ed. New York: ICS Press, 1992.
(Note: ellipsis added.)

Christensen’s theory is most fully expressed in:
Christensen, Clayton M., and Michael E. Raynor. The Innovator’s Solution: Creating and Sustaining Successful Growth. Boston, MA: Harvard Business School Press, 2003.

Individual Independent “Biohackers” Hope to Advance Science

ClosetLaboratory2009-06-20.jpg

“Katherine Aull’s closet laboratory in her apartment.” Source of photo and caption: online version of the WSJ article quoted and cited below.

The individual independent scientist used to play an important role in the advance of science, but over time largely disappeared as the academic scientist, supported by large institutions, became dominant. The dominance of funding from incumbent institutions may constrain major innovations, and so I have speculated that it might be beneficial to find ways to make it possible again for independent individual scholars to play important roles in science.
Astronomy is one area in which this still happens. The article quoted below points to another domain in which individual scholars might be able to make contributions.

(p. A1) In Massachusetts, a young woman makes genetically modified E. coli in a closet she converted into a home lab. A part-time DJ in Berkeley, Calif., works in his attic to cultivate viruses extracted from sewage. In Seattle, a grad-school dropout wants to breed algae in a personal biology lab.

These hobbyists represent a growing strain of geekdom known as biohacking, in which do-it-yourselfers tinker with the building blocks of life in the comfort of their own homes. Some of them buy DNA online, then fiddle with it in hopes of curing diseases or finding new biofuels.
. . .
Ms. Aull, 23 years old, is designing a customized E. coli in the closet of her Cambridge, Mass., apartment, hoping to help with cancer research.
She’s got a DNA “thermocycler” bought on eBay for $59, and an incubator made by combining a styrofoam box with a heating device meant for an iguana cage. A few months ago, she talked about her hobby on DIY Bio, a Web site frequented by biohackers, and her work was noted in New Scientist magazine.
. . .
(p. A14) Phil Holtzman, a college student and part-time DJ at dance parties in Berkeley, Calif., is growing viruses in his attic that he thinks could be useful in medicine someday. Using pipettes and other equipment borrowed from his community college, he extracts viruses called bacteriophage from sewage and grows them in petri dishes. Mr. Holtzman’s goal: Breed them to survive the high temperatures of the human body, where he thinks they might be useful in killing bad bacteria.
He collects partly treated sewage water from a network of underground tunnels in the Berkeley area, jumping a chain-link fence to get to the source. But Mr. Holtzman says his roommates are “really uncomfortable” with him working with sewage water, so he’s trying to find another source of bacteriophage.

For the full story, see:
JEANNE WHALEN. “In Attics and Closets, ‘Biohackers’ Discover Their Inner Frankenstein; Using Mail-Order DNA and Iguana Heaters, Hobbyists Brew New Life Forms; Is It Risky?” Wall Street Journal (Tues., May 12, 2009): A1 & A14.
(Note: ellipses added.)

“Build a Wall Around the Welfare State”

For a long time, I’ve been meaning to post a pithy comment on immigration policy from the Cato Institute’s Bill Niskanen.
The comment was related to the proposal to erect a wall between the United States and Mexico in order to reduce illegal immigration. Some libertarians favor open immigration. Others believe that so long as we have a large welfare state, open immigration would impose high costs on the taxpayer and thereby reduce economic growth. (I believe I read Milton Friedman supporting this latter position in the year or two before he died in 2006.)
In this context, Niskanen’s pithy comment has appeal:

“Build a wall around the welfare state, not around the country.”

Source:
William A. Niskanen on 11/19/07 at the meetings of the Southern Economic Association in New Orleans.

Time Diary Studies Show Most Work Fewer Hours than Reported

OverworkLongNoseCartoon.jpg

Source of caricature: online version of the WSJ article quoted and cited below.

(p. W13) Sociologists have been studying how Americans spend their time for decades. One camp favors a simple approach: if you want to know how many hours someone works, sleeps or vacuums, you ask him. Another camp sees a flaw in this method: People lie. We may not do so maliciously, but it’s tough to remember our exact workweek or average time spent dishwashing, and in the absence of concrete memories, we’re prone to lie in ways that don’t disappear into the randomness of thousands of answers. They actually skew results.

That’s the theory behind the American Time Use Survey, conducted annually by the Bureau of Labor Statistics. The ATUS, like a handful of previous academic surveys, is a “time diary” study. For these studies, researchers either walk respondents through the previous day, asking them what they did next and reminding them of the realities of time and physics, or in some cases giving them a diary to record the next day or week.
Time-diary studies are laborious, but in general they are more accurate. Aggregated, they paint a different picture of life than the quick-response surveys featured in the bulk of America’s press releases. For instance, the National Sleep Foundation claims that Americans sleep 6.7 hours (weekdays) to 7.1 hours (weekends) per night. The ATUS puts the average at 8.6 hours. The first number suggests rampant sleep deprivation. The latter? Happy campers.
The numbers are equally striking with work. Back in the 1990s, using 1985 data, researchers John Robinson and his colleagues compared people’s estimated workweeks with time-diary hours. They found that, on average, people claiming to work 40 to 44 hours per week were working 36.2 hours — not far off. But then, as estimated work hours rose, reality and perception diverged more sharply. You can guess in which direction. Those claiming to work 60- to 64-hour weeks actually averaged 44.2 hours. Those claiming 65- to 74-hour workweeks logged 52.8 hours, and those claiming workweeks of 75 hours or more worked, on average, 54.9 hours. I contacted Prof. Robinson recently to ask for an update. His 2006-07 comparisons were tighter — but, still, people claiming to work 60 to 69 hours per week clocked, on average, 52.6 hours, while those claiming 70-, 80-hour or greater weeks logged 58.8. As Mr. Robinson and co-author Geoffrey Godbey wrote in their 1997 book “Time for Life,” “only rare individuals put in more than a 55-60 hour workweek.”
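To make that divergence easier to see, here is a minimal sketch in Python, using only the 1985 figures quoted above; taking the midpoint of each claimed range is my own simplifying assumption, and the open-ended 75-plus range is represented by its lower bound:

# Claimed workweek ranges vs. time-diary averages (Robinson's 1985
# data, as quoted above). Range midpoints are an assumption.
data = [
    ("40-44", 42.0, 36.2),   # claimed range, midpoint, diary hours
    ("60-64", 62.0, 44.2),
    ("65-74", 69.5, 52.8),
    ("75+",   75.0, 54.9),   # open-ended range: lower bound used
]

for claimed, midpoint, diary in data:
    gap = midpoint - diary
    print(f"claimed {claimed:>5} hrs -> diary {diary:4.1f} hrs "
          f"(overestimate of roughly {gap:4.1f} hrs)")

The implied overestimate grows from about 6 hours a week at the low end to about 20 hours at the top, which is exactly the pattern the commentary describes.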

For the full commentary, see:
LAURA VANDERKAM. “Overestimating Our Overworking.” Wall Street Journal (Fri., May 29, 2009): W13.