Biofuels Are Bad for the Planet

(p. A13) Biofuels are under siege from critics who say they crowd out food production. Now these fuels made from grass and grain, long touted as green, are being criticized as bad for the planet.
At issue is whether oil alternatives — such as ethanol distilled from corn and fuels made from inedible stuff like switch grass — actually make global warming worse through their indirect impact on land use around the world.
For example, if farmers in Brazil burn and clear more rainforest to grow food because farmers in the U.S. are using their land to grow grain for fuel, that could mean a net increase in emissions of carbon dioxide, the main “greenhouse gas” linked to climate change.
. . .
A study published in February [2008] in the journal Science found that U.S. production of corn-based ethanol increases emissions by 93%, compared with using gasoline, when expected world-wide land-use changes are taken into account. Applying the same methodology to biofuels made from switch grass grown on soil diverted from raising corn, the study found that greenhouse-gas emissions would rise by 50%.
Previous studies have found that substituting biofuels for gasoline reduces greenhouse gases. Those studies generally didn’t account for the carbon emissions that occur as farmers world-wide respond to higher food prices and convert forest and grassland to cropland.

For the full story, see:
STEPHEN POWER. “If a Tree Falls in the Forest, Are Biofuels To Blame? It’s Not Easy Being Green.” The Wall Street Journal (Tues., November 11, 2008): A13.
(Note: ellipsis, and bracketed year, added.)

Two relevant articles appeared in Science in the Feb. 29, 2008 issue:
Fargione, Joseph, Jason Hill, David Tilman, Stephen Polasky, and Peter Hawthorne. “Land Clearing and the Biofuel Carbon Debt.” Science 319, no. 5867 (Feb. 29, 2008): 1235-38.
Searchinger, Timothy, Ralph Heimlich, R. A. Houghton, Fengxia Dong, Amani Elobeid, Jacinto Fabiosa, Simla Tokgoz, Dermot Hayes, and Tun-Hsiang Yu. “Use of U.S. Croplands for Biofuels Increases Greenhouse Gases through Emissions from Land-Use Change.” Science 319, no. 5867 (Feb. 29, 2008): 1238-40.

Students Learn More in Air Conditioning

(p. 5) My first year as a public school teacher, I taught at Manhattan’s P.S. 98, which did not have air-conditioning. From mid-May until June’s end — roughly 17 percent of the school year — the temperature in my classroom hovered in the 80s and often topped 90 degrees.
Students wilted over desks. Academic gains evaporated. Even restless pencil tappers and toe wigglers grew lethargic. Absenteeism increased as children sought relief at home or outdoors. By day’s end, my hair was plastered to my face with perspiration.
It seems obvious: schools need to be cool. It’s absurd to talk about inculcating 21st-century skills in classrooms that resemble 19th-century sweatshops.
. . .
Cool schools are critical if we are to boost achievement. Studies show that concentration and cognitive abilities decline substantially after a room reaches 77 or 78 degrees. This is a lesson American businesses learned long ago. . . . A pleasant atmosphere leads to more productive employees.
. . .
It isn’t just white-collar laborers who work in cool climates. Amazon announced last year that it was spending $52 million to upgrade its warehouses with air-conditioning. Yet we can’t seem to do the same for vulnerable children, though some of the achievement gap is most likely owing to a lack of air-conditioning. One Oregon study found that students working in three different temperature settings had strikingly different results on exams, suggesting that sweating a test actually undermines performance.
Students who enjoy the luxury of air-conditioning may enjoy an unfair advantage over their hotter peers.
We are also investing enormous sums to extend the school day and school year in many locales. But these investments won’t be effective if schools are ovens.

For the full commentary, see:
SARA MOSLE. “SCHOOLING; Schools Are Not Cool.” The New York Times, SundayReview Section (Sun., June 2, 2013): 5.
(Note: ellipses added.)
(Note: the online version of the commentary has the date June 1, 2013.)

The Precautionary Principle Is Biased Against the New, and Ignores the Risks of the Old

(p. 250) In general the Precautionary Principle is biased against anything new. Many established technologies and “natural” processes have unexamined faults as great as those of any new technology. But the Precautionary Principle establishes a drastically elevated threshold for things that are new. In effect it grandfathers in the risks of the old, or the “nat-(p. 251)ural.” A few examples: Crops raised without the shield of pesticides generate more of their own natural pesticides to combat insects, but these indigenous toxins are not subject to the Precautionary Principle because they aren’t “new.” The risks of new plastic water pipes are not compared with the risks of old metal pipes. The risks of DDT are not put in context with the old risks of dying of malaria.

Source:
Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

Millions Die Due to Precautionary Principle Ban of DDT

(p. 248) . . . , malaria infects 300 million to 500 million people worldwide, causing 2 million deaths per year. It is debilitating to those who don’t die and leads to cyclic poverty. But in the 1950s the level of malaria was reduced by 70 percent by spraying the insecticide DDT around the insides of homes. DDT was so successful as an insecticide that farmers eagerly sprayed it by the tons on cotton fields–and the molecule’s by-products made their way into the water cycle and eventually into fat cells in animals. Biologists blamed it for a drop in reproduction rates for some predatory birds, as well as local die-offs in some fish and aquatic life species. Its use and manufacture were banned in the United States in 1972. Other countries followed suit. Without DDT spraying, however, malaria cases in Asia and Africa began to rise again to deadly pre-1950s levels. Plans to reintroduce programs for household spraying in malarial Africa were blocked by the World Bank and other aid agencies, who refused to fund them. A treaty signed in 2001 by 91 countries and the EU agreed to phase out DDT altogether. They were relying on the precautionary principle: DDT was probably bad; better safe than sorry. In fact DDT had never been shown to hurt humans, and the environmental harm from the minuscule amounts of DDT applied in homes had not been measured. But nobody could prove it did not cause harm, despite its proven ability to do good.

Source:
Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.
(Note: ellipsis added.)

Why Wind Power Has Not Replaced, and Will Not Replace, a Single Conventional Power Plant

(p. A17) After decades of federal subsidies–almost $24 billion according to a recent estimate by former U.S. Sen. Phil Gramm–nowhere in the United States, or anywhere else, has an array of wind turbines replaced a single conventional power plant. Nowhere.
But wind farms do take up space. The available data from wind-power companies, with which the Environmental Protection Agency agrees, show that the most effective of them can generate about five kilowatts per acre. This means 300 square miles of land–192,000 acres–are necessary to generate the 1,000 megawatts (a billion watts) of electricity that a conventional power plant using coal, nuclear energy or natural gas can generate on a few hundred acres. A billion watts fulfills the average annual power demand of a city of 700,000.
. . .
The promise that wind and solar power could replace conventional electricity production never really made sense. It’s known to everybody in the industry that a wind turbine will generate electricity 30% of the time–but it’s impossible to predict when that time will be. A true believer might be willing to do without electricity when the wind is not blowing, but most people will not. And so, during the 30% of the time the blades are spinning, conventional power plants are also spinning on low, waiting to operate during the other 70% of the time.
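The land-use figures in the excerpt above can be checked with a short arithmetic sketch. It takes the commentary's stated power density (about five kilowatts per acre) and plant size (1,000 megawatts) at face value and converts to acres and square miles; the result lands close to the article's "300 square miles of land–192,000 acres."

```python
# Check the land-use arithmetic quoted from the commentary:
# wind farms generate roughly 5 kW per acre; a conventional plant
# produces 1,000 MW (one billion watts).
ACRES_PER_SQ_MILE = 640

kw_per_acre = 5.0
plant_output_kw = 1_000_000  # 1,000 MW expressed in kilowatts

acres_needed = plant_output_kw / kw_per_acre
sq_miles = acres_needed / ACRES_PER_SQ_MILE

print(f"{acres_needed:,.0f} acres = about {sq_miles:,.1f} square miles")
# Roughly 200,000 acres, or about 312 square miles -- in the same
# ballpark as the article's figure of 300 square miles (192,000 acres).
```

The small gap between 312 and 300 square miles suggests the article rounded one of the inputs; the order of magnitude is what matters for the argument.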

For the full commentary, see:
JAY LEHR. “OPINION; The Rationale for Wind Power Won’t Fly; Physical limitations will keep this energy source a niche provider of U.S. electricity needs.” The Wall Street Journal (Tues., June 18, 2013): A17.
(Note: ellipsis added.)
(Note: the online version of the commentary has the date June 17, 2013.)

Mainstream Climatologists Lower Best Guess Estimates of Global Warming (and Find High End Estimates “Pretty Implausible”)

(p. D1) Since 1896, scientists have been trying to answer a deceptively simple question: What will happen to the temperature of the earth if the amount of carbon dioxide in the atmosphere doubles?
Some recent scientific papers have made a splash by claiming that the answer might not be as bad as previously feared. This work — if it holds up — offers the tantalizing possibility that climate change might be slow and limited enough that human society could adapt to it without major trauma.
. . .
In 1979, after two decades of meticulous measurements had made it clear that the carbon dioxide level was indeed rising, scientists used computers and a much deeper understanding of the climate to calculate a likely range of warming. They found that the response to a doubling of carbon dioxide would not be much below three degrees Fahrenheit, nor was it likely to exceed eight degrees.
In the years since, scientists have been (p. D6) pushing and pulling within that range, trying to settle on a most likely value. Most of those who are expert in climatology subscribe to a best-estimate figure of just over five degrees Fahrenheit.
. . .
What’s new is that several recent papers have offered best estimates for climate sensitivity that are below four degrees Fahrenheit, rather than the previous best estimate of just above five degrees, and they have also suggested that the highest estimates are pretty implausible.
Notice that these recent calculations fall well within the long-accepted range — just on the lower end of it.

For the full story, see:
JUSTIN GILLIS. “BY DEGREES; A Change in Temperature.” The New York Times (Tues., May 14, 2013): D1 & D6.
(Note: ellipses added.)
(Note: the online version of the article has the date May 13, 2013.)

We Should Disenthrall Ourselves of False Scientific Certainties


Source of book image: http://2.bp.blogspot.com/-ELpfH2bTO7c/Tb53WpKuDxI/AAAAAAAADrE/Zq8BQiiasJc/s640/An+Optimists+Tour+of+the+Future+Cover.jpg

(p. C4) Among the scientific certainties I have had to unlearn: that upbringing strongly shapes your personality; that nurture is the opposite of nature; that dietary fat causes obesity more than dietary carbohydrate; that carbon dioxide has been the main driver of climate change in the past.

I came across a rather good word for this kind of unlearning–“disenthrall”–in Mark Stevenson’s book “An Optimist’s Tour of the Future,” published just this week. Mr. Stevenson borrows it from Abraham Lincoln, whose 1862 message to Congress speaks of disenthralling ourselves of “the dogmas of the quiet past” in order to “think anew.”
Mr. Stevenson’s disenthrallment comes in the course of a series of sharp and fascinating interviews with technological innovators and scientific visionaries. This disenthralls him of the pessimism about the future and nostalgia about the past that he barely realized he had and whose “fingers reach deep into [his] soul.” It eventually turns him into an optimist almost as ludicrously sanguine about the 21st century as I am: “I steadfastly refuse to believe that human society can’t grow, improve and learn; that it can’t embrace change and remake the world better.”
Along the way, Mr. Stevenson is struck by other examples of how the way he thinks and reasons is “in thrall to a world that is passing.” The first of these bad habits is linear thinking about the future. . . .
We expect to see changes coming gradually, but because things like computing power or the cheapness of genome sequencing change exponentially, technologies can go from impossible to cheap quite suddenly and with little warning.
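Stevenson's point about linear thinking can be illustrated numerically. The sketch below uses hypothetical numbers (a $1,000,000 starting cost and a halving every year, both my assumptions, not figures from the book) to contrast a cost falling linearly with one falling exponentially, as with computing power or genome sequencing.

```python
# Hypothetical illustration: a cost that declines linearly versus one
# that halves every year (exponential decline). The starting cost and
# the 20-year horizon are illustrative assumptions only.
start_cost = 1_000_000  # hypothetical starting cost in dollars
years = 20

# Linear: lose a fixed 1/20th of the starting cost each year.
linear = [start_cost - (start_cost / years) * t for t in range(years + 1)]
# Exponential: halve the cost each year.
exponential = [start_cost * 0.5 ** t for t in range(years + 1)]

for t in (0, 5, 10, 15, 20):
    print(f"year {t:2d}: linear ${linear[t]:>12,.0f}   exponential ${exponential[t]:>14,.2f}")
```

After 20 halvings the exponential cost is under a dollar while the linear projection has only just reached zero; most of the exponential drop happens early, which is why a technology can go "from impossible to cheap quite suddenly and with little warning."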

For the full commentary, see:
MATT RIDLEY. “MIND & MATTER; A Key Lesson of Adulthood: The Need to Unlearn.” The Wall Street Journal (Sat., February 5, 2011): C4.
(Note: ellipsis added.)

The book praised by Ridley, in the passages quoted above, is:
Stevenson, Mark. An Optimist’s Tour of the Future: One Curious Man Sets out to Answer “What’s Next?”. New York: Avery, 2011.

Nate Silver “Chides Environmental Activists for Their Certainty”


Source of book image: http://si.wsj.net/public/resources/images/OB-US032_bkrvno_GV_20120924132722.jpg

(p. 12) In recent years, the most sophisticated global-warming skeptics have seized on errors in the forecasts of the United Nations’ Intergovernmental Panel on Climate Change (I.P.C.C.) in order to undermine efforts at greenhouse gas reduction. These skeptics note that global temperatures have increased at only about half the rate the I.P.C.C. predicted in 1990, and that they flatlined in the 2000s (albeit after rising sharply in the late ’90s).

Silver runs the numbers to show that the past few decades of data are still highly consistent with the hypothesis of man-made global warming. He shows how, at the rate that carbon dioxide is accumulating, a single decade of flat temperatures is hardly invalidating. On the other hand, Silver demonstrates that projecting temperature increases decades into the future is a dicey proposition. He chides some environmental activists for their certainty — observing that overambitious predictions can undermine a cause when they don’t come to pass . . .

For the full review, see:
NOAM SCHEIBER. “Known Unknowns.” The New York Times Book Review (Sun., November 4, 2012): 12.
(Note: ellipsis added.)
(Note: the online version of the review has the date November 2, 2012.)

The book under review is:
Silver, Nate. The Signal and the Noise: Why So Many Predictions Fail — but Some Don’t. New York: The Penguin Press, 2012.

Amish Factory Uses Pneumatics in Place of Electricity

(p. 219) The Amish also make a distinction between technology they have at work and technology they have at home. I remember an early visit to an Amish man who ran a woodworking shop near Lancaster, Pennsylvania. . . .
. . .
(p. 220) While the rest of his large workshop lacked electricity beyond that naked bulb, it did not lack power machines. The place was vibrating with an ear-cracking racket of power sanders, power saws, power planers, power drills, and so on. Everywhere I turned there were bearded men covered in sawdust pushing wood through screaming machines. This was not a circle of Renaissance craftsmen hand-tooling masterpieces. This was a small-time factory cranking out wooden furniture with machine power. But where was the power coming from? Not from windmills.
Amos took me around to the back where a huge SUV-sized diesel generator sat. It was massive. In addition to a gas engine there was a very large tank, which, I learned, stored compressed air. The diesel engine burned petroleum fuel to drive the compressor that filled the reservoir with pressure. From the tank, a series of high-pressure pipes snaked off toward every corner of the factory. A hard rubber flexible hose connected each tool to a pipe. The entire shop ran on compressed air. Every piece of machinery was running on pneumatic power. Amos even showed me a pneumatic switch, which he could flick like a light switch to turn on some paint-drying fans running on air.
The Amish call this pneumatic system “Amish electricity.” At first, pneumatics were devised for Amish workshops, but air power was seen as so useful that it migrated to Amish households. In fact, there is an entire cottage industry in retrofitting tools and appliances to run on Amish electricity. The retrofitters buy a heavy-duty blender, say, and yank out the electrical motor. They then substitute an air-powered motor of appropriate size, add pneumatic connectors, and bingo, your Amish mom now has a blender in her electricity-less kitchen. You can get a pneumatic sewing machine and a pneumatic washer/dryer (with propane heat). In a display of pure steam-punk (air-punk?) nerdiness, Amish hackers try to outdo one another in building pneumatic versions of electrified contraptions. Their mechanical skill is quite impressive, particularly since none went to school beyond the eighth grade. They (p. 221) love to show off their geekiest hacks. And every tinkerer I met claimed that pneumatics were superior to electrical devices because air was more powerful and durable, outlasting motors that burned out after a few years of hard labor. I don’t know if this claim of superiority is true or merely a justification, but it was a constant refrain.

Source:
Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.
(Note: ellipses added.)

If Anarcho-Primitives Destroy Civilization, Billions of City-Dwellers Will Die

(p. 211) . . . , the . . . problem with destroying civilization as we know it is that the alternative, such as it has been imagined by the self-described “haters of civilization,” would support but a fraction of the people alive today. In other words, the collapse of civilization would kill billions. Ironically, the poorest rural inhabitants would fare the best, as they could retreat to hunting and gathering with the least trouble, but billions of urbanites would die within months or even weeks, once food ran out and disease took over. The anarcho-primitives are rather sanguine about this catastrophe, arguing that accelerating the collapse early might save lives in total.

Source:
Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.
(Note: ellipses added.)

Organic Animals Cause More Global Warming than Non-Organic Animals


Source of book image: http://si.wsj.net/public/resources/images/OB-EH374_justfo_DV_20090821150506.jpg

(p. A23) Grass-grazing cows emit considerably more methane than grain-fed cows. Pastured organic chickens have a 20 percent greater impact on global warming. It requires 2 to 20 acres to raise a cow on grass. If we raised all the cows in the United States on grass (all 100 million of them), cattle would require (using the figure of 10 acres per cow) almost half the country’s land (and this figure excludes space needed for pastured chicken and pigs). A tract of land just larger than France has been carved out of the Brazilian rain forest and turned over to grazing cattle. Nothing about this is sustainable.

Advocates of small-scale, nonindustrial alternatives say their choice is at least more natural. Again, this is a dubious claim. Many farmers who raise chickens on pasture use industrial breeds that have been bred to do one thing well: fatten quickly in confinement. As a result, they can suffer painful leg injuries after several weeks of living a “natural” life pecking around a large pasture. Free-range pigs are routinely affixed with nose rings to prevent them from rooting, which is one of their most basic instincts. In essence, what we see as natural doesn’t necessarily conform to what is natural from the animals’ perspectives.
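McWilliams's grazing-land arithmetic can be checked with a short sketch. The cattle count (100 million) and the 10-acres-per-cow figure come from the excerpt; the total U.S. land area of roughly 2.3 billion acres is my assumption, used here only to test the "almost half the country's land" claim.

```python
# Check the grazing-land arithmetic quoted above: 100 million cattle
# at 10 acres per grass-fed cow, compared against total U.S. land area.
US_LAND_ACRES = 2.3e9  # approximate total U.S. land area (assumption)

cows = 100_000_000
acres_per_cow = 10  # midpoint figure used in the commentary

grazing_acres = cows * acres_per_cow
share = grazing_acres / US_LAND_ACRES
print(f"{grazing_acres:,} acres, about {share:.0%} of U.S. land")
# One billion acres works out to a bit over 40 percent of U.S. land,
# consistent with the commentary's "almost half the country's land."
```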

For the full commentary, see:
JAMES E. McWILLIAMS. “The Myth of Sustainable Meat.” The New York Times (Fri., April 13, 2012): A23.
(Note: the online version of the commentary has the date April 12, 2012.)

McWilliams’ book on related issues is:
McWilliams, James E. Just Food: Where Locavores Get It Wrong and How We Can Truly Eat Responsibly. New York: Little, Brown and Company, 2009.