Maybe Fewer Women Engineers Because Fewer Women Want to Be Engineers?

I’ve slogged through enough reports from the National Academy of Sciences to know they’re often not shining examples of the scientific method.  But — call me naïve — I never thought the academy was cynical enough to publish a political tract like “Beyond Bias and Barriers,” the new report on discrimination against female scientists and engineers.

. . .

I consulted half a dozen of these experts about the report, and they all dismissed it as a triumph of politics over science.  It’s classic rent-seeking by a special-interest group that stands to get more money and jobs if the recommendations are adopted.

“I am embarrassed,” said Linda Gottfredson of the University of Delaware, “that this female-dominated panel of scientists would ignore decades of scientific evidence to justify an already disproved conclusion, namely, that the sexes do not differ in career-relevant interests and abilities.”

. . .

After decades of schools pushing girls into science and universities desperately looking for gender diversity on their faculties, it’s insulting to pretend that most female students are too intimidated to know their best interests.  As Science magazine reported in 2000, the social scientist Patti Hausman offered a simple explanation for why women don’t go into engineering:  they don’t want to.

“Wherever you go, you will find females far less likely than males to see what is so fascinating about ohms, carburetors or quarks,” Hausman said.  “Reinventing the curriculum will not make me more interested in learning how my dishwasher works.”

 

For the full commentary, see:

JOHN TIERNEY.  "Academy of P.C. Science."  The New York Times   (Tues., September 26, 2006):  A23.

 

(Note:  the title of the online version was "Academy of P.C. Sciences.")

(Note:  ellipses added.) 

“Work Alone”

  Source of book image:  http://www.mactime.ru/Environ/WebObjects/mactime.woa/2/wa/Main?textid=6114&level1=mactimes&wosid=b2qk07iEkIh6GoutH7IbVg

 

Many scholars interpret Schumpeter as believing that large firms would increasingly become the main source of innovation.  Scherer, Christensen, and many others have provided plenty of reason to doubt this belief.  Here is another reason, from one of the innovators who helped bring us the personal computer:

What emerges in "iWoz" is a chatty memoir full of surprises.  Yes, Mr. Wozniak cherishes workbench minutiae, such as his tips for connecting circuitry wires.  But he also sees this book as a chance to cut through cliché and explain himself to a larger audience.  He reveals a technology pioneer who is more charming and annoying — and whose life is more poignant — than we expected.

. . .

As Apple roared ahead, going public in 1980 and then becoming one of the 500 largest U.S. companies, Mr. Wozniak’s golden moment came to an end.  New products weren’t developed anymore by a brilliant prankster working with barely any sleep.  There were now teams, committees and market studies.

Mr. Wozniak by his own account didn’t like these changes, and he didn’t want to rise into senior management.  He hung on at Apple as a lone engineer — and he says he still collects a tiny paycheck from the company — but from the mid-1980s onward turned his attention to other things.

. . .

Fortunately, Mr. Wozniak finishes strong.  In his final chapter, he offers a bit of advice to gifted engineers:  "Work alone."  Big companies tend to stifle innovation, he explains.  It’s lonely and risky to work solo.  No matter.  "Man, it will be worth it in the end," he writes.  His life bears out the truth of that simple claim.

 

For the full review, see: 

GEORGE ANDERS.  "BOOKS; Technostalgia; Steve Wozniak looks back on the computer revolution and his role as Apple’s co-founder."  Wall Street Journal  (Sat., September 30, 2006):  P8.

(Note:  ellipses added.)

 

The reference to the book by Wozniak: 

Steve Wozniak, with Gina Smith.  iWoz.  Norton, 2006.  313 pages, $25.95.

 

JobsWozniak1977.jpg  Steve Jobs at left, and Steve Wozniak at right, in San Francisco in 1977.  Source of photo:  online version of the WSJ article cited above.

Intel Chairman Says Health Care Inefficient

 

WASHINGTON (AP) – Intel Corp. Chairman Craig Barrett said Tuesday that U.S. jobs will continue to move offshore at a rapid pace unless corporate America forces the health care industry to adopt systems that will cut costs and improve efficiency.

"Every job that can be moved out of the United States will be moved out . . . because of health care costs," which averaged more than $6,000 per person in 2004, Barrett said at a conference sponsored by eHealth Initiative, a nonprofit coalition of health information technology interest groups.

. . .

Barrett was joined on-stage by Wal-Mart Stores Inc. Executive VP Linda Dillman.  Barrett said the health care industry could learn from the efficiency of the retail giant, which tracks every item in inventory.

 

For the full story, see: 

"Health care waste costs jobs, says Intel chief."  Omaha World-Herald  (Wednesday,  September 27, 2006):  3D. 

(Note:  ellipsis in the Barrett quote, in original; ellipsis between paragraphs, added.)

 

Tech Bubble Caused Much of 1990s Inequality Increase

  Source of graphic:  online version of the NYT article cited below.

 

It is widely recognized that income inequality increased in the 1990’s, but nobody knows quite why. Despite the lack of hard evidence, there are plenty of theories.

. . .

Two University of Texas researchers, James K. Galbraith and Travis Hale, added an interesting twist to this debate in a paper, “Income Distribution and the Information Technology Bubble” (utip.gov.utexas.edu/abstract.html#UTIP27).

According to Mr. Galbraith and Mr. Hale, much of the increase in income inequality in the late 1990’s resulted from large income changes in just a handful of locations around the country — precisely those areas that were heavily involved in the information technology boom.

. . .

A big advantage of looking at county data is that it is possible to identify counties that contributed the most to the increase in income inequality from 1994 to 2000.  It turns out that the five biggest winners in this period were New York; King County, Wash. (with both Seattle and Redmond); and Santa Clara, San Mateo and San Francisco, Calif., the counties that make up Silicon Valley.  The five biggest losers were Los Angeles; Queens; Honolulu; Broward, Fla.; and Cuyahoga, Ohio.

What do the counties in the first list have in common?  Their economies were all heavily driven by information technology in the late 90’s.  This is true for the rest of the list of winners as well.  Harris, Tex. (home to Houston and Enron); Middlesex, Mass. (home to Harvard and M.I.T.); Fairfield, Conn.; Alameda, Calif.; and Westchester, N.Y., were also among the top 10 income gainers in this period.

The authors point out that half the 80 American companies in the CNET Tech Index are in those top 10 counties.  Furthermore, when income inequality decreased after 2000, the income drop in the high-tech counties contributed most to the decline. 
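
The article does not say exactly how Galbraith and Hale apportion the change in inequality across counties, but the University of Texas Inequality Project usually works with Theil’s T statistic, whose between-county component is a sum of one term per county.  Below is a minimal sketch of that kind of decomposition, assuming that approach; the county names and dollar figures are invented purely for illustration.

    # Sketch of a between-county Theil T decomposition (assumed method, not from the article).
    # counties: dict mapping county name -> (population, total_income).
    import math

    def theil_between(counties):
        total_pop = sum(pop for pop, _ in counties.values())
        total_inc = sum(inc for _, inc in counties.values())
        terms = {}
        for name, (pop, inc) in counties.items():
            inc_share = inc / total_inc
            pop_share = pop / total_pop
            # A county richer than average (inc_share > pop_share) adds a positive
            # term; a poorer-than-average county adds a negative term.
            terms[name] = inc_share * math.log(inc_share / pop_share)
        return sum(terms.values()), terms

    # Hypothetical data: one booming tech county and two others, 1994 vs. 2000.
    data_1994 = {"TechCounty": (1e6, 40e9), "CountyA": (2e6, 60e9), "CountyB": (3e6, 80e9)}
    data_2000 = {"TechCounty": (1e6, 70e9), "CountyA": (2e6, 65e9), "CountyB": (3e6, 85e9)}

    t94, terms94 = theil_between(data_1994)
    t00, terms00 = theil_between(data_2000)
    for name in data_1994:
        print(name, "change in contribution:", round(terms00[name] - terms94[name], 4))
    print("Change in between-county inequality:", round(t00 - t94, 4))

On this toy data, nearly all of the increase comes from the single tech county pulling away from the rest, which is the kind of pattern the authors report for the real high-tech counties.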

 

For the full commentary, see:

HAL R. VARIAN.  "ECONOMIC SCENE; Many Theories on Income Inequality, but One Answer Lies in Just a Few Places."  The New York Times  (Thurs., September 21, 2006):   C3.

On Turning 60, Michael Milken Asks “Why Retire?”

More baby boomers are asking themselves, Why retire?  It’s a cliché to say that 60 is the new 40, but it has some biological and psychological validity.  Advanced biomedical research is leading to continued progress against cancer, heart disease, arthritis, dementia and other conditions that forced people out of the workforce before they wanted to quit.  In the future, aging workers will be healthier and will use broadband technology to live and work from anywhere at the increasing proportion of jobs that involve knowledge rather than physical labor.  They’ll spend more years earning income, often in multiple careers, instead of selling assets.

Fewer people will retire in their 60s simply because they know that average life expectancy at birth is increasing at an astounding rate.  Americans, who could expect to live an average of 47 years in 1900, now enjoy life spans approaching 80 years.  (It already exceeds 80 for women.)  An American who makes it to age 65 can look forward to living almost two decades more.  Worldwide, the increase has been even more dramatic.  In a single century — despite wars, AIDS and other scourges — the global average more than doubled to 66 years.  Nobel laureate Robert Fogel believes it will exceed 100 years within this century.

More than just the length of life, the number of healthy years will also increase.  When people are vibrant into their 80s and 90s, 65 will evolve from the traditional retirement age to a mid-career milestone for those who choose to keep working.  Who wants to retire when you have fulfilling work, when you earn a good income, and when you feel great?  According to a Yahoo! poll, 70% of people over 55 say it’s never too late to start a new business.

 

For the full commentary, see:

MICHAEL MILKEN.  "The Boom Generation Seventh Decade."  Wall Street Journal  (Tues., September 19, 2006):  A20.

“An Image Was Worth 1,000 Statistical Tables”


HandWithGerms.jpg  Artistic vision of germ-laden hand.  (This is not the photographic image mentioned below, and used as a hospital screen-saver.)  Source of image:  online version of the NYT article cited below.

 

(p. 22)  Leon Bender noticed something interesting: passengers who went ashore weren’t allowed to reboard the ship until they had some Purell squirted on their hands.  The crew even dispensed Purell to passengers lined up at the buffet tables.  Was it possible, Bender wondered, that a cruise ship was more diligent about killing germs than his own hospital?

Cedars-Sinai Medical Center, where Bender has been practicing for 37 years, is in fact an excellent hospital.  But even excellent hospitals often pass along bacterial infections, thereby sickening or even killing the very people they aim to heal.  In its 2000 report “To Err Is Human,” the Institute of Medicine estimated that anywhere from 44,000 to 98,000 Americans die each year because of hospital errors — more deaths than from either motor-vehicle crashes or breast cancer — and that one of the leading errors was the spread of bacterial infections.

. . .

. . . the hospital needed to devise some kind of incentive scheme that would increase compliance without alienating its doctors.  In the beginning, the administrators gently cajoled the doctors with e-mail, (p. 23) faxes and posters.  But none of that seemed to work.  (The hospital had enlisted a crew of nurses to surreptitiously report on the staff’s hand-washing.)  “Then we started a campaign that really took the word to the physicians where they live, which is on the wards,” Silka recalls.  “And, most importantly, in the physicians’ parking lot, which in L.A. is a big deal.”

For the next six weeks, Silka and roughly a dozen other senior personnel manned the parking-lot entrance, handing out bottles of Purell to the arriving doctors.  They started a Hand Hygiene Safety Posse that roamed the wards and let it be known that this posse preferred using carrots to sticks:  rather than searching for doctors who weren’t compliant, they’d try to “catch” a doctor who was washing up, giving him a $10 Starbucks card as reward.  You might think that the highest earners in a hospital wouldn’t much care about a $10 incentive — “but none of them turned down the card,” Silka says.

When the nurse spies reported back the latest data, it was clear that the hospital’s efforts were working — but not nearly enough.  Compliance had risen to about 80 percent from 65 percent, but the Joint Commission required 90 percent compliance.

These results were delivered to the hospital’s leadership by Rekha Murthy, the hospital’s epidemiologist, during a meeting of the Chief of Staff Advisory Committee.  The committee’s roughly 20 members, mostly top doctors, were openly discouraged by Murthy’s report.  Then, after they finished their lunch, Murthy handed each of them an agar plate — a sterile petri dish loaded with a spongy layer of agar.  “I would love to culture your hand,” she told them.

They pressed their palms into the plates, and Murthy sent them to the lab to be cultured and photographed.  The resulting images, Silka says, “were disgusting and striking, with gobs of colonies of bacteria.”

The administration then decided to harness the power of such a disgusting image.  One photograph was made into a screen saver that haunted every computer in Cedars-Sinai.  Whatever reasons the doctors may have had for not complying in the past, they vanished in the face of such vivid evidence.  “With people who have been in practice 25 or 30 or 40 years, it’s hard to change their behavior,” Leon Bender says.  “But when you present them with good data, they change their behavior very rapidly.”  Some forms of data, of course, are more compelling than others, and in this case an image was worth 1,000 statistical tables.  Hand-hygiene compliance shot up to nearly 100 percent and, according to the hospital, it has pretty much remained there ever since.

 

For the full commentary, see:

STEPHEN J. DUBNER and STEVEN D. LEVITT.  "FREAKONOMICS; Selling Soap."  The New York Times Magazine (Section 6)  (Sunday, September 24, 2006):  22-23.

(Note:  ellipses added.)

 

The screen-saver at Cedars-Sinai Hospital.  Source of image:  http://freakonomics.com/pdf/CedarsSinaiScreenSaver.jpg

Life Is Better, But Could Be Better Still

  November 9, 1952 NYT ad announcing the introduction of the snowblower.  Source of image:  online version of the NYT article cited below.

 

(p. C1)  When the first snow falls on the North Shore of Chicago this winter, Robert Gordon will take his Toro snow blower out of the garage and think about how lucky he is not to be using a shovel.  Mr. Gordon is 66 years old and evidently quite healthy, but his doctor has told him that he should never clear his driveway with his own hands.  “People can die from shoveling snow,” Mr. Gordon said.  “I bet a lot of lives have been saved by snow blowers.”

If so, most of them have been saved in the last few decades.  A Canadian teenager named Arthur Sicard came up with the idea for the snow blower in the late 1800’s, while watching the blades on a piece of farm equipment, but he didn’t sell any until 1927.  For the next 30 years or so, snow blowers were hulking machines typically bought by cities and schools.  Only recently have they become a suburban staple.

Yet the benefits of the snow blower, namely more free time and less health risk, are largely missing from the government’s attempts to determine Americans’ economic well-being.  The same goes for dozens of other inventions, be they air-conditioners, cellphones or medical devices.  The reasons are a little technical — they involve the measurement of inflation — but they’re important to understand, because the implications are so large.

. . .

(p. C10)  In the early 1950’s, Toro began selling mass-market snow blowers, which weighed up to 500 pounds and cost at least $150.  As far as the Bureau of Labor Statistics was concerned, however, snow blowers did not exist until 1978.  That was the year when the machines began to be counted in the Consumer Price Index, the source of the official inflation rate.  By then, the cheapest model sold for about $100.

In practical terms, this was an enormous price decline compared with the 1950’s, because incomes had risen enormously over this period.  Yet the price index completely missed it and, by doing so, overstated inflation.  It counted the rising cost of cars and groceries but not the falling cost of snow blowers.

. . .

Mr. Gordon, besides being a fan of snow blowers, also happens to be one of the country’s leading macroeconomists.  A decade ago he served on a government-appointed group known as the Boskin Commission.  It argued, as Mr. Gordon still does, that the government exaggerated inflation by more than one percentage point every year.

. . .

. . .  Mr. Gordon’s adjustments show that men actually got a 27 percent raise in this period and women 65 percent.  The gains are not as big as those of the 1950’s and 60’s, but they do sound far more realistic than the official numbers.  Think about it:  we live longer than people did in the 1970’s, we’re healthier while alive, we graduate from college in much greater numbers, we’re surrounded by new gadgets and we live in bigger houses.  Is it really plausible, as some Democrats claim, that the middle class has made only marginal progress?
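
One way to see Mr. Gordon’s point about the price index is to put the snow blower’s sticker price next to the incomes of the people buying it.  The $150 and $100 prices come from the excerpt above; the income figures in the sketch below are rough placeholders, so the output illustrates the mechanism rather than measuring it.

    # Illustrative only: how a roughly flat dollar price can be a big real price
    # decline once rising incomes are taken into account.  Prices are from the
    # article; the income figures are placeholders, not data.

    price_1950s = 150.0      # cheapest mass-market snow blower, early 1950s
    price_1978 = 100.0       # cheapest model when the CPI first counted it

    income_1950s = 4_000.0   # placeholder annual family income, early 1950s
    income_1978 = 16_000.0   # placeholder annual family income, 1978

    share_1950s = price_1950s / income_1950s
    share_1978 = price_1978 / income_1978

    print(f"Share of a year's income, early 1950s: {share_1950s:.1%}")
    print(f"Share of a year's income, 1978:        {share_1978:.1%}")
    print(f"Decline in the income-relative price:  {1 - share_1978 / share_1950s:.0%}")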

 

For the full commentary, see: 

DAVID LEONHARDT.  "Economix; Life Is Better; It Isn’t Better. Which Is It?"  The New York Times  (Weds., September 20, 2006):  C1 & C10.

(Note:  ellipsis added.)

 

 PayTwoViewsGraph.gif  Source of graphic:  online version of the NYT article cited above.

Wal-Mart Really Does Benefit Consumers by Lowering Prices

 

Scholarly studies show Wal-Mart’s price reductions to be sizable.  Economist Emek Basker of the University of Missouri found long-term reductions of 7 to 13 percent on items such as toothpaste, shampoo and detergent.  Other companies are forced to reduce their prices.  On food, Wal-Mart produces consumer savings that average 20 percent, estimate Jerry Hausman of the Massachusetts Institute of Technology and Ephraim Leibtag of the Agriculture Department.

All told, these cuts have significantly raised living standards.  How much is unclear.  A study by the economic consulting firm Global Insight found that from 1985 to 2004, Wal-Mart’s expansion lowered the consumer price index by a cumulative 3.1 percent from what it would have been.  That produced savings of $263 billion in 2004, equal to $2,329 for each U.S. household.  Because Wal-Mart financed this study, its results have been criticized as too high.  But even if price savings are only half as much ($132 billion and $1,165 per household), they’d dwarf the benefits of all but the biggest government programs. 
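
A quick arithmetic check on the Global Insight figures quoted above:  dividing the $263 billion of savings by $2,329 per household implies roughly 113 million U.S. households, so the two numbers describe the same total, and Samuelson’s “half as much” case follows from simply halving both.

    # Consistency check on the figures quoted in the commentary.
    total_savings = 263e9     # estimated consumer savings in 2004
    per_household = 2_329.0   # estimated savings per U.S. household

    print(f"Implied number of households: {total_savings / per_household / 1e6:.0f} million")

    # The "even if only half as much" case (the commentary rounds these to
    # $132 billion and $1,165 per household):
    print(f"Half the savings: ${total_savings / 2 / 1e9:.1f} billion, "
          f"${per_household / 2:,.2f} per household")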

 

For the full commentary, see:

Robert J. Samuelson.  "Wal-Mart as Red Herring."  The Washington Post  (Wednesday, August 30, 2006):  A19.

 

Daley Shows Chicago Is Still the “City of the Out-Stuck Neck”

I think it was the poet Gwendolyn Brooks who once described Chicago as the "city of the out-stuck neck."  Chicago’s current Mayor Daley did himself and the city proud recently when he had the guts to stick his neck out by vetoing the proposed Chicago minimum wage. He deserves a salute from Chicago’s consumers and poor.  Democrat Daley is the mayor of the out-stuck neck.

 

Chicago Mayor Richard M. Daley used the first veto of his 17-year tenure to reject a living-wage ordinance aimed at forcing big retailers to pay wages of $10 an hour and health benefits equivalent to $3 an hour by 2010.

The veto is important to Wal-Mart Stores Inc., which plans to open its first store in Chicago late this month in the economically depressed 37th ward.

. . .

In vetoing the ordinance, Mayor Daley cited a potential loss of jobs.  In recent weeks, several big retailers had written to his office to oppose the ordinance.  "I understand and share a desire to ensure that everyone who works in the city of Chicago earns a decent wage," the mayor wrote to the aldermen yesterday.  "But I do not believe that this ordinance, well intentioned as it may be, would achieve that end.  Rather, I believe that it would drive jobs and business from our city."

 

For the full story, see: 

KRIS HUDSON.  "Chicago’s Daley Vetoes Bill Aimed At Big Retailers."  Wall Street Journal  (Tues., September 12, 2006):  A4.

 

(Note:  I can’t find the exact source of the out-stuck neck quote, but one reference on the web is:  http://starbulletin.com/97/05/22/sports/fitzgerald.html )

 

Added Evidence for Wattenberg’s ‘Birth Dearth’

 

BirthDearthBK.gif Source of book image:  http://www.aei.org/books/bookID.497,filter.all/book_detail.asp

 

Ben Wattenberg had already been predicting a world population decline for years when he published The Birth Dearth in 1987.  Back then, skepticism was widespread.  Governments and philanthropists spent billions promoting birth control to restrain population growth.  Many were still convinced of the wisdom of Paul Ehrlich, darling of the environmentalist enemies of economic growth, who had predicted disaster in his Population Bomb.

(Note that the plausibility of many environmentalist disaster scenarios is based on the assumption of continuous population growth.)

The current decline in birth rates is not a total puzzle.  Nobel Prize winner Gary Becker long ago argued that the quality of children is what economists call a ‘normal’ good, which means that families invest more in quality as their incomes rise.  And as families invest more in quality, they invest less in quantity.

Whatever the reasons, the evidence continues to accumulate that Wattenberg was right:

 

After a long decline, birthrates in European countries have reached a historic low, as potential parents increasingly opt for few or no children.  European women, better educated and integrated into the labor market than ever before, say there is no time for motherhood and that children are too expensive anyway.

The result is a continent of lopsided societies where the number of elderly increasingly exceeds the number of young — a demographic pattern that is straining pension plans and depleting the work force in many countries.

 

For the full story, see:

ELISABETH ROSENTHAL.  "European Union’s Plunging Birthrates Spread Eastward."  The New York Times   (Mon., September 4, 2006):  A3.

 

 EuropeanBirthratesGraph.gif  Source of graphic:  online version of the NYT article cited above.

 

Welfare Reform Increases Number Employed

WelfareSingleMotherTrends.gif Source of graphic:  online version of the NYT article cited below.

 

WASHINGTON, Aug. 20 — Ten years after a Republican Congress collaborated with a Democratic president to overhaul the nation’s welfare system, the implications are still rippling through policy and politics.

The law, which reversed six decades of social welfare policy and ended the idea of free cash handouts for the poor, was widely seen as a victory for conservative ideas.  When it was passed, some opponents offered dire predictions that the law would make things worse for the poor.  But the number of people on welfare has plunged to 4.4 million, down 60 percent.  Employment of single mothers is up.  Child support collections have nearly doubled.

“We have been vindicated by the results,” said Representative E. Clay Shaw Jr., Republican of Florida and an architect of the 1996 law who was vilified at the time.  “Welfare reform was one of the most successful policy changes in our nation’s history.”

 

For the full story, see: 

ROBERT PEAR and ERIK ECKHOLM. "A Decade After Welfare Overhaul, a Shift in Policy and Perception." The New York Times (Mon., August 21, 2006):  A12.