Arthur Murray “America’s First Space Pilot,” RIP

“Maj. Arthur Murray in 1954.” Source of caption and photo: online version of the NYT article quoted and cited below.

(p. A18) “I begin to feel weightless, and I’m flying so fast my instruments can’t keep up — they show what happened two miles ago. I’m climbing so steeply I can’t see the ground, and I feel confused. I have a sense of falling and I want to grab something for support.”

It was May 28, 1954, and Maj. Arthur Murray, test pilot, would wrestle for the next 15 terrifying seconds with a rocket plane racing over 1,400 miles an hour and spinning wildly, supersonically out of control. In the turmoil, he would fly higher than any human being had ever been, 90,440 feet over the earth.
Finally, Major Murray’s plane, a Bell X-1A, sank back into heavier air, and he had time to look at the dark blue sky and dazzling sunlight. He became the first human to see the curvature of the earth. At the time, he was called America’s first space pilot.
Arthur Murray, known as Kit, died on July 25, in a nursing home in the town of West in Texas, his family said. He was 92. He requested that his ashes be scattered over the Mojave Desert, where some of his fellow test pilots crashed and died.
Tom Wolfe marveled at the test pilots of Edwards Air Force Base in his 1979 book “The Right Stuff,” exclaiming, “My God — to be part of Edwards in the late forties and early fifties!”

For the full obituary, see:
DOUGLAS MARTIN. “Arthur Murray, Test Pilot, Is Dead at 92.” The New York Times (Fri., August 5, 2011): A18.
(Note: the online version of the story is dated August 4, 2011.)

The wonderful Tom Wolfe book mentioned is:
Wolfe, Tom. The Right Stuff. New York: Farrar, Straus & Giroux, Inc., 1979.

At First, Some Feared Electricity

(p. 133) Something of the prevailing ambivalence was demonstrated by Mrs Cornelius Vanderbilt, who went to a costume ball dressed as an electric light to celebrate the installation of electricity in her Fifth Avenue home in New York, but then had the whole system taken out when it was suspected of being the source of a small fire. Others detected more insidious threats. One authority named S. F. Murphy identified a whole host of electrically induced maladies – eyestrain, headaches, general unhealthiness and possibly even ‘the premature exhaustion of life’. One architect was certain electric light caused freckles.

Source:
Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

The Movie Auteur as a Model for Technology Entrepreneurship

Source of image: online version of the NYT article quoted and cited below.

(p. 3) Two years ago, the technology blogger John Gruber presented a talk, “The Auteur Theory of Design,” at the Macworld Expo. Mr. Gruber suggested how filmmaking could be a helpful model in guiding creative collaboration in other realms, like software.

The auteur, a film director who both has a distinctive vision for a work and exercises creative control, works with many other creative people. “What the director is doing, nonstop, from the beginning of signing on until the movie is done, is making decisions,” Mr. Gruber said. “And just simply making decisions, one after another, can be a form of art.”
“The quality of any collaborative creative endeavor tends to approach the level of taste of whoever is in charge,” Mr. Gruber pointed out.
Two years after he outlined his theory, it is still a touchstone in design circles for discussing Apple and its rivals.
Garry Tan, designer in residence and a venture partner at Y Combinator, an investor in start-ups, says: “Steve Jobs is not always right–MobileMe would be an example. But we do know that all major design decisions have to pass his muster. That is what an auteur does.”
Mr. Jobs has acquired a reputation as a great designer, Mr. Tan says, not because he personally makes the designs but because “he’s got the eye.” He has also hired classically trained designers like Jonathan Ive. “Design excellence also attracts design talent,” Mr. Tan explains.

For the full story, see:
RANDALL STROSS. “DIGITAL DOMAIN; The Auteur vs. the Committee.” The New York Times, SundayBusiness Section (Sun., July 24, 2011): 3.
(Note: the online version of the story is dated July 23, 2011.)

“Credentialing Gone Amok—In 20 Years, You’ll Need a Ph.D. to Be a Janitor”

(p. 17) Call it credential inflation. Once derided as the consolation prize for failing to finish a Ph.D. or just a way to kill time waiting out economic downturns, the master’s is now the fastest-growing degree.
. . .
“There is definitely some devaluing of the college degree going on,” says Eric A. Hanushek, an education economist at the Hoover Institution, and that gives the master’s extra signaling power. “We are going deeper into the pool of high school graduates for college attendance,” making a bachelor’s no longer an adequate screening measure of achievement for employers.
Colleges are turning out more graduates than the market can bear, and a master’s is essential for job seekers to stand out — that, or a diploma from an elite undergraduate college, says Richard K. Vedder, professor of economics at Ohio University and director of the Center for College Affordability and Productivity.
Not only are we developing “the overeducated American,” he says, but the cost is borne by the students getting those degrees. “The beneficiaries are the colleges and the employers,” he says. Employers get employees with more training (that they don’t pay for), and universities fill seats. In his own department, he says, a master’s in financial economics can be a “cash cow” because it draws on existing faculty (“we give them a little extra money to do an overload”) and they charge higher tuition than for undergraduate work. “We have incentives to want to do this,” he says. He calls the proliferation of master’s degrees evidence of “credentialing gone amok.” He says, “In 20 years, you’ll need a Ph.D. to be a janitor.”

For the full story, see:
LAURA PAPPANO. “The Master’s as the New Bachelor’s.” The New York Times, EducationLife Section (Sun., July 24, 2011): 16-17.
(Note: ellipsis added.)
(Note: the online version of the story is dated July 22, 2011.)

Political Ideology Matters in Hiring and Tenure

Source of book image:
http://images.borders.com.au/images/bau/97816025/9781602582682/0/0/plain/compromising-scholarship-religious-and-political-bias-in-american-higher-education.jpg

(p. 34) . . . when a faculty committee is looking to hire or award tenure, political ideology seems to make a difference, according to a “collegiality survey” conducted by George Yancey.

Dr. Yancey, a professor of sociology at the University of North Texas, asked more than 400 sociologists which nonacademic factors might influence their willingness to vote for hiring a new colleague. You might expect professors to at least claim to be immune to bias in academic hiring decisions.
But as Dr. Yancey reports in his new book, “Compromising Scholarship: Religious and Political Bias in American Higher Education,” more than a quarter of the sociologists said they would be swayed favorably toward a Democrat or an A.C.L.U. member and unfavorably toward a Republican. About 40 percent said they would be less inclined to vote for hiring someone who belonged to the National Rifle Association or who was an evangelical. Similar results were obtained in a subsequent survey of professors in other social sciences and the humanities.

For the full commentary, see:
LAURA PAPPANO. “The Master’s as the New Bachelor’s.” The New York Times, EducationLife Section (Sun., July 24, 2011): 34.
(Note: ellipsis added.)
(Note: the online version of the commentary is dated July 22, 2011.)

Book mentioned:
Yancey, George. Compromising Scholarship: Religious and Political Bias in American Higher Education. Waco, TX: Baylor University Press, 2011.

Edison Excelled as an Organizer of Systems

(p. 131) Where Edison truly excelled was as an organizer of systems. The invention of the light bulb was a wondrous thing but of not much practical use when no one had a socket to plug it into. Edison and his tireless workers had to design and build the entire system from scratch, from power stations to cheap and reliable wiring, to lampstands and switches. Within months Edison had set up no fewer than 334 small electrical plants all over the world; (p. 132) within a year or so his plants were powering thirteen thousand light bulbs. Cannily he put them in places where they would be sure to make maximum impact: on the New York Stock Exchange, in the Palmer House Hotel in Chicago, La Scala opera house in Milan, the dining room of the House of Commons in London. Swan, meanwhile, was still doing much of his manufacturing in his own home. He didn’t, in short, have a lot of vision. Indeed, he didn’t even file for a patent. Edison took out patents everywhere, including in Britain in November 1879, and so secured his preeminence.

Source:
Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

China’s “Orwellian Surveillance System”

“A customer in a Beijing cafe not yet affected by new regulations surfed the Web on Monday.” Source of caption and photo: online version of the NYT article quoted and cited below.

(p. A4) BEIJING — New regulations that require bars, restaurants, hotels and bookstores to install costly Web monitoring software are prompting many businesses to cut Internet access and sending a chill through the capital’s game-playing, Web-grazing literati who have come to expect free Wi-Fi with their lattes and green tea.

The software, which costs businesses about $3,100, provides public security officials the identities of those logging on to the wireless service of a restaurant, cafe or private school and monitors their Web activity. Those who ignore the regulation and provide unfettered access face a $2,300 fine and the possible revocation of their business license.
. . .
The new measures, it would appear, are designed to eliminate a loophole in “Internet management” as it is called, one that has allowed laptop- and iPad-owning college students and expatriates, as well as the hip and the underemployed, to while away their days at cafes and lounges surfing the Web in relative anonymity. It is this demographic that has been at the forefront of the microblogging juggernaut, one that has revolutionized how Chinese exchange information in ways that occasionally frighten officials.
. . .
One bookstore owner said she had already disconnected the shop’s free Wi-Fi, and not for monetary reasons. “I refuse to be part of an Orwellian surveillance system that forces my customers to disclose their identity to a government that wants to monitor how they use the Internet,” said the woman, who feared that disclosing her name or that of her shop would bring unwanted attention from the authorities.

For the full story, see:
ANDREW JACOBS. “China Steps Up Web Monitoring, Driving Many Wi-Fi Users Away.” The New York Times (Tues., July 26, 2011): A4.
(Note: ellipses added.)
(Note: the online version of the story is dated July 25, 2011.)

Natural Causes of Rapid Temperature Change

(p. C4) Some three decades after Laki, 1816 was known as the “year without a summer” thanks to a big eruption in Indonesia. Even Mount Pinatubo in the Philippines in 1991 caused a brief, though small, drop in world temperatures.

Other abrupt coolings have been bigger but less explicable. Earlier this year, two scientists from Brown University used lake sediments to conclude that the sharp cooling in Greenland during the late Middle Ages, which extinguished the Norse colonies, saw temperatures drop by seven degrees Fahrenheit in 80 years, much faster than recent warming there. Conversely, Greenland’s temperature shot up by around 13 degrees in 50 years as the world came out of the last ice age 12,000 years ago and the ice sheets of North America and northern Europe retreated–again, unlike today’s slow increase.

For the full commentary, see:
MATT RIDLEY. “MIND & MATTER; Will Volcanoes Cool Our Warming Earth?” The Wall Street Journal (Sat., August 6, 2011): C4.