From Self-Funding, and Sony, Khanna Builds PlayStation Supercomputer to Advance Science

“Gaurav Khanna with a supercomputer he built at the University of Massachusetts Dartmouth physics department using 200 Playstation 3 consoles that are housed in a refrigerated shipping container.” Source of caption: print version of the NYT article quoted and cited below. Source of photo: online version of the NYT article quoted and cited below.

(p. D3) This spring, Gaurav Khanna noticed that the University of Massachusetts Dartmouth physics department was more crowded than usual. Why, he wondered, were so many students suddenly so interested in science?

It wasn’t a thirst for knowledge, it turns out. News of Dr. Khanna’s success in building a supercomputer using only PlayStation 3 video game consoles had spread quickly; the students, a lot of them gamers, just wanted to gape at the sight of nearly 200 consoles stacked on one another.
. . .
Making a supercomputer requires a large number of processors — standard desktops, laptops or the like — and a way to network them. Dr. Khanna picked the PlayStation 3 for its viability and cost, currently $250 to $300 in stores. Unlike other game consoles, the PlayStation 3 allows users to install a preferred operating system, making it attractive to programmers and developers. (The latest model, the PlayStation 4, does not have this feature.)
“Gaming had grown into a huge market,” Dr. Khanna said. “There’s a huge push for performance, meaning you can buy low-cost, high-performance hardware very easily. I could go out and buy 100 PlayStation 3 consoles at my neighborhood Best Buy, if I wanted.”
That is just what Dr. Khanna did, though on a smaller scale. Because the National Science Foundation, which funds much of Dr. Khanna’s research, might not have viewed the bulk buying of video game consoles as a responsible use of grant money, he reached out to Sony Computer Entertainment America, the company behind the PlayStation 3. Sony donated four consoles to the experiment; Dr. Khanna’s university paid for eight more, and Dr. Khanna bought another four. He then installed the Linux operating system on all 16 consoles, plugged them into the Internet and booted up the supercomputer.
Lior Burko, an associate professor of physics at Georgia Gwinnett College and a past collaborator with Dr. Khanna, praised the idea as an “ingenious” way to get the function of a supercomputer without the prohibitive expense.
“Dr. Khanna was able to combine his two fields of expertise, namely general relativity and computer science, to invent something new that allowed for not just a neat new machine, but also scientific progress that otherwise might have taken many more years to achieve,” Dr. Burko said.
. . .
His team linked the consoles, housing them in a refrigerated shipping container designed to carry milk. The resulting supercomputer, Dr. Khanna said, had the computational power of nearly 3,000 laptop or desktop processors, and cost only $75,000 to make — about a tenth the cost of a comparable supercomputer made using traditional parts.
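The recipe in the excerpt (many inexpensive processors plus a network to link them) is how any message-passing cluster divides a numerical job. Below is a loose sketch in standard Python, assuming nothing about Dr. Khanna's actual software: it splits a sum across local worker processes the way a cluster splits work across nodes. The function names and chunking scheme are purely illustrative, not from the article.

```python
# Hypothetical illustration of cluster-style work splitting:
# each worker process stands in for one node of the cluster.
from multiprocessing import Pool

def partial_sum(bounds):
    """Compute one node's share: the sum of i*i over [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def cluster_sum(n_terms, n_workers=4):
    """Split [0, n_terms) into chunks, farm them out, combine the results."""
    step = n_terms // n_workers
    chunks = [(i * step, (i + 1) * step) for i in range(n_workers)]
    chunks[-1] = (chunks[-1][0], n_terms)  # last worker takes the remainder
    with Pool(n_workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # The parallel result matches a plain serial loop.
    print(cluster_sum(1_000_000) == sum(i * i for i in range(1_000_000)))
```

On a real cluster the chunks would travel over the network (e.g. via message passing) instead of to local processes, but the divide-compute-combine shape is the same.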

For the full story, see:
LAURA PARKER. “An Economical Way to Save Progress.” The New York Times (Tues., DEC. 23, 2014): D3.
(Note: ellipsis added.)
(Note: the online version of the story has the date DEC. 22, 2014, and has the title “That Old PlayStation Can Aid Science.”)

No Increase in Public’s Concern with Income Inequality Since 1978

(p. 4A) DENVER (AP) — Income inequality is all the rage in public debate nowadays. Political figures from Sen. Elizabeth Warren on the left to Republican presidential prospect Jeb Bush on the right are denouncing the widening gap between the wealthy and everyone else.
But ordinary Americans don’t seem as fascinated by the issue as their would-be leaders. The public’s expressed interest in income inequality has remained stagnant over the past 36 years, according to the General Social Survey, which measures trends in public opinion.
In 2014 polling, Republicans’ support for the government doing something to narrow the rich-poor gap reached an all-time low. Even Democrats were slightly less interested in government action on the issue than they were two years ago.
The survey is conducted by the independent research organization NORC at the University of Chicago. Because of its long-running and comprehensive questions, it is a highly regarded source on social trends.
In the latest survey, made public last week, less than half of Americans — 46 percent — said the government ought to reduce income differences between the rich and the poor. That level has held fairly steady since 1978. Thirty-seven percent said the government shouldn’t concern itself with income differences, and the rest didn’t feel strongly either way.

For the full story, see:
AP. “Income Inequality? Pols Want to Talk about It; Public Yawns.” Omaha World-Herald (Monday, March 23, 2015): 4A.

For more details on the National Opinion Research Center (NORC) General Social Survey (GSS) results through 2014, see:
“Inequality: Trends in Americans’ Attitudes.” URL: http://www.apnorc.org/projects/Pages/HTML%20Reports/inequality-trends-in-americans-attitudes0317-6562.aspx#study

Pentagon Seeks Innovation from Private Start-Ups Since “They’ve Realized that the Old Model Wasn’t Working Anymore”

(p. A3) SAN FRANCISCO — A small group of high-ranking Pentagon officials made a quiet visit to Silicon Valley in December to solicit national security ideas from start-up firms with little or no history of working with the military.
The visit was made as part of an effort to find new ways to maintain a military advantage in an increasingly uncertain world.
In announcing its Defense Innovation Initiative in a speech in California in November, Chuck Hagel, then the defense secretary, mentioned examples of technologies like robotics, unmanned systems, miniaturization and 3-D printing as places to look for “game changing” technologies that would maintain military superiority.
“They’ve realized that the old model wasn’t working anymore,” said James Lewis, director of the Strategic Technologies Program at the Center for Strategic and International Studies in Washington. “They’re really worried about America’s capacity to innovate.”
There is a precedent for the initiative. Startled by the Soviet launch of the Sputnik satellite in 1957, President Dwight D. Eisenhower created the Advanced Research Projects Agency, or ARPA, at the Pentagon to ensure that the United States would not be blindsided by technological advances.
Now, the Pentagon has decided that the nation needs more than ARPA, renamed the Defense Advanced Research Projects Agency, or Darpa, if it is to find new technologies to maintain American military superiority.
. . .
The Pentagon focused on smaller companies during its December visit; it did not, for example, visit Google. Mr. Welby acknowledged that Silicon Valley start-ups were not likely to be focused on the Pentagon as a customer. The military has captive suppliers and a long and complex sales cycle, and it is perceived as being a small market compared with the hundreds of millions of customers for consumer electronics products.
Mr. Welby has worked for three different Darpa directors, but he said that Pentagon officials now believed they had to look beyond their own advanced technology offices.
“The Darpa culture is about trying to understand high-risk technology,” he said. “It’s about big leaps.” Today, however, the Pentagon needs to break out of what can be seen as a “not invented here” culture, he said.
“We’re thinking about what the world is going to look like in 2030 and what tools the department will need in 20 or 30 years,” he added.

For the full story, see:
JOHN MARKOFF. “Pentagon Shops in Silicon Valley for Game Changers.” The New York Times (Fri., FEB. 27, 2015): A3.
(Note: ellipsis added.)
(Note: the online version of the story has the date FEB. 26, 2015.)

Starting in Late Middle Ages the State Tried “to Control, Delineate, and Restrict Human Thought and Action”

(p. C6) . . . transregional organizations like Viking armies or the Hanseatic League mattered more than kings and courts. It was a world, as Mr. Pye says, in which “you went where you were known, where you could do the things you wanted to do, and where someone would protect you from being jailed, hanged, or broken on the wheel for doing them.”
. . .
This is a world in which money rules, but money is increasingly an abstraction, based on insider information, on speculation (the Bourse or stock market itself is a regional invention) and on the ability to apply mathematics: What was bought or sold was increasingly the relationships between prices in different locations rather than the goods themselves.
What happened to bring this powerful, creative pattern to a close? The author credits first the reaction to the Black Death of the mid-14th century, when fear of contamination (perhaps similar to our modern fear of terrorism) justified laws that limited travel and kept people in their place. Religious and sectarian strife further limited the free flow of ideas and people, forcing people to choose one identity to the exclusion of others or else to attempt to disappear into the underground of clandestine and subversive activities. And behind both of these was the rise of the state, a modern invention that attempted to control, delineate, and restrict human thought and action.

For the full review, see:
PATRICK J. GEARY. “Lighting Up the Dark Ages.” The Wall Street Journal (Sat., May 30, 2015): C6.
(Note: ellipses added.)
(Note: the online version of the review has the date May 29, 2015.)

The book under review is:
Pye, Michael. The Edge of the World: A Cultural History of the North Sea and the Transformation of Europe. New York: Pegasus Books LLC, 2014.

More Tech Stars Skip College, at Least for a While

(p. B1) The college dropout-turned-entrepreneur is a staple of Silicon Valley mythology. Steve Jobs, Bill Gates and Mark Zuckerberg all left college.
In their day, those founders were very unusual. But a lot has changed since 2005, when Mr. Zuckerberg left Harvard. The new crop of dropouts has grown up with the Internet and smartphones. The tools to create new technology are more accessible. The cost to start a company has plunged, while the options for raising money have multiplied.
Moreover, the path isn’t as lonely.
. . .
Not long ago, dropping out of school to start a company was considered risky. For this generation, it is a badge of honor, evidence of ambition and focus. Very few dropouts become tycoons, but “failure” today often means going back to school or taking a six-figure job at a big tech company.
. . .
(p. B5) There are no hard numbers on the dropout trend, but applicants for the Thiel Fellowship tripled in the most recent year; the fellowship won’t disclose numbers.
. . .
It has tapped 82 fellows in the past five years.
“I don’t think college is always bad, but our society seems to think college is always good, for everyone, at any cost–and that is what we have to question,” says Mr. Thiel, a co-founder of PayPal and an early investor in Facebook.
Of the 43 fellows in the initial classes of 2011 and 2012, 26 didn’t return to school and continued to work on startups or independent projects. Five went to work for large tech firms, including a few through acquisitions. The remaining 12 went back to school.
Mr. Thiel says companies started by the fellows have raised $73 million, a record that he says has attracted additional applicants. He says fellows “learned far more than they would have in college.”

For the full story, see:
DAISUKE WAKABAYASHI. “College Dropouts Thrive in Tech.” The Wall Street Journal (Thurs., June 4, 2015): B1 & B10.
(Note: ellipses added. The phrase “the fellowship won’t disclose numbers” was in the online, but not the print, version of the article.)
(Note: the online version of the article has the date June 3, 2015, and has the title “College Dropouts Thrive in Tech.”)

The Complementarity of Humans and Robots in Education

(p. 6) Computers and robots are already replacing many workers. What can young people learn now that won’t be superseded within their lifetimes by these devices and that will secure them good jobs and solid income over the next 20, 30 or 50 years? In the universities, we are struggling to answer that question.
. . .
Some scholars are trying to discern what kinds of learning have survived technological replacement better than others. Richard J. Murnane and Frank Levy in their book “The New Division of Labor” (Princeton, 2004) studied occupations that expanded during the information revolution of the recent past. They included jobs like service manager at an auto dealership, as opposed to jobs that have declined, like telephone operator.
The successful occupations, by this measure, shared certain characteristics: People who practiced them needed complex communication skills and expert knowledge. Such skills included an ability to convey “not just information but a particular interpretation of information.” They said that expert knowledge was broad, deep and practical, allowing the solution of “uncharted problems.”
. . .
When I arrived at Yale in 1982, there were no undergraduate courses in finance. I started one in the fall of 1985, and it continues today. Increasingly, I’ve tried to connect mathematical theory to actual applications in finance.
Since its beginnings, the course has gradually become more robotic: It resembles a real, dynamic, teaching experience, but in execution, much of it is prerecorded, and exercises and examinations are computerized. Students can take it without need of my physical presence. Yale made my course available to the broader public on free online sites: AllLearn in 2002, Open Yale in 2008 and 2011, and now on Coursera.
The process of tweaking and improving the course to fit better in a digital framework has given me time to reflect about what I am doing for my students. I could just retire now and let them watch my lectures and use the rest of the digitized material. But I find myself thinking that I should be doing something more for them.
So I continue to update the course, thinking about how I can integrate its lessons into an “art of living in the world.” I have tried to enhance my students’ sense that finance should be the art of financing important human activities, of getting people (and robots someday) working together to accomplish things that we really want done.

For the full commentary, see:
ROBERT J. SHILLER. “Economic View; What to Learn in College to Stay One Step Ahead of Computers.” The New York Times, SundayBusiness Section (Sun., MAY 24, 2015): 6.
(Note: ellipses added.)
(Note: the online version of the commentary has the date MAY 22, 2015, and has the title “Economic View; What to Learn in College to Stay One Step Ahead of Computers.”)

The Levy and Murnane book mentioned above is:
Levy, Frank, and Richard J. Murnane. The New Division of Labor: How Computers Are Creating the Next Job Market. Princeton, NJ: Princeton University Press, 2004.
Some of the core of the Levy and Murnane book can be found in:
Levy, Frank, and Richard Murnane. “Book Excerpt: The New Division of Labor.” Milken Institute Review 6, no. 4 (Dec. 2004): 61-82.

“Buy Local” Inefficiently Wastes Resources

(p. 8) Much is . . . made about the eco-friendliness of handmade.
“Buying handmade (especially really locally) can greatly reduce your carbon footprint on the world,” reads a post on the popular website Handmadeology.
But few economists give much credence to the idea that buying local necessarily saves energy. Most believe that the economies of scale inherent in mass production outweigh the benefits of nearness. These same economies of scale most likely make a toothbrush factory less wasteful, in terms of materials, than 100 individual toothbrush makers each handcrafting 10 toothbrushes a day.

For the full commentary, see:
EMILY MATCHAR. “OPINION; It’s Chic. Not Morally Superior. That Handmade Scarf Won’t Save the World.” The New York Times, SundayReview Section (Sun., MAY 3, 2015): 8.
(Note: ellipsis added.)
(Note: the online version of the commentary has the date MAY 2, 2015, and has the title “OPINION; Sorry, Etsy. That Handmade Scarf Won’t Save the World.”)

Spread of Robots Creates New and Better Human Jobs

(p. A11) The issues at the heart of “Learning by Doing” come into sharp relief when James Bessen visits a retail distribution center near Boston that was featured on “60 Minutes” two years ago. The TV segment, titled “Are Robots Hurting Job Growth?,” combined gotcha reporting with vintage movie clips–scary-looking Hollywood robots–to tell a chilling tale of human displacement and runaway job loss.
Mr. Bessen isn’t buying it. Although robots at the distribution center have eliminated some jobs, he says, they have created others–for production workers, technicians and managers. The problem at automated workplaces isn’t the robots. It’s the lack of qualified workers. New jobs “require specialized skills,” Mr. Bessen writes, but workers with these skills “are in short supply.”
It is a deeply contrarian view. The conventional wisdom about robots and other new workplace technology is that they do more harm than good, destroying jobs and hollowing out the middle class. MIT economists Erik Brynjolfsson and Andrew McAfee made the case in their best-selling 2014 book, “The Second Machine Age.” They describe a future in which software-driven machines will take over not just routine jobs–replacing clerks, cashiers and warehouse workers–but also tasks done by nurses, doctors, lawyers and stock traders. Mr. Bessen sets out to refute the arguments of such techno-pessimists, relying on economic analysis and on a fresh reading of history.

For the full review, see:
TAMAR JACOBY. “BOOKSHELF; Technology Isn’t a Job Killer; Many predicted ATMs would eliminate bank tellers, but the number of tellers in the U.S. has risen since the machines were introduced.” The Wall Street Journal (Thurs., May 21, 2015): A11.
(Note: the online version of the review has the date May 20, 2015.)

The book under review is:
Bessen, James. Learning by Doing: The Real Connection between Innovation, Wages, and Wealth. New Haven, CT: Yale University Press, 2015.

Computer Programs “Lack the Flexibility of Human Thinking”

(p. A11) . . . let’s not panic. “Superintelligent” machines won’t be arriving soon. Computers today are good at narrow tasks carefully engineered by programmers, like balancing checkbooks and landing airplanes, but after five decades of research, they are still weak at anything that looks remotely like genuine human intelligence.
. . .
Even the best computer programs out there lack the flexibility of human thinking. A teenager can pick up a new videogame in an hour; your average computer program still can only do just the single task for which it was designed. (Some new technologies do slightly better, but they still struggle with any task that requires long-term planning.)

For the full commentary, see:
GARY MARCUS. “Artificial Intelligence Isn’t a Threat–Yet; Superintelligent machines are still a long way off, but we need to prepare for their future rise.” The Wall Street Journal (Sat., Dec. 13, 2014): A11.
(Note: ellipses added.)
(Note: the online version of the commentary has the date Dec. 11, 2014.)

Cultural and Institutional Differences Between Europe and U.S. Keep Europe from Having a Silicon Valley

(p. B7) “They all want a Silicon Valley,” Jacob Kirkegaard, a Danish economist and senior fellow at the Peterson Institute for International Economics, told me this week. “But none of them can match the scale and focus on the new and truly innovative technologies you have in the United States. Europe and the rest of the world are playing catch-up, to the great frustration of policy makers there.”
Petra Moser, assistant professor of economics at Stanford and its Europe Center, who was born in Germany, agreed that “Europeans are worried.”
“They’re trying to recreate Silicon Valley in places like Munich, so far with little success,” she said. “The institutional and cultural differences are still too great.”
. . .
There is . . . little or no stigma in Silicon Valley to being fired; Steve Jobs himself was forced out of Apple. “American companies allow their employees to leave and try something else,” Professor Moser said. “Then, if it works, great, the mother company acquires the start-up. If it doesn’t, they hire them back. It’s a great system. It allows people to experiment and try things. In Germany, you can’t do that. People would hold it against you. They’d see it as disloyal. It’s a very different ethic.”
Europeans are also much less receptive to the kind of truly disruptive innovation represented by a Google or a Facebook, Mr. Kirkegaard said.
He cited the example of Uber, the ride-hailing service that despite its German-sounding name is a thoroughly American upstart. Uber has been greeted in Europe like the arrival of a virus, and its reception says a lot about the power of incumbent taxi operators.
“But it goes deeper than that,” Mr. Kirkegaard said. “New Yorkers don’t get all nostalgic about yellow cabs. In London, the black cab is seen as something that makes London what it is. People like it that way. Americans tend to act in a more rational and less emotional way about the goods and services they consume, because it’s not tied up with their national and regional identities.”
. . .
With its emphasis on early testing and sorting, the educational system in Europe tends to be very rigid. “If you don’t do well at age 18, you’re out,” Professor Moser said. “That cuts out a lot of people who could do better but never get the chance. The person who does best at a test of rote memorization at age 17 may not be innovative at 23.” She added that many of Europe’s most enterprising students go to the United States to study and end up staying.
She is currently doing research into creativity. “The American education system is much more forgiving,” Professor Moser said. “Students can catch up and go on to excel.”
Even the vaunted European child-rearing, she believes, is too prescriptive. While she concedes there is as yet no hard scientific evidence to support her thesis, “European children may be better behaved, but American children may end up being more free to explore new things.”

For the full story, see:
JAMES B. STEWART. “Common Sense; A Fearless Culture Fuels Tech.” The New York Times (Fri., JUNE 19, 2015): B1 & B7.
(Note: ellipses added.)
(Note: the online version of the story has the date JUNE 18, 2015, and has the title “Common Sense; A Fearless Culture Fuels U.S. Tech Giants.”)

“The Great Fact” of “the Ice-Hockey Stick”

(p. 2) Economic history has looked like an ice-hockey stick lying on the ground. It had a long, long horizontal handle at $3 a day extending through the two-hundred-thousand-year history of Homo sapiens to 1800, with little bumps upward on the handle in ancient Rome and the early medieval Arab world and high medieval Europe, with regressions to $3 afterward–then a wholly unexpected blade, leaping up in the last two out of the two thousand centuries, to $30 a day and in many places well beyond.
. . .
(p. 48) The heart of the matter is sixteen. Real income per head nowadays exceeds that around 1700 or 1800 in, say, Britain and in other countries that have experienced modern economic growth by such a large factor as sixteen, at least. You, oh average participant in the British economy, go through at least sixteen times more food and clothing and housing and education in a day than an ancestor of yours did two or three centuries ago. Not sixteen percent more, but sixteen multiplied by the old standard of living. You in the American or the South Korean economy, compared to the wretchedness of former Smiths in 1653 or Kims in 1953, have done even better. And if such novelties as jet travel and vitamin pills and instant messaging are accounted at their proper value, the factor of material improvement climbs even higher than sixteen–to eighteen, or thirty, or far beyond. No previous episode of enrichment for the average person approaches it, not the China of the Song Dynasty or the Egypt of the New Kingdom, not the glory of Greece or the grandeur of Rome.
No competent economist, regardless of her politics, denies the Great Fact.

Source:
McCloskey, Deirdre N. Bourgeois Dignity: Why Economics Can’t Explain the Modern World. Chicago: University of Chicago Press, 2010.
(Note: ellipsis added.)