By at Least 50,000 Years Ago Homo Sapiens “Developed the Full Battery of Cognitive Skills that We Ourselves Possess”

Before the passage quoted below, Fagan briefly discusses the two probable waves of humans spreading out from Africa, the first of which is believed to have occurred about 100,000 years ago.

(p. 10) A second, even less well-documented push seems to have taken place later, around fifty thousand years ago. This time, moderns settled throughout Near East Asia and stayed there, apparently living alongside a sparse Neanderthal population. This widely accepted theory assumes that by this time the newcomers had all the intellectual capabilities of Homo sapiens. Just when and how they acquired them remains a major unsolved problem. All we can say is that at some point between one hundred thousand and fifty thousand years (p. 11) ago, at a seminal yet still little known moment in history Homo sapiens developed the full battery of cognitive skills that we ourselves possess.

Source:
Fagan, Brian. Cro-Magnon: How the Ice Age Gave Birth to the First Modern Humans. New York: Bloomsbury Press, 2010.
(Note: italics in original.)

Jeff Bezos’ Goal: “Earth’s Biggest Selection”

BezonJeff2010-08-29.jpg

Jeff Bezos. Source of photo: online version of the NYT article quoted and cited below.

(p. 18) You’re a longtime science buff who studied electrical engineering and computer science at Princeton. Why did you want to be a bookseller in the first place?
You have to go back in time to 1994, and there’s something very unusual about the book category. There are more items in the book category than there are items in any other product category. One of the things it was obvious you could do with an online store is have a much more complete selection.

Initially, Amazon sold books exclusively, but it has since expanded into a retail omnivore that sells basketballs and vacuum cleaners and hamster food and everything under the sun. What is your goal, exactly?
We want to have earth’s biggest selection. Earth’s biggest river, earth’s biggest selection.

For the full interview, see:
DEBORAH SOLOMON. “QUESTIONS FOR Jeffrey P. Bezos; Book Learning.” The New York Times, Magazine Section (Sun., December 6, 2009): 18.

(Note: bold in original, to indicate questions by Deborah Solomon.)
(Note: the online version of the interview is dated December 2, 2009.)

Environmentalist Blue Planet Prize Winner Lovelock Endorsed Nuclear Power

LovelockJames2010-09-01.jpg

“The scientist James E. Lovelock during an interview at the Algonquin Hotel in New York.” Source of caption and photo: online version of the NYT article quoted and cited below.

(p. D2) Few scientists have elicited such equivalent heaps of praise and criticism as James E. Lovelock, the British chemist, inventor and planetary diagnostician who has long foreseen a clash between humans and their planet.

His work underpins much of modern environmentalism. The electron capture detector he invented in the 1950’s produced initial measurements of dispersed traces of pesticides and ozone-destroying chlorofluorocarbons, providing a foundation for the work of Rachel Carson and for studies revealing risks to the atmosphere’s protective ozone layer.
His conception in 1972 of the planet’s chemistry, climate and veneer of life as a self-sustaining entity, soon given the name Gaia, was embraced by the Earth Day generation and was ridiculed, but eventually accepted (with big qualifications), by many biologists.
Dr. Lovelock, honored in 1997 with the Blue Planet Prize, which is widely considered the environmental equivalent of a Nobel award, has now come under attack from some environmentalists for his support of nuclear power as a way to avoid runaway “global heating” — his preferred alternative to “global warming.”
In his latest book, “The Revenge of Gaia: Why the Earth Is Fighting Back — and How We Can Still Save Humanity” (Perseus, 2006), Dr. Lovelock says that any risks posed by nuclear power are small when compared with the “fever” of heat-trapping carbon dioxide produced by burning coal, oil and other fossil fuels.

For the full interview, see:
ANDREW C. REVKIN. “A Conversation With James E. Lovelock; Updating Prescriptions for Avoiding Worldwide Catastrophe.” The New York Times, Science Times Section (Tues., September 12, 2006): D2.

“Modern” Humans Have Existed for at Least 100,000–and Maybe 200,000–Years

(p. 9) A group of geneticists headed by Rebecca Cann and Alan Wilson, using mtDNA and a sophisticated “molecular clock,” traced modern-human ancestry back to isolated African populations dating to between two hundred thousand and one hundred thousand years ago. Inevitably there was talk of an “African Eve,” a first modern woman, the hypothetical ancestor of all modern humankind. Most archaeologists gulped and took a deep breath. Cann and her colleagues had taken Homo sapiens into new and uncharted historical territory.
. . .
(p. 10) The genetic case for an African origin for Homo sapiens seems overwhelming. The archaeologists have also stepped forward with new fossil discoveries, including a robust 195,000-year-old modern human from Omo Kibish, in Ethiopia, and three 160,000-year-old Homo sapiens skulls from Herto, also in Ethiopia. Few anthropologists now doubt that Africa was the cradle of Homo sapiens and home to the remotest ancestors of the first modern Europeans–the Cro-Magnons. The seemingly outrageous chronology of two decades ago is now accepted as historical reality.

Source:
Fagan, Brian. Cro-Magnon: How the Ice Age Gave Birth to the First Modern Humans. New York: Bloomsbury Press, 2010.
(Note: ellipsis added; italics in original.)

Cro-Magnon Provides Baseline to Measure Our Progress

Cro-MagnonBK.jpg

Source of book image:
http://ecx.images-amazon.com/images/I/51BS%2BtGJZ8L.jpg

Biologically modern humans have inhabited the world for at least 50,000 years, and maybe for 100,000 years or more.
Only in the last 200 years, and especially the last 100 years, has humanity made substantial progress in the quality and quantity of life.
Usually the most recent 200 years are compared with the previous few thousand, because conditions in the previous few thousand years are much better known than conditions in the tens of thousands of years further in the past.
But comparisons further back are also of interest, and Brian Fagan’s book Cro-Magnon provides some information that allows us, at least to some extent, to make them.
In the next few weeks, I will occasionally be quoting a few passages from Fagan that I believe are suggestive.

The reference for the Fagan book is:
Fagan, Brian. Cro-Magnon: How the Ice Age Gave Birth to the First Modern Humans. New York: Bloomsbury Press, 2010.

Wozniak “Lucky” to Be Young “Just as a Revolution Is About to Take Off”

(p. 299) If you’re as lucky as I’ve been, then you’ll get to live in a time when you’re young just as a revolution is about to take off. Just like Henry Ford was there for the automotive industry, I was there to see and build the first personal computers.

Back in the mid-1990s when I was teaching school, I thought one time to myself, Wow, I wish I could be twelve now, look at the things I could do with what’s out there now.
(p. 300) But then I realized I was lucky. I got to see the before, the during, and the after of some of those changes in life. I got to be one of those few people who could effect some of those changes.
Excellence came to me from not having much money, and also from having good building skills but not having done these products before.
I hope you’ll be as lucky as I am. The world needs inventors–great ones. You can be one. If you love what you do and are willing to do what it takes, it’s within your reach. And it’ll be worth every minute you spend alone at night, thinking and thinking about what it is you want to design or build. It’ll be worth it, I promise.

Source:
Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

“The Survival of Freedom and Accountable, Limited Government Is an Enormously Important Value”

GellnerErnest2010-08-05.jpg

“Ernest Gellner in his office at the London School of Economics in 1979.” Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. W8) “I am sorry, I have written another,” Ernest Gellner used to say in his later years before publishing a new book. “I just couldn’t help it.” Not even his death in 1995 stopped the flow. The last of his posthumous works, “Language and Solitude,” appeared in the late 1990s. Now Gellner has been brought back to life–alongside his combative ideas and his maverick approach to intellectual combat–in a sympathetic but by no means reverential biography by his former pupil John A. Hall.
. . .
Many of the problems that Gellner addressed during his long intellectual career–such as the roots of nationalism and the role of contemporary Islam–are obviously of direct relevance today. But the most pertinent part of his legacy lies in his fearless endorsement of Western modernity at a time when it was becoming increasingly embattled in the academy and elsewhere.
As Mr. Hall demonstrates, Gellner believed that there really was a clash between “liberty and pluralism,” on the one hand, and “authoritarianism and oppressiveness” on the other. In a passionate riposte to Noam Chomsky, who had accused him of ignoring Western crimes, Gellner charged that his critic had “obscured” the fact that “the survival of freedom and accountable, limited government is an enormously important value even when some of its defenders are occasionally tarnished.”
This was the authentic voice of Ernest Gellner: honest, cool and reasonable. Mr. Hall is to be congratulated for reminding us of how much we miss it today.

For the full review, see:
BRENDAN SIMMS. “A Combatant in the Battle of Ideas; A defender of the West when it was most embattled, a defender of reason at a time of dangerous irrationality.” The Wall Street Journal (Fri., July 23, 2010): W8.
(Note: the online version of the article is dated July 30 (sic), 2010.)
(Note: ellipsis added.)

The book under review is:
Hall, John A. Ernest Gellner: An Intellectual Biography. London, UK: Verso, 2010.

ErnestGellnerBK.jpg

Source of book image: online version of the WSJ review quoted and cited above.

Wozniak on Borrowing Xerox Parc’s Graphical User Interface (GUI)

(p. 293) But there was one exception. Right around 1980, Steve and a bunch of us from Apple got to tour the Xerox Palo Alto Research Center (PARC) facility, which is one of Xerox’s research and development labs.

Inside, for the first time ever, we saw real video displays–computer monitors–and they were showing something entirely new. They were showing the first graphical user interface (GUI)–an interface that lets you interact with icons and menus to control a program.
(p. 294) Up to this point, everything had been text-based. That’s going to sound odd to all the people who don’t remember it, but that’s how everything worked back then. A computer user had to actually type in text commands–long, complicated ones–to make something happen.
But this experimental Xerox computer had windows popping up all over the place. And they were using this funny-looking device everyone now knows as a mouse, clicking on words and small pictures, the icons, to make things happen.
The minute I saw this interface, I knew it was the future. There wasn’t a doubt in my mind. It was like a one-way door to the future–and once you went through it, you could never turn back. It was such a huge improvement in using computers. The GUI meant you could get a computer to do the same things it could normally do, but with much less physical and mental effort. It meant that nontechnical people could do some pretty powerful things with computers without having to sit there and learn how to type in long commands. Also, it let several different programs run in separate windows at the same time. That was powerful!
A few years later, Apple designed the Lisa computer, and later the Macintosh, around this concept. And Microsoft did it a couple years after that with Microsoft Windows. And now, more than twenty-five years after we saw that experimental computer in the Xerox PARC lab, all computers work like this.
It’s so rare to be able to see the future like that. I can’t promise it’ll happen to you. But when you see it, you know it. If this ever happens to you, leap at the chance to get involved. Trust your instincts. It isn’t often that the future lets you in like that.

Source:
Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

Wozniak Could Only Predict a Year or Two Ahead in Technology

(p. 293) If you could easily predict the future, inventing things would be a lot easier! Predicting the future is difficult even if you’re involved with products that are guiding computers, the way we were at Apple.

When I was at Apple in the 1970s and 1980s, we would always try to look ahead and see where things were going. It was actually easy to see a year or two ahead, because we were the ones building the products and had all these contacts at other companies. But beyond that, it was tough to see. The only thing we could absolutely rely upon had to do with Moore’s Law–the now-famous rule in electronics (named for Intel founder Gordon Moore) that says that every eighteen months you can pack twice the number of transistors on a chip.
That meant computers could keep getting smaller and cheaper. We saw that. But we had a hard time imagining what kinds of applications could take advantage of all this power. We didn’t expect high-speed modems. We didn’t expect computers to have large amounts of hard-disk storage built in. We didn’t see the Internet growing out of the ARPANET and becoming accessible to everyone. Or digital cameras. We didn’t see any of that. We really could only see what was right in front of us, a year or two out, max.

Source:
Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.
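As a rough illustration of the arithmetic Wozniak invokes (transistor counts doubling every eighteen months), here is a minimal Python sketch of the compounding. The starting count and the projection horizon are hypothetical, chosen only for illustration; the only element taken from the passage above is the eighteen-month doubling rule.

# Minimal sketch of the doubling rule quoted above: transistor counts
# double roughly every eighteen months (1.5 years).
# The starting count and horizon are hypothetical, for illustration only.

def transistors_after(years, start_count=2_300, doubling_period_years=1.5):
    """Project a transistor count after `years`, assuming one doubling
    every `doubling_period_years` years."""
    doublings = years / doubling_period_years
    return start_count * (2 ** doublings)

if __name__ == "__main__":
    for years in (0, 3, 6, 9, 12):
        print(f"after {years:2d} years: about {transistors_after(years):,.0f} transistors")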

Inventors Should Work Alone, Even If They Have to Moonlight

(p. 291) If you’re that rare engineer who’s an inventor and also an artist, I’m going to give you some advice that might be hard to take. That advice is: Work alone.

When you’re working for a large, structured company, there’s much less leeway to turn clever ideas into revolutionary new products or product features by yourself. Money is, unfortunately, a god in our society, and those who finance your efforts are businesspeople with lots of experience at organizing contracts that define who owns what and what you can do on your own.
But you probably have little business experience, know-how, or acumen, and it’ll be hard to protect your work or deal with all that corporate nonsense. I mean, those who provide the funding and tools and environment are often perceived as taking the credit for inventions. If you’re a young inventor who wants to change the world, a corporate environment is the wrong place for you.
(p. 292) You’re going to be best able to design revolutionary products and features if you’re working on your own. Not on a committee. Not on a team. That means you’re probably going to have to do what I did. Do your projects as moonlighting, with limited money and limited resources. But man, it’ll be worth it in the end. It’ll be worth it if this is really, truly what you want to do–invent things. If you want to invent things that can change the world, and not just work at a corporation working on other people’s inventions, you’re going to have to work on your own projects.
When you’re working as your own boss, making decisions about what you’re going to build and how you’re going to go about it, making trade-offs as to features and qualities, it becomes a part of you. Like a child you love and want to support. You have huge motivation to create the best possible inventions–and you care about them with a passion you could never feel about an invention someone else ordered you to come up with.
And if you don’t enjoy working on stuff for yourself–with your own money and your own resources, after work if you have to–then you definitely shouldn’t be doing it!

. . .

It’s so easy to doubt yourself, and it’s especially easy to doubt yourself when what you’re working on is at odds with everyone else in the world who thinks they know the right way to do things. Sometimes you can’t prove whether you’re right or wrong. Only time can tell that. But if you believe in your own power to objectively reason, that’s a key to happiness. And a key to confidence. Another key I found to happiness was to realize that I didn’t have to disagree with someone and let it get all intense. If you believe in your own power to reason, you can just relax. You don’t have to feel the pressure to set out and convince anyone. So don’t sweat it! You have to trust your own designs, your own intuition, and your own understanding of what your invention needs to be.

Source:
Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.
(Note: Italics and centered ellipsis in original.)

Documenting Dangers of Growing Public Debt (and of Replacing History with Math)

RogoffReinhart2010-08-04.jpg

“Kenneth Rogoff and Carmen Reinhart at Ms. Reinhart’s Washington home. They started their book around 2003, years before the economy began to crumble.” Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 1) Like a pair of financial sleuths, Ms. Reinhart and her collaborator from Harvard, Kenneth S. Rogoff, have spent years investigating wreckage scattered across documents from nearly a millennium of economic crises and collapses. They have wandered the basements of rare-book libraries, riffled through monks’ yellowed journals and begged central banks worldwide for centuries-old debt records. And they have manually entered their findings, digit by digit, into one of the biggest spreadsheets you’ve ever seen.

Their handiwork is contained in their recent best seller, “This Time Is Different,” a quantitative reconstruction of hundreds of historical episodes in which perfectly smart people made perfectly disastrous decisions. It is a panoramic opus, both geographically and temporally, covering crises from 66 countries over the last 800 years.
The book, and Ms. Reinhart’s and Mr. Rogoff’s own professional journeys as economists, zero in on some of the broader shortcomings of their trade — thrown into harsh relief by economists’ widespread failure to anticipate or address the financial crisis that began in 2007.
“The mainstream of academic research in macroeconomics puts theoretical coherence and elegance first, and investigating the data second,” says Mr. Rogoff. For that reason, he says, much of the profession’s celebrated work “was not terribly useful in either predicting the financial crisis, or in assessing how it would play out once it happened.”
“People almost pride themselves on not paying attention to current events,” he says.
. . .
(p. 6) Although their book is studiously nonideological, and is more focused on patterns than on policy recommendations, it has become fodder for the highly charged debate over the recent growth in government debt.
To bolster their calls for tightened government spending, budget hawks have cited the book’s warnings about the perils of escalating public and private debt. Left-leaning analysts have been quick to take issue with that argument, saying that fiscal austerity perpetuates joblessness, and have been attacking economists associated with it.
. . .
The economics profession generally began turning away from empirical work in the early 1970s. Around that time, economists fell in love with theoretical constructs, a shift that has no single explanation. Some analysts say it may reflect economists’ desire to be seen as scientists who describe and discover universal laws of nature.
“Economists have physics envy,” says Richard Sylla, a financial historian at the Stern School of Business at New York University. He argues that Paul Samuelson, the Nobel laureate whom many credit with endowing economists with a mathematical tool kit, “showed that a lot of physical theories and concepts had economic analogs.”
Since that time, he says, “economists like to think that there is some physical, stable state of the world if they get the model right.” But, he adds, “there is really no such thing as a stable state for the economy.”
Others suggest that incentives for young economists to publish in journals and gain tenure predispose them to pursue technical wizardry over deep empirical research and to choose narrow slices of topics. Historians, on the other hand, are more likely to focus on more comprehensive subjects — that is, the material for books — that reflect a deeply experienced, broadly informed sense of judgment.
“They say historians peak in their 50s, once they’ve accumulated enough knowledge and wisdom to know what to look for,” says Mr. Rogoff. “By contrast, economists seem to peak much earlier. It’s hard to find an important paper written by an economist after 40.”

For the full story, see:
CATHERINE RAMPELL. “They Did Their Homework (800 Years of It).” The New York Times, SundayBusiness Section (Sun., July 4, 2010): 1 & 6.
(Note: the online version of the article is dated July 2, 2010.)
(Note: ellipses added.)

The reference for the book is:
Reinhart, Carmen M., and Kenneth Rogoff. This Time Is Different: Eight Centuries of Financial Folly. Princeton, NJ: Princeton University Press, 2009.

This-time-is-differentBK.jpg

Source of book image: http://www.paschaldonohoe.ie/wp-content/uploads/2010/02/This-time-is-different.jpg