Niall Ferguson: Don't Believe the Techno-Utopian Hype


Are you a technoptimist or a depressimist? This is the question I have been pondering after a weekend hanging out with some of the superstars of Silicon Valley.

I had never previously appreciated the immense gap that now exists between technological optimism, on the one hand, and economic pessimism, on the other. Silicon Valley sees a bright and beautiful future ahead. Wall Street and Washington see only storm clouds. The geeks think we're on the verge of The Singularity. The wonks retort that we're in the middle of a Depression.

Let's start with the technoptimists. Last Saturday I listened with fascination as a panel of tech titans debated the question: "Will science and technology produce more dramatic changes in solving the world's major problems over the next 25 years than have been produced over the last 25 years?"

They all thought so. We heard a description of what Google's Project Glass, the Internet-enabled spectacles, can already do. (For example, the spectacles can be used to check if another speaker is lying.) Next up: a search engine inside the brain itself. We heard that within the next 25 years, it will be possible to take 1,000-mile journeys by being fired through tubes. We also heard that biotechnology will deliver genetic "photocopies" of human organs that need replacing. And we were promised genetically engineered bugs, capable of excreting clean fuel. The only note of pessimism came from an eminent neuroscientist, who conceded that a major breakthrough in the prevention of brain degeneration was unlikely in the next quarter century.

For a historian, all this technoptimism is hard to swallow. The harsh reality, as far as I can see, is that the next 25 years (2013-2038) are highly unlikely to see more dramatic changes than science and technology produced in the last 25 (1987-2012).

For a start, the end of the Cold War and the Asian economic miracle provided one-off, nonrepeatable stimuli to the process of innovation in the form of a massive reduction in labor costs and therefore the price of hardware, not to mention all those ex-Soviet Ph.D.s who could finally do something useful. The IT revolution that began in the 1980s was important in terms of its productivity impact inside the U.S.—though this shouldn't be exaggerated—but we are surely now in the realm of diminishing returns (the symptoms of which are deflation plus underemployment due partly to automation of unskilled work).

The breakthroughs in medical science we can expect as a result of the successful mapping of the human genome probably will result in further extensions of the average lifespan. But if we make no commensurate advances in neuroscience—if we succeed only in protracting the life of the body, but not the mind—we will simply increase the number of dependent elderly.

My pessimism is supported by a simple historical observation. The achievements of the last 25 years were actually not that big a deal compared with what we did in the preceding 25 years, 1961-1986 (e.g. landing men on the moon). And the 25 years before that, 1935-1960, were even more impressive (e.g. splitting the atom). In the words of Peter Thiel, perhaps the lone skeptic within a hundred miles of Palo Alto: In our youth we were promised flying cars. What did we get? 140 characters.

Moreover, technoptimists have to explain why the rapid scientific and technological progress of those earlier periods coincided with massive conflict between armed ideologies. (Which was the most scientifically advanced society in 1932? Germany.)

So let me offer some simple lessons of history: More and faster information is not good in itself. Knowledge is not always the cure. And network effects are not always positive.

In many ways, the discussion I've just described followed logically from the previous week's widely reported spat between Peter Thiel and Eric Schmidt at the "Brainstorm Tech" conference in Aspen, where Schmidt took the technoptimistic line and Thiel responded with a classic depressimistic question: Why, if information technology is so great, have median wages stagnated in the nearly 40 years since 1973, whereas in the previous 40 years, between 1932 and 1972, they went up by a factor of six?
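(To put that sixfold rise in perspective, assuming steady compound growth over the period: if median wages grew at a constant annual rate g for 40 years, then (1 + g)^40 = 6, which gives g = 6^(1/40) − 1, or roughly 4.6 percent a year, every year, for four decades. The contrast with four subsequent decades of g near zero is the whole of Thiel's point.)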

By the same token, there was great technological progress during the 1930s. But it did not end the Depression. That took a world war. So could something comparably grim happen in our own time? Don't rule it out. Let's remind ourselves of the sequence of events: economic depression, crisis of democracy, road to war.

Talk to anyone who manages money these days and you will hear a doleful litany: the global economic slowdown, the persistence of unemployment, widening inequality, the problem of excessive debt, the declining effectiveness of monetary policy, and the looming fiscal cliff. Only last week, Ray Dalio—founder of the mega-hedge fund Bridgewater—spoke of a "dangerous dynamic ... making a self-reinforcing global decline more likely." With good reason, Dalio frets about the dangers of a "debt implosion" or currency breakup in Europe.

In the 1930s economic disaster undermined weak democracies all over the world. The equivalent phenomenon in our own time is the seeming inability of any Western politician to get reelected (a jinx Barack Obama may find it very hard to beat in November). That, however, is no more than what you'd expect in a time of depression. More troubling is the evidence that our basic faith in democracy is being corroded.

In the past week, I have heard a politician admit that the generous benefits that have been promised to retired public workers are in danger of bankrupting the country. I have heard a leading entrepreneur complain that the revolving door leading from the Pentagon to defense contractors is a subtle form of corruption. And I have heard more than one reputable academic assert that the Chinese one-party system offers real advantages over our own antiquated system of democracy.

This is certainly the Chinese view. Viewed from Beijing, Western "participatory democracy" is defective in at least three ways. It is anti-intellectual (politicians are condemned if they are too "professorial"). It is short-sighted, to the detriment of future generations. And, if democracy is applied in multiethnic societies, it can lead to discrimination and even violence against minorities.

Sadly, not all of this is wrong. Democracy works best with constituency-based, bicameral parliaments under the rule of law, and works less well with proportional representation and referendums. That is one reason Europe is in such a mess. Democracy is chronically short-sighted, especially if there are major elections every two years. With our increasing lifespans (life expectancy was just over 50 when the U.S. Constitution was written, compared with 78 today), a case can surely be made for longer terms in office (say, 50 percent longer) and therefore less frequent elections.

As for the problem of corruption, it is all too real. But it takes two forms: the power of cash-rich vested interests as exemplified by the lobbyists on K Street; and the growing share of public-sector employees and welfare recipients relative to direct taxpayers in the electorate. If anything, it is the second of these that has been pushing the Western world ever deeper into debt over the past decade.

In the 1930s script, democratic decay is followed by conflict. I am not one of those who expects Europe's monetary meltdown to end in war. Europeans are too old, disarmed, and pacifist for there to be more than a few desultory urban riots this summer. But I am much less confident about peace to Europe's south and east. North Africa and the Middle East now have the ingredients in place for a really big war: economic volatility, ethnic tension, a youthful population, and an empire in decline—in this case the American Empire.

Weary of warfare and waking up to the fossil-fuel riches made accessible by fracking, the United States is rapidly winding up four decades of hegemony in the Middle East. No one knows who or what will fill the vacuum. A nuclear Iran? A neo-Ottoman Turkey? Arab Islamists led by the Muslim Brotherhood? Whoever emerges on top, they are unlikely to get there without bloodshed.

It's a dangerous world. Ask anyone who works in the world of intelligence to list the biggest threats we face, and they'll likely include bioterrorism, cyber war, and nuclear proliferation. What these have in common, of course, is the way modern technology can empower radicalized (or just plain crazy) individuals and groups.

I wish I were a technoptimist. It must be heart-warming to believe that Facebook is ushering in a happy-clappy world where everybody "friends" everybody else and we all surf the net in peace (insert smiley face). But I'm afraid history makes me a depressimist. And no, there's not an app—or a gene—that can cure that.
