The Fateful Nineties

Christopher Caldwell at First Things

For Americans, the 1990s are both the most sharply defined and the most fuzzily understood of modern decades. The nineties began on 11/9/1989, with the breaching of the Berlin Wall by East Germans—a symbolic repudiation of communism and a glorious American victory in the Cold War. They ended on 9/11/2001, when al-Qaeda terrorists, most of them Saudi Arabians, flew two airplanes into the Twin Towers of the World Trade Center. President George W. Bush responded by launching the invasion of Iraq, which brought a historic military defeat and an even more consequential reputational one. At the start of the nineties, Americans seemed to possess unique insight into the principles on which modern economies and societies were built. At the end of the nineties, Americans were stunned to discover that the person with the best insight into their own country and its vulnerabilities was Osama bin Laden.

Something in the nineties had gone calamitously, tragically, but invisibly wrong. The United States had endured setbacks: the Los Angeles riots of 1992; various mid-decade standoffs, shoot-outs, and bombings, from Waco to Ruby Ridge to Oklahoma City; and the dot-com equities crash at century’s end. Yet there was scarcely an instant in the whole decade when the country’s strength, stability, and moral pre-eminence were questioned, at least in mainstream media outlets.

It was not as if nothing changed in the nineties—but almost all the changes seemed to make the position of the United States more secure. The country underwent the largest peacetime economic expansion in its history. The stock market boomed. Home ownership rose. The government showed more fiscal responsibility than it had in a generation, finishing the decade with annual budget surpluses. Government spending as a percentage of GDP fell to levels last seen in the 1960s. So did crime of all kinds.

Using computer networking technology devised by its military and refined by its scientists, bureaucrats, and hackers, the United States was managing the global transition to an information economy. The United States got to write the rules under which this transformation took place. That should have been a source of safety—but it turned out to be a source of peril. The Cold War victory, combined with a chance to redefine the economic relations that obtain among every human being on earth, was a temptation to Promethean excess. An exceptionally legalistic, hedonistic, and anti-traditional nation, the United States was poorly equipped to resist such a temptation. It misunderstood the victory it had won and the global reconstruction it was carrying out.

In a way, information won the Cold War. In the last two years of the Reagan administration, diplomatic channels to Moscow were wide open and Secretary of State George Shultz was developing a trusting relationship with Soviet leader Mikhail Gorbachev. One way in which Shultz wowed Gorbachev was by sharing data from analysts’ reports shown him by his friend, the former Citicorp chairman Walter Wriston. These reports mostly concerned the market for information: among other things, the effect on the finance industry of instantaneous fund transfers, the savings to be gained from replacing mined copper wires with fiber optic cables, and the declining cost of computing power. Wriston theorized that the increasing velocity of information was making longstanding ideas of Westphalian sovereignty impracticable in the West and longstanding means of party control impossible in the Eastern Bloc.

Wriston was wrong, as the example of twenty-first-century China would show. But soon after the Berlin Wall fell, some specialists were attributing that event to computing capacity. Economists had understood the Cold War this way for more than half a century. The so-called economic calculation problem had been noted in the 1930s by Friedrich von Hayek and Ludwig von Mises. The two Austrian economists argued that markets were an indispensable tool for pricing (and thus allocating) goods, and that their absence would introduce fatal inefficiencies into socialism. Hayek and Mises had long been thought to hold the losing intellectual hand. Now, it seemed, they had been vindicated.

In the last days of March 1990, Vytautas Landsbergis, the anti-communist parliamentary speaker of the Lithuanian Soviet Socialist Republic, was leading police and nationalist protesters in a standoff against Soviet army troops in Vilnius. The newly elected anti-communist president of Nicaragua, Violeta Chamorro, was taking over the country’s army from the Marxist Sandinistas who had built it. And one of the most distinguished followers of Hayek, James M. Buchanan, traveled to Australia to explain before a triumphalist audience of conservatives why all this stuff was happening. Buchanan, a Chicago-trained economist, pioneer of “public choice theory,” and Nobel Prize–winner in economics in 1986, would focus on three things: information, efficiency, and values. Because exchange is “complex,” Buchanan told the Australians, state planners are too far away from the action to “fully exploit the strictly localized information that emerges in the separate but interlinked markets.” Considering what we had come to know about the information carried in market prices, Buchanan was incredulous that so many had denied the superiority of free markets for so long.

Read decades later, his speech gives the sense that the triumph of free markets was on shakier intellectual ground than anyone understood at the time. Buchanan’s assumptions about “strictly localized” information—presumably from factories, shops, and households—reflected how the problem of gathering business information had been understood between the 1930s and 1990. But the internet would begin to draw a broad commercial public roughly three years later, and once it was up and running, almost no information would remain “strictly localized”—nor could it be kept private, except through measures that were themselves costly. A new tool for centralized, comprehensive, and efficient surveillance of market transactions was on its way—and with the invention of HTTP cookies, such information might simply be requisitioned, like grain stocked by Soviet peasants in the 1930s. Eventually, certain academic economists would suggest that, in enabling capitalism to triumph in practice, the personal computer had made socialism possible in principle. A new universe of economic possibilities was opening up.

And not just in the formerly communist world. Sweeping social change falls on the just and the unjust. Much as the putatively non-racist North had been altered by the Civil Rights Act of 1964 no less than the putatively racist South, so the putatively capitalist “free world” would be rocked by the lessons the socialist world was being taught about markets.

Buchanan naturally defended the market economy as more efficient. “It is now, in 1990, almost universally acknowledged that such an economy ‘works better’ than a socialized economy,” he told his Aussie listeners. “And the meaning of ‘works better’ is quite straightforward: the private-ownership, individualized economy produces a higher valued bundle of goods and services from the resource capacities available to the individuals in a politically organized community.” But here, too, Buchanan was out on a limb. In the academy, the meaning of “works better” was indeed as straightforward as he said. But in society, the meaning of “works better” was not straightforward at all. An economic system produces more than consumer products. It produces attitudes, traditions, hierarchies, and geographies. The efficiency of an economic system, like the efficiency of a grammar school curriculum or a marriage regime, might not be evident until decades or generations later.

Only at the end of his discussion did Buchanan address “values,” a word that better than any other links social behaviors to commodity prices. He addressed it in a way that was strange and a bit disturbing. He did not tell his listeners to put their noses to the grindstone, as an economist of his grandfather’s generation might have done. He said this:

The only proviso here is that the value scalar, the measure through which disparate goods and services are ultimately compared, must be that which emerges from the voluntary exchange process itself. If the value scalar is, itself, determined by the centralized socialist planners, there is, of course, no reason to think that the private ownership economy will ‘work better’ in generating more ‘value’ along this measure.

The values in a capitalist system, in other words, must be all capitalist. What sells is what’s right. If citizens try to import their traditions and sentiments into the economic system, the system will seize up and cease to work. That used to be the main argument of capitalism’s foes. Marx and Engels warned that, under capitalism, “all fixed, fast-frozen relations, with their train of ancient and venerable prejudices and opinions, are swept away.” That capitalism’s evangelists now insisted on this point looked like a bait-and-switch. It introduced a different argument than the one Westerners had thought they were having. Most of the momentum for embracing capitalism had arisen from specific discontents with socialism: the shoddiness of American cars once the United Auto Workers were in control; London left freezing whenever the United Mine Workers wanted a raise; the censorship of Leonid Brezhnev; high income taxes; mediocrity; standardization.

Except among Ayn Rand’s readers, the twentieth-century argument about social systems almost never rested on the case for unconstrained free markets (though a generation of entrepreneurs and boosters would later begin casting the argument that way). Just as a good many idealistic socialists in the 1930s wound up muttering to themselves in the Gulag, “I didn’t ask for this . . . I just didn’t like my boss,” so a good many Americans who had voted for Ronald Reagan in the 1980s would later mutter at their twenty-first-century rideshare jobs, “I didn’t ask for this . . . I just didn’t like the long lines at the DMV.” When, today, Western Europeans refer to the trente glorieuses or the Wirtschaftswunder, the thirty years they are calling “glorious” and “miraculous” are the social-democratic ones that followed World War II—certainly not the capitalist ones that followed the European Union’s 1992 Maastricht Treaty.

Buchanan’s invocation of values also revealed the capitalist system as fragile. Capitalism is the best system only so long as non-market principles don’t contaminate it. On its own, the capitalist system lacks the resources to defend itself against illiberal principles, should any be introduced. “With no overriding principle that dictates how an economy is to be organised,” he warned, “the political structure is open to maximal exploitation by the pressures of well-organised interests which seek to exploit the powers of the state to secure differential profits.” What few Americans understood at the time was that, since the Civil Rights Act of 1964, there already were illiberal principles at work at the heart of the American governmental machinery.

The nineties are remembered as a time of dizzying change, but the momentous stuff came into view only gradually. The early years of the decade had seemed an insipid aftermath to the banquet of optimism Reagan had served up. A recession in 1991, in a country that had grown unused to recessions, brought a collapse of the public’s faith in George H. W. Bush, a prohibitive favorite for reelection just months before. The way was cleared for the young governor of Arkansas. Bill Clinton is the protagonist of the American politics of the nineties; his predecessor, president during three of the ten years of the decade, is missing from most people’s memories of the time.

In the summer of 1994, Rwandans were still pouring across the border into Zaire after the recent genocide in their country. President Clinton, having failed to pass his controversial national health plan, was lobbying Congress to pass an assault weapons ban that might serve as a centerpiece in the upcoming midterm elections. It did—and played a big role in the Democrats’ historic fifty-four-seat loss. Meanwhile, the advance of computer networks was becoming central to Americans’ understanding of their economy and society. Four writers on technology (Esther Dyson, George Gilder, George Keyworth, and Alvin Toffler) had noticed that the internet was now “huge,” with a scarcely believable 2.2 million computers connected to it. Not themselves inventors but rather internet theorists and ideologues of long standing, they decided to write what they called a “Magna Carta for the Knowledge Age.”

It was a typical product of the time. Ambitious people, tipped off that a new era was dawning, volunteered to be the Thomas Jeffersons, even the John Lockes, of the information age, on the strength of a memo dashed off one afternoon after lunch or a neologism coined at a breakfast meeting. “The central event of the 20th century is the overthrow of matter,” the authors portentously began. “In technology, economics, and the politics of nations, wealth—in the form of physical resources—has been losing value and significance. The powers of mind are everywhere ascendant over the brute force of things.”

The Gilder-Dyson manifesto may have inspired the more famous “Declaration of the Independence of Cyberspace” by the countercultural activist John Perry Barlow, which made a splash when Barlow declaimed it at the Davos World Economic Forum: “Governments of the Industrial World, you weary giants of flesh and steel,” it ran in part, “I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.” Though celebrated in its time, Barlow’s declaration is today generally cited to comic effect.

Spoiler alert: The technological changes underway in the early Clinton years did not do away with matter. There was more of it every day, in fact! China, which had had 5.5 million motor vehicles in 1990, would have three times as many in 2000. The factory hub of Shenzhen began the decade with 1.7 million people and ended with more than 7 million, though Apple would not make it the center of its manufacturing operations until the following decade. But the manifesto authors didn’t see that. They were far from the places where matter was being extracted and assembled: the Nike factories in Vietnam, the TSMC chip factory in Taiwan, not to mention the Chilean lithium mines and the hitherto unreachable crannies out of which fossil fuels were being slant-drilled and hydraulically fractured. The United States had gone to war against Iraq in 1990 rather than accept Saddam Hussein’s contention that he had invaded Kuwait because it had slant-drilled its way into oil deposits that lay under Iraqi soil. In 1991, with the opening of the Barnett Shale, a colossal “tight gas” deposit just outside of Fort Worth, fracking got underway in earnest. It was there, in 1997 and 1998, that many of the techniques were developed that would turn the United States into a net energy exporter in the twenty-first century.

On the question of whether we were living in a material world, Madonna was apparently better informed than Esther Dyson. What was going on was not the volatilization of matter but the internationalization of the division of labor, such that the United States could profit from the production of matter without (in most cases) suffering from its proximity. Global energy-related CO2 emissions rose by 12 percent during the nineties, but in the United States the Environmental Protection Agency boasted of steep drops in the air concentration of virtually all pollutants.

The writers of the cyber–Magna Carta held the then-unanimous opinion that the internet would be a revolutionarily open place: “As America continued to explore new frontiers,” they enthused, “from the Northwest Territory to the Oklahoma land-rush—it consistently returned to this fundamental principle of rights, reaffirming, time after time, that power resides with the people. Cyberspace is the latest American frontier.” This assertion would prove mostly wrong. Many of the cocksure policy predictions in the document were lifted from Toffler’s bestseller, The Third Wave, written during the Carter administration, nearly two decades earlier: “The reality is that a Third Wave government will be vastly smaller (perhaps by 50 percent or more) than the current one.” The authors also believed that censorship was not only being eliminated but becoming unthinkable:

For government to insist on the right to peer into every computer . . . for government to influence which political viewpoints would be carried over the airwaves . . . might have made sense in a Second Wave world. . . . [It makes] no sense at all in the Third Wave.

Read the rest (subscription probably required)
