The Real World War IV

Andrew J. Bacevich

America’s political and military efforts in the Middle East go by many names: War on terror. Clash of civilizations. Democratization. But our author argues that all of these undertakings grow from a fateful decision made decades ago that the American way of life requires unlimited access to foreign oil.


In the eyes of its most impassioned supporters, the global war on terror constitutes a de facto fourth world war: The conflict that erupted with the attacks on the World Trade Center and the Pentagon is really a sequel to three previous conflicts that, however different from one another in terms of scope and duration, have defined contemporary history.

According to this interpretation, most clearly articulated by the neoconservative thinker Norman Podhoretz in the pages of Commentary magazine, the long twilight struggle between communism and democratic capitalism qualifies as the functional equivalent of World War I (1914–18) and World War II (1939–45). In retrospect, we can see that the East-West rivalry commonly referred to as the Cold War was actually World War III (1947–89). After a brief interval of relative peace, corresponding roughly to the 1990s, a fourth conflict, comparable in magnitude to the previous three, erupted on September 11, 2001. This fourth world war promises to continue indefinitely.

Classifying the war on terror as World War IV offers important benefits. It fits the events of September 11 and thereafter into a historical trope familiar to almost all Americans, and thereby offers a reassuring sense of continuity: We’ve been here before; we know what we need to do; we know how it ends. By extension, the World War IV construct facilitates efforts to mobilize popular support for U.S. military actions undertaken in pursuit of final victory. It also ratifies the claims of federal authorities, especially those in the executive branch, who insist on exercising “wartime” prerogatives by expanding the police powers of the state and circumscribing constitutional guarantees of due process. Further, it makes available a stock of plausible analogies to help explain the otherwise inexplicable—the dastardly events of September 11, 2001, for example, are a reprise of the dastardly surprise of December 7, 1941. Thus, the construct helps to preclude awkward questions. It disciplines.

But it also misleads. Lumping U.S. actions since 9/11 under the rubric of World War IV can too easily become an exercise in sleight of hand. According to hawks such as Podhoretz, the chief defect of U.S. policy before 9/11 was an excess of timidity. America’s actual problem has been quite the reverse.

The key point is this. At the end of the Cold War, Americans said “yes” to military power. Indeed, ever since Vietnam, Americans have evinced a deepening infatuation with armed force, soldiers, and military values. By the end of the 20th century, the skepticism about arms and armies that informed the American experiment from its founding had vanished. Political leaders, liberals and conservatives alike, became enamored of military might. Militarism insinuated itself into American life.

The ensuing affair has had a heedless, Gatsby-like aspect, a passion pursued in utter disregard of any likely consequences. Few in power have openly considered whether valuing military power for its own sake or cultivating permanent global military superiority might be at odds with American principles.

To the extent that some Americans are cognizant of a drift toward militarism by their country, the declaration of World War IV permits them to suppress any latent anxiety about that tendency. After all, according to precedent, a world war—by definition, a conflict thrust upon the United States—changes everything. Responsibility for world wars lies with someone else: with Germany in 1917, Japan in 1941, or the Soviet Union after 1945. Designating the several U.S. military campaigns initiated in the aftermath of 9/11 as World War IV effectively absolves the United States of accountability for anything that went before. Blame lies elsewhere: with Osama bin Laden and Al Qaeda, with Saddam Hussein and his Baath Party thugs, with radical Islam. America’s responsibility is to finish what others started.

But this militaristic predisposition, evident in the transformation of American thinking about soldiers, the armed services, and war itself since Vietnam, cannot of itself explain the rising tide of American bellicosity that culminated in March 2003 with the invasion of Iraq. We must look as well to national interests and, indeed, to the ultimate U.S. interest, which is the removal of any obstacles or encumbrances that might hinder the American people in their pursuit of happiness ever more expansively defined. Rather than timidity or trepidation, it is unabashed confidence in the strength of American arms, combined with an unswerving determination to perfect American freedom, that has landed us in our present fix.

During the 1980s and 1990s, this combustible mix produced a shift in the U.S. strategic center of gravity, overturning geopolitical priorities that had long appeared sacrosanct. A set of revised strategic priorities emerged, centered geographically in the energy-rich Persian Gulf but linked inextricably to the assumed prerequisites for sustaining American freedom at home. A succession of administrations, Republican and Democratic, opted for armed force as the preferred means to satisfy those new priorities. In other words, a new set of strategic imperatives, seemingly conducive to a military solution, and a predisposition toward militarism together produced the full-blown militarization of U.S. policy so much in evidence since 9/11.

The convergence between preconditions and interests suggests an altogether different definition of World War IV—a war that did not begin on 9/11, does not have as its founding purpose the elimination of terror, and does not cast the United States as an innocent party. This alternative conception of a fourth world war constitutes not a persuasive rationale for the exercise of U.S. military power in the manner pursued by the administration of George W. Bush, but the definitive expression of the dangers posed by the new American militarism. Waiting in the wings are World Wars V and VI, to be justified, inevitably, by the ostensible demands of freedom.

Providing a true account of World War IV requires that it first be placed in its correct relationship to World War III, the Cold War. As the great competition between the United States and the Soviet Union slips further into the past, scholars work their way toward an ever more fine-grained interpretation of its origins, conduct, and implications. Yet as far as public perceptions of the Cold War are concerned, these scholars’ diligence goes largely unrewarded. When it comes to making sense of recent history, the American people, encouraged by their political leaders, have shown a demonstrable preference for clarity rather than nuance. Even as the central events of the Cold War recede into the distance, the popular image of the larger drama in which these events figured paradoxically sharpens.

“Cold War” serves as a sort of self-explanatory, all-purpose label, encompassing the entire period from the mid-1940s through the late 1980s. And since what is past is prologue, this self-contained, internally coherent, authoritative rendering of the recent past is ideally suited to serve as a template for making sense of events unfolding before our eyes.

From a vantage point midway through the first decade of the 21st century, the commonly accepted metanarrative of our time consists of three distinct chapters. The first, beginning where World War II leaves off, recounts a period of trial and tribulation lasting several decades but ending in an unambiguous triumph for the United States. The next describes a short-lived “post–Cold War era,” a brief, dreamy interlude abruptly terminated by 9/11. The second chapter gives way to a third, still in the process of being written but expected to replicate in broad outlines the first—if only the United States will once again rise to the occasion. This three-part narrative possesses the virtues of simplicity and neatness, but it is fundamentally flawed. Perhaps worst of all, it does not alert Americans to the full dimensions of their present-day predicament. Instead, the narrative deceives them. It would be far more useful to admit to a different and messier parsing of the recent past.

For starters, we should recognize that, far from being a unitary event, the Cold War occurred in two distinct phases. The first, defined as the period of Soviet-American competition that could have produced an actual World War III, essentially ended by 1963. In 1961, by acquiescing in the erection of the Berlin Wall, Washington affirmed its acceptance of a divided Europe. In 1962, during the Cuban Missile Crisis, Washington and Moscow contemplated the real prospect of mutual annihilation, blinked more or less simultaneously, and tacitly agreed to preclude any recurrence of that frightening moment. A more predictable, more stable relationship ensued, incorporating a certain amount of ritualistic saber rattling but characterized by careful adherence to a well-established set of routines and procedures.

Out of stability came opportunities for massive stupidity. During the Cold War’s second phase, from 1963 to 1989, both the major protagonists availed themselves of these opportunities by pursuing inane adventures on the periphery. In the 1960s, of course, Americans plunged into Vietnam, with catastrophic results. Beginning in 1979, the Soviets impaled themselves on Afghanistan, with results that proved altogether fatal. Whereas the inherent resilience of democratic capitalism enabled the United States to repair the wounds it had inflicted on itself, the Soviet political economy lacked recuperative powers. During the course of the 1980s, an already ailing Soviet empire became sick unto death.

The crucial developments hastening the demise of the Soviet empire emerged from within. When the whole ramshackle structure came tumbling down, Andrei Sakharov, Václav Havel, and Karol Wojtyla, the Polish prelate who became Pope John Paul II, could claim as much credit for the result as Ronald Reagan, if not more. The most persuasive explanation for the final outcome of the Cold War is to be found in Soviet ineptitude, in the internal contradictions of the Soviet system, and in the courage of the dissidents who dared to challenge Soviet authority.

In this telling of the tale, the Cold War remains a drama of compelling moral significance. But shorn of its triumphal trappings, the tale has next to nothing to say about the present-day state of world affairs. In a post-9/11 world, it possesses little capacity either to illuminate or to instruct. To find in the recent past an explanation of use to the present requires an altogether different narrative, one that resurrects the largely forgotten or ignored story of America’s use of military power for purposes unrelated to the Soviet-American rivalry.

The fact is that, even as the Cold War was slowly reaching its denouement, World War IV was already under way—indeed, had begun two full decades before September 2001. So World Wars III and IV consist of parallel rather than sequential episodes. They evolved more or less in tandem, with the former overlaid on, and therefore obscuring, the latter.

The real World War IV began in 1980, and Jimmy Carter, of all people, declared it. To be sure, Carter acted only under extreme duress, prompted by the irrevocable collapse of a policy to which he and his seven immediate predecessors had adhered—specifically, the arrangements designed to guarantee the United States a privileged position in the Persian Gulf. For Cold War–era U.S. policymakers, preoccupied with Europe and East Asia as the main theaters of action, the gulf had figured as something of a sideshow before 1980. Jimmy Carter changed all that, thrusting it into the uppermost tier of U.S. geopolitical priorities.

From 1945 through 1979, the aim of U.S. policy in the gulf region had been to ensure stability and American access, but to do so in a way that minimized overt U.S. military involvement. Franklin Roosevelt had laid down the basic lines of this policy in February 1945 at a now-famous meeting with King Abd al-Aziz Ibn Saud of Saudi Arabia. Henceforth, Saudi Arabia could count on the United States to guarantee its security, and the United States could count on Saudi Arabia to provide it preferential treatment in exploiting the kingdom’s vast, untapped reserves of oil.

From the 1940s through the 1970s, U.S. strategy in the Middle East adhered to the military principle known as economy of force. Rather than establish a large presence in the region, Roosevelt’s successors sought to achieve their objectives in ways that entailed a minimal expenditure of American resources and, especially, U.S. military power. From time to time, when absolutely necessary, Washington might organize a brief show of force—in 1946, for example, when Harry Truman ordered the USS Missouri to the eastern Mediterranean to warn the Soviets to cease meddling in Turkey, or in 1958, when Dwight Eisenhower sent U.S. Marines into Lebanon for a short-lived, bloodless occupation—but these modest gestures proved the exception rather than the rule.

The clear preference was for a low profile and a hidden hand. Although by no means averse to engineering “regime change” when necessary, the United States preferred covert action to the direct use of force. To police the region, Washington looked to surrogates—British imperial forces through the 1960s, and, once Britain withdrew from “east of Suez,” the shah of Iran. To build up the indigenous self-defense (or regime defense) capabilities of select nations, it arranged for private contractors to provide weapons, training, and advice. The Vinnell Corporation’s ongoing “modernization” of the Saudi Arabian National Guard (SANG), a project now well over a quarter-century old, remains a prime example.

By the end of 1979, however, two events had left this approach in a shambles. The first was the Iranian Revolution, which sent the shah into exile and installed in Tehran an Islamist regime adamantly hostile to the United States. The second was the Soviet invasion of Afghanistan, which put the Red Army in a position where it appeared to pose a direct threat to the entire Persian Gulf—and hence to the West’s oil supply.

Faced with these twin crises, Jimmy Carter concluded that treating the Middle East as a secondary theater, ancillary to the Cold War, no longer made sense. A great contest for control of the region had been joined. Rejecting out of hand any possibility that the United States might accommodate itself to the changes afoot in the Persian Gulf, Carter claimed for the United States a central role in determining exactly what those changes would be. In January 1980, to forestall any further deterioration of the U.S. position in the gulf, he threw the weight of American military power into the balance. In his State of the Union address, the president enunciated what became known as the Carter Doctrine. “An attempt by any outside force to gain control of the Persian Gulf region,” he declared, “will be regarded as an assault on the vital interests of the United States of America, and such an assault will be repelled by any means necessary, including military force.”

From Carter’s time down to the present day, the doctrine bearing his name has remained sacrosanct. As a consequence, each of Carter’s successors has expanded the level of U.S. military involvement and operations in the region. Even today, American political leaders cling to the belief that skillful application of military power will enable the United States to decide the fate not simply of the Persian Gulf proper but of the entire greater Middle East. This gigantic project, begun in 1980 and now well into its third decade, is the true World War IV.

What prompted Jimmy Carter, the least warlike of all recent U.S. presidents, to take this portentous step? The Pentagon’s first Persian Gulf commander, Lieutenant General Robert Kingston, offered a simple answer when he said that his basic mission was “to assure the unimpeded flow of oil from the Arabian Gulf.” But General Kingston was selling his president and his country short. What was true of the three other presidents who had committed the United States to world wars—Woodrow Wilson, FDR, and Truman—remained true in the case of Carter and World War IV as well. The overarching motive for action was preservation of the American way of life.

By the beginning of 1980, a chastened Jimmy Carter had learned a hard lesson: It was not the prospect of making do with less that sustained American-style liberal democracy, but the promise of more. Carter had come to realize that what Americans demanded from their government was freedom, defined as more choice, more opportunity, and, above all, greater abundance, measured in material terms. That abundance depended on assured access to cheap oil—and lots of it.

In enunciating the Carter Doctrine, the president was reversing course, effectively renouncing his prior vision of a less materialistic, more self-reliant democracy. Just six months earlier, this vision had been the theme of a prescient, but politically misconceived, address to the nation, instantly dubbed by pundits the “Crisis of Confidence” speech, though, in retrospect, perhaps better called “The Road Not Taken.”

Carter’s short-lived vision emerged from a troubled context. By the third year of his presidency, economic conditions as measured by postwar standards had become dire. The rates of inflation and unemployment were both high. The prime lending rate was 15 percent and rising. Trends in both the federal deficit and the trade balance were sharply negative. Conventional analysis attributed U.S. economic woes to the nation’s growing dependence on increasingly expensive foreign oil.

In July 1979, Carter already anticipated that a continuing and unchecked thirst for imported oil was sure to distort U.S. strategic priorities, with unforeseen but adverse consequences. (When Carter spoke, the United States was importing approximately 43 percent of its annual oil requirement; today it imports 56 percent.) He feared the impact of that distortion on an American democracy still reeling from the effects of the 1960s. So on July 15 he summoned his fellow citizens to change course, to choose self-sufficiency and self-reliance—and therefore true independence. But the independence was to come at the cost of collective sacrifice and lowered expectations.

Carter spoke that night of a nation facing problems “deeper than gasoline lines or energy shortages, deeper even than inflation or recession.” The fundamental issue, in Carter’s view, was that Americans had turned away from all that really mattered. In a nation once proud of hard work among strong, religious families and close-knit communities, too many Americans had come to worship self-indulgence and consumption. What you owned rather than what you did had come to define human identity. But according to Carter, owning things and consuming things did not satisfy our longing for meaning. Americans were learning that piling up goods could not fill the emptiness of lives devoid of real purpose.

This moral crisis had brought the United States to a historic turning point. Either Americans could persist in pursuing “a mistaken idea of freedom” based on “fragmentation and self-interest” and inevitably “ending in chaos and immobility,” or they could opt for “true freedom,” which Carter described as “the path of common purpose and the restoration of American values.”

How the United States chose to deal with its growing reliance on foreign oil would determine which of the two paths it followed. Energy dependence, according to the president, posed “a clear and present danger” to the nation, threatening the nation’s security as well as its economic well-being. Dealing with this threat was “the standard around which we can rally.” “On the battlefield of energy,” declared Carter, “we can seize control again of our common destiny.”

How to achieve this aim? In part, by restricting oil imports, investing in alternative sources, limiting the use of oil by the nation’s utilities, and promoting public transportation. But Carter placed the larger burden squarely in the lap of the American people. The hollowing out of American democracy required a genuinely democratic response. “There is simply no way to avoid sacrifice,” he insisted, calling on citizens as “an act of patriotism” to lower thermostats, observe the highway speed limit, use carpools, and “park your car one extra day per week.”

Although Carter’s stance was relentlessly inward looking, his analysis had important strategic implications. To the extent that “foreign oil” refers implicitly to the Persian Gulf—as it did then and does today—Carter was in essence proposing to annul the growing strategic importance attributed to that region. He sensed intuitively that a failure to reverse the nation’s energy dependence was sure to draw the United States ever more deeply into the vortex of Persian Gulf politics, which, at best, would distract attention from the internal crisis that was his central concern, but was even more likely to exacerbate it.

But if Carter was prophetic when it came to the strategic implications of growing U.S. energy dependence, his policy prescription reflected a fundamental misreading of his fellow countrymen. Indeed, as Garry Wills has observed, given the country’s propensity to define itself in terms of growth, it triggered “a subtle panic [and] claustrophobia” that Carter’s political adversaries wasted no time in exploiting. By January 1980, it had become evident that any program summoning Americans to make do with less was a political nonstarter. The president accepted this verdict. The promulgation of the Carter Doctrine signaled his capitulation.

Carter’s about-face did not achieve its intended political purpose of preserving his hold on the White House—Ronald Reagan had already tagged Carter as a pessimist, whose temperament was at odds with that of the rest of the country—but it did set in motion a huge shift in U.S. military policy, the implications of which gradually appeared over the course of the next two decades. Critics might cavil that the militarization of U.S. policy in the Persian Gulf amounted to a devil’s bargain, trading blood for oil. Carter saw things differently. On the surface the exchange might entail blood for oil, but beneath the surface the aim was to guarantee the ever-increasing affluence that underwrites the modern American conception of liberty. Without exception, every one of Carter’s successors has tacitly endorsed this formulation. Although the result was not fully apparent until the 1990s, changes in U.S. military posture and priorities gradually converted the gulf into the epicenter of American grand strategy and World War IV’s principal theater of operations.

“Even if there were no Soviet Union,” wrote the authors of NSC-68, the spring 1950 U.S. National Security Council document that became the definitive statement of America’s Cold War grand strategy, “we would face the great problem of the free society, accentuated many fold in this industrial age, of reconciling order, security, the need for participation, with the requirement of freedom. We would face the fact that in a shrinking world the absence of order among nations is becoming less and less tolerable.” Some three decades later, with the Soviet Union headed toward oblivion, the great problem of the free society to which NSC-68 alluded had become, if anything, more acute. But conceiving the principles to guide U.S. policy turned out to be a more daunting proposition in World War IV than it had been during any of the three previous world wars. Throughout the 1980s and 1990s, U.S. policymakers grappled with this challenge, reacting to crises as they occurred and then insisting after the fact that their actions conformed to some larger design. In fact, only after 9/11 did a fully articulated grand strategy take shape. George W. Bush saw the antidote to intolerable disorder as the transformation of the greater Middle East through the sustained use of military power.

Further complicating the challenge of devising a strategy for World War IV was the fundamental incompatibility of two competing U.S. interests in the region. The first was a steadily increasing dependence on oil from the Middle East. Dependence meant vulnerability, as the crippling oil shocks of the 1970s, administered by the Organization of Petroleum Exporting Countries (OPEC), amply demonstrated. As late as World War II, the United States had been the world’s Saudi Arabia, producing enough oil to meet its own needs and those of its friends and allies. By the end of the 20th century, with Americans consuming one out of every four barrels of oil produced worldwide, the remaining U.S. reserves accounted for less than 2 percent of the world’s total. Projections showed the leverage of Persian Gulf producers mushrooming in the years to come, with oil exports from the region expected to account for between 54 and 67 percent of world totals by 2020.

The second U.S. interest in the region, juxtaposed against Arab oil, was Israel. America’s commitment to the security of the Jewish state complicated U.S. efforts to maintain cordial relations with oil-exporting states in the Persian Gulf. Before the Six-Day War (1967), the United States had tried to manage this problem by supporting Israel’s right to exist but resisting Israeli entreaties to forge a strategic partnership. After 1967, that changed dramatically. The United States became Israel’s preeminent international supporter and a generous supplier of economic and military assistance.

The Arab-Israeli conflict could not be separated from World War IV, but figuring out exactly where Israel fit in the larger struggle proved a perplexing problem for U.S. policymakers. Was World War IV a war of blood-for-oil-for-freedom in which Israel figured, at best, as a distraction and, at worst, as an impediment? Or was it a war of blood-for-oil-for-freedom in which the United States and Israel stood shoulder to shoulder in a common enterprise? For the first 20 years of World War IV, the American response to these questions produced a muddle.

During his final year in office, then, Carter initiated America’s new world war. Through his typically hapless and ineffectual effort to rescue the Americans held hostage in Iran, he sprinkled the first few driblets of American military power onto the surface of the desert, where they vanished without a trace. The rescue effort, dubbed Desert One, remained thereafter the gold standard for how not to use force, but it by no means curbed America’s appetite for further armed intervention in the region. Ronald Reagan gave the spigot labeled “military power” a further twist—and in so doing, he opened the floodgates. Although Carter declared World War IV, the war was fully, if somewhat haphazardly, engaged only on Reagan’s watch.

Reagan himself professed to be oblivious to the war’s existence. After all, his immediate preoccupation was with World War III. For public consumption, the president was always careful to justify the U.S. military buildup of the 1980s as a benign and defensive response to Cold War imperatives. All that the United States sought was to be at peace. “Our country has never started a war,” Reagan told the annual Veterans of Foreign Wars convention in 1983. “Our sole objective is deterrence, the strength and capability it takes to prevent war.” “We Americans don’t want war and we don’t start fights,” he insisted on another occasion. “We don’t maintain a strong military force to conquer or coerce others.”

This was, of course, at least 50 percent bunkum. During the Reagan era, with the first stirrings of revived American militancy, defense and deterrence seldom figured as the operative principles. In fact, the American military tradition has never viewed defense as anything other than a pause before seizing the initiative and taking the fight to the enemy.

Partisan critics saw Reagan’s muscle flexing as the actions of a reckless ideologue unnecessarily stoking old Cold War tensions. Viewing events in relation to Vietnam and the Cuban Missile Crisis, they forecast dreadful consequences. Reagan’s defenders, then and later, told a different story: Having intuitively grasped that the Soviet system was in an advanced state of decay, Reagan proceeded with skill and dexterity to exploit the system’s economic, technological, and moral vulnerabilities; the ensuing collapse of the Soviet empire proved conclusively that Reagan had gotten things right. Today neither interpretation, Reagan as trigger-happy cold warrior or Reagan as master strategist, is especially persuasive. Assessing the military record of the Reagan years from a post–9/11 perspective yields a set of different and arguably more relevant insights.

Looking back, we can see that the entire Reagan era was situated on the seam between a world war that was winding down and another that had begun but was not yet fully comprehended. Although preoccupied with waging the Cold War, Reagan and his chief advisers, almost as an afterthought, launched four forays into the Islamic world, with mixed results: the insertion of U.S. Marine “peacekeepers” into Lebanon, culminating in the Beirut bombing of October 1983; clashes with Libya, culminating in punitive U.S. strikes against targets in Tripoli and Benghazi in April 1986; the so-called tanker war of 1984–88, culminating in the commitment of U.S. forces to protect the flow of oil from the Persian Gulf; and American assistance throughout the 1980s to Afghan “freedom fighters,” culminating in the Soviet army’s ouster from Afghanistan. These actions greatly enhanced the ability of the United States to project military power into the region, but they also emboldened the enemy and contributed to the instability that drew Reagan’s successors more deeply into the region.

The nominal stimulus for action in each case varied. In Lebanon, the murkiest of the four, Reagan ordered marines ashore at the end of September 1982 “to establish an environment which will permit the Lebanese Armed Forces to carry out their responsibilities in the Beirut area.” This was a daunting proposition, given that Lebanon, divided by a civil war and variously occupied by the Syrian army, the Israeli Defense Forces, and (until its recent eviction) the Palestinian Liberation Organization, possessed neither an effective military nor an effective government and had little prospect of acquiring either. Vague expectations that a modest contingent of U.S. peacekeepers camped in Beirut might help restore stability to Lebanon motivated Reagan to undertake this risky intervention, which ended disastrously when a suicide bomber drove into the marine compound, killing 241 Americans.

In the case of Libya, Muammar al-Qaddafi’s declared intention of denying the U.S. Sixth Fleet access to the Gulf of Sidra, off Libya’s coast, had led to preliminary skirmishing in 1981 and again in March 1986. But it was Qaddafi’s support for terrorism and, especially, alleged Libyan involvement in the bombing of a Berlin disco frequented by GIs that prompted Reagan to order retaliation.

In the tanker war, Reagan was reacting to attacks perpetrated by both Iran and Iraq against neutral shipping in the Persian Gulf. Since 1980, the two nations had been locked in an inconclusive conflict. As that struggle spilled over into the adjacent waters of the gulf, it reduced the availability of oil for export, drove up insurance rates, and crippled merchant shipping. An Iraqi missile attack on the USS Stark on May 17, 1987, brought things to a head. Iraq claimed that the incident, which killed 37 sailors, had been an accident, and offered compensation. The Reagan administration used the Stark episode to blame Iran for the escalating violence. In short order, Kuwaiti supertankers were flying the Stars and Stripes, and U.S. forces were conducting a brisk campaign to sweep Iranian air and naval units out of the gulf.

In the case of Afghanistan, Reagan built on a program already in existence but hidden from public view. In July 1979, the Carter administration had agreed to provide covert assistance to Afghans resisting the pro-Soviet regime in Kabul. According to Zbigniew Brzezinski, Carter’s national security adviser, the aim was to induce a Soviet military response, thereby “drawing the Russians into the Afghan trap.” When the Soviets did invade, in December 1979, they became bogged down in a guerrilla war against the U.S.-backed mujahideen. Reagan inherited this project, initially sustained it, and then, in 1985, greatly stepped up the level of U.S. support for the Afghan resistance.

At first glance, these four episodes seem to be all over the map, literally and in terms of purpose, means, and outcome. Contemporaneous assessments tended to treat each in isolation from the others and to focus on near-term outcomes. “After the attack on Tripoli,” Reagan bragged, “we didn’t hear much more from Qaddafi’s terrorists.” Nonsense, replied critics, pointing to the suspected Libyan involvement (since confirmed) in the bombing of Pan American flight 103 in December 1988 and in the midair destruction of a French DC-10 nine months later. When a ceasefire in 1988 ended the fighting between Iran and Iraq, Secretary of Defense Caspar Weinberger assessed U.S. involvement in the tanker war as a major achievement. “We had now clearly won,” he wrote in 1990. With several hundred thousand U.S. troops deploying to the gulf that very same year to prepare for large-scale war, Weinberger’s claims of victory seemed, at best, premature.

To be sure, Reagan himself labored to weave together a comprehensive rationale for the various military actions he ordered, but the result amounted to an exercise in mythmaking. To listen to him, all these disparate threats—Soviet leaders pursuing global revolution, fundamentalists bent on propagating Islamic theocracies, Arab fascists such as Libya’s Qaddafi and Syria’s Hafez al-Assad, fanatical terrorists such as Abu Nidal—morphed into a single conspiracy. To give way to one element of that conspiracy was to give way to all, so the essential thing was to hold firm everywhere for peace.

Further muddying the waters were administration initiatives seemingly predicated on an assumption that no such overarching conspiracy against peace actually existed, or at least that selective U.S. collaboration with evildoers was permissible. The Reagan administration’s notorious “tilt” toward Saddam Hussein in the Iran-Iraq War, offering intelligence and commercial credits to the region’s foremost troublemaker—perhaps the final U.S. effort to enlist a proxy to secure its Persian Gulf interests—provides one example. Such opportunism made a mockery of Reagan’s windy pronouncements regarding America’s role as peacemaker and fed suspicions that the president’s rhetoric was actually intended to divert attention from his administration’s apparent strategic disarray.

Considered from a post-9/11 vantage point, however, Reagan-era uses of force in Lebanon, Libya, Afghanistan, and the tanker war do cohere, at least in a loose sort of way. First, and most notably, all four initiatives occurred in the greater Middle East, hitherto not the site of frequent U.S. military activity. Second, none of the four episodes can be fully understood except in relation to America’s growing dependence on imported oil. Although energy considerations did not drive U.S. actions in every instance, they always loomed in the background. Lebanon, for example, was not itself an oil exporter, but its woes mattered to the United States because instability there threatened to undermine the precarious stability of the region as a whole.

The four episodes constituting Reagan’s Islamic quartet were alike in one other way. Although each yielded a near-term outcome that the administration touted as conclusive, the actual results turned out to be anything but. Rather, each of the four pointed toward ever-deepening American military engagement.

The true significance of Reagan’s several interventions in the Islamic world lies not in the events themselves but in the response they evoked from the U.S. national security apparatus. A consensus emerged that, in the list of pressing U.S. geopolitical concerns, the challenges posed by the politically volatile, energy-rich world of Islam were eclipsing all others, including the size of the Soviet nuclear arsenal and the putative ambitions of the Soviet politburo. Given the imperative of meeting popular expectations for ever-greater abundance (which meant importing ever-larger quantities of oil)—Jimmy Carter’s one-term presidency having demonstrated the political consequences of suggesting a different course—the necessary response was to put the United States in a position to determine the fate of the Middle East. That meant forces, bases, and infrastructure. Only by enjoying unquestioned primacy in the region could the government of the United States guarantee American prosperity—and thus American freedom.

From the outset, dominance was the aim and the driving force behind U.S. actions in World War IV—not preventing the spread of weapons of mass destruction, not stemming the spread of terror, certainly not liberating oppressed peoples or advancing the cause of women’s rights. The prize was mastery over a region that leading members of the American foreign-policy elite, of whatever political persuasion, had concluded was critically important to the well-being of the United States. The problem, at its very core, demanded a military solution.

In March 1984, Donald Rumsfeld, out of power but serving as a Reagan administration troubleshooter, told Secretary of State George Shultz that Lebanon was a mere “sideshow.” The main show was the Persian Gulf; instability there “could make Lebanon look like a taffy pull.” According to Shultz’s memoir, Turmoil and Triumph (1993), Rumsfeld worried that “we are neither organized nor ready to face a crisis there.” In fact, the effort to reorganize was already under way. And here is where Reagan made his most lasting contribution to the struggle to which Jimmy Carter had committed the United States.

Seven specific initiatives figured prominently in the Reagan administration’s comprehensive effort to ramp up America’s ability to wage World War IV:

• The upgrading in 1983 of the Rapid Deployment Joint Task Force, the Persian Gulf intervention force created by Carter after the Soviet incursion into Afghanistan, to the status of a full-fledged regional headquarters, U.S. Central Command.

• The accelerated conversion of Diego Garcia, a tiny British-owned island in the Indian Ocean, from a minor U.S. communications facility into a major U.S. forward support base.

• The establishment of large stocks of supplies and equipment, preloaded on ships and positioned to facilitate the rapid movement of U.S. combat forces to the Persian Gulf.

• The construction or expansion of airbases, ports, and other fixed locations required to receive and sustain large-scale U.S. expeditionary forces in Egypt, Saudi Arabia, Oman, Kenya, Somalia, and other compliant states.

• The negotiation of overflight rights and agreements to permit U.S. military access to airports and other facilities in Morocco, Egypt, and elsewhere in the region to support the large-scale introduction of U.S. troops.

• The refinement of war plans and the development of exercise programs to acclimate U.S. forces to the unfamiliar and demanding desert environment.

• The redoubling of efforts to cultivate client states through arms sales and training programs, the latter administered either by the U.S. military or by American-controlled private contractors employing large numbers of former U.S. military personnel.

By the time Ronald Reagan retired from office, the skids had been greased. The national security bureaucracy was well on its way to embracing a highly militarized conception of how to deal with the challenges posed by the Middle East. Giving Reagan his due requires an appreciation of the extent to which he advanced the reordering of U.S. national security priorities that Jimmy Carter had barely begun. Reagan’s seemingly slapdash Islamic pudding turned out to have a theme after all.

Those who adjudge the present World War IV to be necessary and winnable will see in Reagan’s record much to commend, and may well accord him a share of the credit even for Operations Enduring Freedom and Iraqi Freedom. It was Reagan who restored the sinews of American military might after Vietnam, refashioned American attitudes about military power, and began reorienting the Pentagon toward the Islamic world, thereby making possible the far-flung campaigns to overthrow the Taliban and remove Saddam Hussein. George W. Bush pulled the trigger, but Ronald Reagan had cocked the weapon.

Those who view World War IV as either sinister in its motivation or misguided in its conception will include Reagan in their bill of indictment. From their perspective, it was he who seduced his fellow citizens with promises of material abundance without limit. It was Reagan who made the fusion of military strength with American exceptionalism the centerpiece of his efforts to revive national self-confidence. It was Reagan’s enthusiastic support of Afghan “freedom fighters”—an eminently defensible position in the context of World War III—that produced not freedom but a Central Asian power vacuum, Afghanistan becoming a cesspool of Islamic radicalism and a safe haven for America’s chief adversary in World War IV. Finally, it was Reagan’s inconclusive forays in and around the Persian Gulf that paved the way for still-larger, if equally inconclusive, interventions to come.

Throughout the first phase of World War IV, from 1980 to 1990, the United States viewed Iran as its main problem and even toyed with the idea that Iraq might be part of a solution. Washington saw Saddam Hussein as someone with whom it might make common cause against the mullahs in Tehran. During the second phase of World War IV, extending through the 1990s, Iraq supplanted Iran as the main U.S. adversary, and policymakers came to see the Iraqi dictator as their chief nemesis.

Various and sundry exertions ensued, but as the U.S. military profile in the region became ever more prominent, the difficulties with which the United States felt obliged to contend also multiplied. Indeed, instead of eliminating Saddam, the growing reliance on military power served only to rouse greater antagonism toward the United States. Actions taken to enhance Persian Gulf stability—more or less synonymous with guaranteeing the safety and survival of the Saudi royal family—instead produced instability.

Phase two of the war began in August 1990, when Saddam Hussein’s army overran Kuwait. From the U.S. perspective, Saddam’s aim was clear. He sought to achieve regional hegemony and to control, either directly or indirectly, the preponderant part of the Persian Gulf’s oil wealth. Were Saddam to achieve those objectives, there was every likelihood that in due course he would turn on Israel.

So after only the briefest hesitation, the administration of George H. W. Bush mounted a forthright response. At the head of a large international coalition, the nation marched off to war, and U.S. forces handily ejected the Iraqi occupiers and restored the Al-Sabah family to its throne. (Bowing to American pressure, Israel stayed on the sidelines.) Its assigned mission accomplished, the officer corps, led by Colin Powell, had little interest in pressing its luck. The American army was eager to scoop up its winnings and go home.

The elder President Bush dearly hoped that Operation Desert Storm might become a great historical watershed, laying the basis for a more law-abiding international system. In fact, the war turned out to be both less and more than he had anticipated. No new world order emerged from the demonstration of American military prowess, but the war saddled the United States with new obligations from which came yet more headaches and complications.

Saddam survived in power by brutally suppressing those whom the Bush administration had urged to rise up in opposition to the dictator. After first averting its eyes from the fate of the Iraqi Shiites and Kurds, the administration eventually found itself shamed into action. To protect the Kurds (and to prevent Kurdish refugees from triggering a military response by neighboring Turkey, a key U.S. ally), Bush sent U.S. forces into northern Iraq. To limit Saddam’s ability to use his army as an instrument of repression, the Bush administration, with British support, declared the existence of “no-fly zones” across much of northern and southern Iraq. In April 1991, Anglo-American air forces began routine combat patrols of Iraqi airspace, a mission that continued without interruption for the next 12 years. During his final weeks in office, Bush initiated the practice of launching punitive air strikes against Iraqi military targets.

Thus, in the year that followed what had appeared to be a decisive victory in Operation Desert Storm, the United States transitioned willy-nilly to a policy that seemed anything but decisive. As a result of that policy, which the Bush administration called “containment,” the presence of substantial U.S. forces in Saudi Arabia and elsewhere in the Persian Gulf, initially conceived as temporary, became permanent. A contingent of approximately 25,000 U.S. troops remained after Desert Storm as a Persian Gulf constabulary—or, from the perspective of many Arabs, as an occupying army of infidels. As a second result of the policy, the United States fell into the habit of routinely employing force to punish the Iraqi regime. What U.S. policymakers called containment was really an open-ended quasi-war.

This new policy of containment-with-bombs formed just one part of the legacy that President Bush bequeathed to his successor, Bill Clinton. That legacy had two additional elements. The first was Somalia, the impoverished, chaotic, famine-stricken Islamic “failed state” into which Bush sent U.S. forces after his defeat in the November 1992 elections. Bush described the U.S. mission as humanitarian, and promised to have American troops out of the country by the time he left office. But when Clinton became president, the troops remained in place. The second element of the legacy Clinton inherited was the so-called peace process, Bush’s post–Desert Storm initiative aimed at persuading the Arab world once and for all to accept Israel.

President Clinton was unable to extract from this ambiguous legacy much of tangible value, though not for want of trying. During his eight years in office, he clung to the Bush policy of containing Iraq while ratcheting up the frequency with which the United States used violence to enforce that policy. Indeed, during the two final years of his presidency, the United States bombed Iraq on almost a daily basis. The campaign was largely ignored by the media, and thus aptly dubbed by one observer “Operation Desert Yawn.”

In the summer of 1993, Clinton had also ratcheted up the U.S. military commitment in Somalia. The results proved disastrous. After the famous Mogadishu firefight of October 1993, Clinton quickly threw in the towel, tacitly accepting defeat at the hands of Islamic fighters. Somalia per se mattered little. Somalia as a battlefield of World War IV mattered quite a bit. The speedy U.S. withdrawal after Mogadishu affirmed to many the apparent lesson of Beirut a decade earlier: Americans lacked the stomach for real fighting; if seriously challenged, they would fold. That was certainly the lesson Osama bin Laden drew. In his August 1996 fatwa against the United States, he cited the failure of U.S. policy in Lebanon as evidence of America’s “false courage,” and he found in Somalia proof of U.S. “impotence and weaknesses.” When “tens of your soldiers were killed in minor battles and one American pilot was dragged in the streets of Mogadishu,” crowed the leader of Al Qaeda, “you left the area, carrying disappointment, humiliation, defeat, and your dead with you.”

From Mogadishu onward, the momentum shifted inexorably in favor of those contesting American efforts to dominate the gulf. For the balance of the Clinton era, the United States found itself in a reactive posture, and it sustained a series of minor but painful and painfully embarrassing setbacks: the bombing of SANG headquarters in Riyadh in November 1995; an attack on the U.S. military barracks at Khobar Towers in Dhahran in June 1996; simultaneous attacks on U.S. embassies in Kenya and Tanzania in August 1998; and the near-sinking of an American warship, the USS Cole, during a port call at Aden in October 2000.

To each of these in turn, the Clinton administration promised a prompt, decisive response, but such responses as actually materialized proved innocuous. The low point came in late August 1998, after the African embassy bombings. With the United States combating what Bill Clinton referred to as “the bin Laden network,” the president ordered cruise missile strikes against a handful of primitive training camps in Afghanistan. For good measure, he included as an additional target a Sudanese pharmaceutical factory allegedly involved in the production of chemical weapons. Unfortunately for Clinton, the training camps turned out to be mostly empty, while subsequent investigation cast doubt on whether the factory in Khartoum had ever housed any nefarious activity. Although the president spoke grimly of a “long, ongoing struggle between freedom and fanaticism,” and vowed that the United States was “prepared to do all that we can for as long as we must,” the operation, given the code name Infinite Reach, accomplished next to nothing, and was over almost as soon as it began. The disparity between words and actions—between the operation’s grandiose name and its trivial impact—spoke volumes. In truth, no one in the Clinton White House had a clear conception of what the United States needed to do—or to whom.

Finally, despite Clinton’s energetic and admirable contributions, the peace process failed to yield peace. Instead, the collapse of that process at Camp David in 2000 gave rise to a new cycle of Palestinian terrorist attacks and Israeli reprisals. An alienated Arab world convinced itself that the United States and Israel were conspiring to humiliate and oppress Muslims. Just as the Israeli Defense Forces occupied Gaza and the West Bank, so too did the U.S. military seemingly intend to occupy the Middle East as a whole. In Arab eyes, the presence of U.S. troops amounted to “a new American colonialism,” an expression of a larger effort to “seek control over Arab political and economic affairs.” And just as Israel appeared callous in its treatment of the Palestinians, so too did the United States seem callous in its attitude toward Iraqis by persisting in a policy of sanctions that put the burden of punishment not on Saddam Hussein but on the Iraqi people.

The end of the 1980s had found the Reagan administration engaged in a far-reaching contest for control of the Middle East, a de facto war whose existence Reagan himself either could not see or was unwilling to acknowledge. Ten years later, events ought to have removed any doubt as to whether the circumstances facing the United States qualified as a war, but the Clinton administration’s insistence on describing the adversary as disembodied “terrorists” robbed those events of any coherent political context. In the manner of his immediate predecessors, Clinton refused to concede that the violence directed against the United States might stem from some plausible (which is not to imply justifiable) motivation—even as Osama bin Laden outlined his intentions with impressive clarity. In his 1996 declaration of jihad, for example, bin Laden identified his objectives: to overthrow the corrupt Saudi regime that had become a tool of the “Zionist-Crusader alliance,” to expel the infidels from the land of the Two Holy Places, and to ensure the worldwide triumph of Islam. But his immediate aim was more limited: to destroy the compact forged by President Roosevelt and King Ibn Saud. A perfectly logical first step toward that end was to orchestrate a campaign of terror against the United States.

For Clinton to acknowledge bin Laden’s agenda was to acknowledge as well that opposition to the U.S. presence in and around the Persian Gulf had a history, and that, like all history, it was fraught with ambiguity. In the Persian Gulf, the United States had behaved just like any other nation, even as it proclaimed itself democracy’s greatest friend. For decades it had single-mindedly pursued its own interests, with only occasional regard for how its actions affected others. Expediency dictated that American policymakers avert their eyes from the fact that throughout much of the Islamic world the United States had aligned itself with regimes that were arbitrary, corrupt, and oppressive. The underside of American exceptionalism lay exposed.

In the annals of statecraft, U.S. policy in the Persian Gulf from FDR through Clinton did not qualify as having been notably harsh or irresponsible, but neither had it been particularly wise or enlightened. Bin Laden’s campaign, however contemptible, and more general opposition to U.S. ambitions in the greater Middle East, developed at least in part as a response to earlier U.S. policies and actions, in which lofty ideals and high moral purpose seldom figured. The United States cannot be held culpable for the maladies that today find expression in violent Islamic radicalism. But neither can the United States absolve itself of any and all responsibility for the conditions that have exacerbated those maladies. After several decades of acting as the preeminent power in the Persian Gulf, America did not arrive at the end of the 20th century with clean hands.

Years before 9/11, bin Laden understood that World War IV had been fully joined, and he seems to have rejoiced in the prospect of a fight to the finish. Even as they engaged in an array of military activities intended to deflect threats to U.S. control of the Persian Gulf and its environs, a succession of American presidents persisted in pretending otherwise. For them, World War IV remained a furtive enterprise.

Unlike Franklin Roosevelt, who had deceived the American people but who understood long before December 7, 1941, that he was steadily moving the United States toward direct engagement in a monumental struggle, the lesser statesmen who inhabited the Oval Office during the 1980s and 1990s, in weaving their deceptions, managed only to confuse themselves. Despite endless assertions that the United States sought only peace, Presidents Reagan, Bush, and Clinton were each in fact waging war. But a coherent strategy for bringing the war to a successful conclusion eluded them.

Even as it flung about bombs and missiles with abandon, the United States seemed to dither throughout the 1990s, whereas bin Laden, playing a weak hand, played it with considerable skill. In the course of the decade, World War IV became bigger and the costs mounted, but its resolution was more distant than ever. The Bush and Clinton administrations used force in the Middle East not so much as an extension of policy but as a way of distracting attention from the contradictions that riddled U.S. policy. Bombing something—at times, almost anything—became a convenient way of keeping up appearances. Thus, despite (or perhaps because of) the military hyperactivity of the two administrations, the overall U.S. position deteriorated even further during World War IV’s second phase.

George W. Bush inherited this deteriorating situation when he became president in January 2001. Bush may or may not have brought into office a determination to finish off Saddam Hussein at the first available opportunity, but he most assuredly did not bring with him a comprehensive, ready-made conception of how to deal with the incongruities that plagued U.S. policy in the greater Middle East. For its first eight months in office, the second Bush administration essentially marked time. Apart from some politically inspired grandstanding—shunning an international agreement to slow global warming, talking tough on North Korea, accelerating plans to field ballistic missile defenses—Bush’s foreign policy before 9/11 hewed closely to the lines laid down by his predecessor. Although Republicans had spent the previous eight years lambasting Clinton for being weak and feckless, their own approach to World War IV, initially at least, amounted to more of the same.

Osama bin Laden chose this moment to begin the war’s third phase. His direct assault on the United States left thousands dead, wreaked havoc with the American economy, and exposed the acute vulnerabilities of the world’s sole superpower.

President Bush’s spontaneous response to the events of 9/11 was to see them not as vile crimes but as acts of war. In so doing, he openly acknowledged the existence of the conflict in which the United States had been engaged for the previous 20 years. World War IV became the centerpiece of the Bush presidency, although the formulation preferred by members of his administration was “the global war on terror.”

When committing the United States to large-scale armed conflict, presidents have traditionally evinced a strong preference for explaining the stakes in terms of ideology, thereby distracting attention from geopolitics. Americans ostensibly fight for universal values rather than sordid self-interest. Thus, Franklin Roosevelt cast the war against Japan as a contest that pitted democracy against imperialism. The Pacific war was indeed that, but it was also a war fought to determine the future of East Asia, with both Japan and the United States seeing China as the main prize. Harry Truman and his successors characterized the Cold War as a struggle between a free world and a totalitarian one. Again, the war was that, but it was also a competition to determine which of two superpowers would enjoy preponderant influence in Western Europe, with both the Soviet Union and the United States viewing Germany as the nexus of conflict.

During its preliminary phases—from January 1980 to September 2001—World War IV departed from this pattern. Regardless of who happened to be occupying the Oval Office, universal values did not figure prominently in the formulation and articulation of U.S. policy in the Persian Gulf. Geopolitics routinely trumped values in the war. Everyone knew that the dominant issue was oil, with Saudi Arabia understood to be the crown jewel. Only after 9/11 did values emerge as the ostensible driving force behind U.S. efforts in the region—indeed, throughout the greater Middle East. On September 11, 2001, World War IV became, like each of its predecessors, a war for “freedom.” To this theme President George W. Bush has returned time and again.

In fact, President Bush’s epiphany was itself a smoke screen. His conversion to the church of Woodrow Wilson left substantive U.S. objectives in World War IV unaltered. Using armed might to secure American preeminence across the region, especially in the oil-rich Persian Gulf, remained the essence of U.S. policy. What changed after 9/11 was that the Bush administration was willing to pull out all the stops in its determination to impose America’s will on the greater Middle East.

In that regard, the administration’s invasion of Iraq in March 2003 can be said to possess a certain bizarre logic. As part of a larger campaign to bring the perpetrators of 9/11 to justice, Operation Iraqi Freedom made no sense at all and was probably counterproductive. Yet as the initial gambit of an effort to transform the entire region through the use of superior military power, it not only made sense but also held out the prospect of finally resolving the incongruities bedeviling U.S. policy. Iraq was the “tactical pivot”—not an end in itself but a way station. “With Saddam gone,” former counter-terrorism official Richard Clarke has written in Against All Enemies (2004), “the U.S. could reduce its dependence on Saudi Arabia, could pull its forces out of the Kingdom, and could open up an alternative source of oil.”

Pulling U.S. forces out of Saudi Arabia did not imply removing them from the region; a continuing American troop presence was necessary to guarantee U.S. access to energy reserves. But having demonstrated its ability to oust recalcitrants, having established a mighty striking force in the center of the Persian Gulf, and having reduced its susceptibility to the oil weapon, the United States would be well positioned to create a new political order in the region, incorporating values such as freedom, democracy, and equality for women. A Middle East pacified, brought into compliance with American ideological norms, and policed by American soldiers could be counted on to produce plentiful supplies of oil and to accept the presence of a Jewish state in its midst. “In transforming Iraq,” one senior Bush administration official confidently predicted, “we will take a significant step in the direction of the longer-term need to transform the region as a whole.”

Bush and his inner circle conceived of this as a great crusade, and, at its unveiling, a clear majority of citizens also judged the preposterous enterprise to be justifiable, feasible, and indeed necessary. At least two factors help to explain their apparent gullibility.

The first is self-induced historical amnesia. Shortly after 9/11, Deputy Secretary of State Richard Armitage growled that “history starts today.” His sentiment suffused the Bush administration and was widely shared among the American people. The grievous losses suffered in the attacks on the World Trade Center and the Pentagon had rendered irrelevant all that went before—hence the notable absence of interest among Americans in how the modern Middle East had come into existence, or in the role the United States had played since World War II in its evolution. The events of 9/11 wiped the slate clean, and on this clean slate the Bush administration, in quintessential American fashion, fancied that it could begin the history of the greater Middle East all over again.

There is a second explanation for this extraordinary confidence in America’s ability to reorder nations according to its own preferences. The progressive militarization of U.S. policy since Vietnam—especially U.S. policy as it related to the Middle East—had acquired a momentum to which the events of 9/11 only added. The aura that by 2001 had come to suffuse American attitudes toward war, soldiers, and military institutions had dulled the capacity of the American people to think critically about the actual limits of military power. And nowhere had those attitudes gained a deeper lodgment than in the upper echelons of the younger Bush’s administration. The experiences of the previous 30 years had thoroughly militarized the individuals to whom the president turned in shaping his global war on terror, formulating grand statements, such as his National Security Strategy of the United States of America, and planning campaigns, such as the invasions of Afghanistan and Iraq. Theirs was a vision, writes James Mann in The Rise of the Vulcans (2004), of “a United States whose military power was so awesome that it no longer needed to make compromises or accommodations (unless it chose to do so) with any other nation or groups of countries.”

As the epigraph to his book Why We Were in Vietnam (1982), Norman Podhoretz chose a quotation from Bismarck: “Woe to the statesman whose reasons for entering a war do not appear so plausible at its end as at its beginning.” For the architects of the global war on terror—George W. Bush, Dick Cheney, Donald Rumsfeld, Condoleezza Rice, and Paul Wolfowitz—it’s too late to heed the Iron Chancellor’s warning. But the outsized conflict that is their principal handiwork continues.

As this is written, the outcome of World War IV hangs very much in the balance. American shortsightedness played a large role in creating this war, and American hubris has complicated it unnecessarily, emboldening the enemy, alienating old allies, and bringing U.S. forces close to exhaustion. Yet like it or not, Americans are now stuck with their misbegotten crusade. God forbid that the United States should fail, allowing the likes of Osama bin Laden and his henchmen to decide the future of the Islamic world.

But even if the United States ultimately prevails, the prospects for the future will be no less discouraging. On the far side of World War IV, a time we are not now given to see, there wait others who will not readily concede to the United States the prerogatives and the dominion that Americans have come to expect as their due. The ensuing collision between American requirements and a noncompliant world will provide the impetus for more crusades. Each will be justified in terms of ideals rather than interests, but the sum of them may well doom the United States to fight perpetual wars in a vain effort to satisfy our craving for limitless freedom.