
When did the American empire start to decline?


Stephen Walt wonders when it was that the U.S. empire started to decline. His answer: the first Gulf War. Here's the rationale:

Unfortunately, the smashing victory in the first Gulf War also set in train an unfortunate series of subsequent events. For starters, Saddam Hussein was now firmly identified as the World's Worst Human Being, even though the United States had been happy to back him during the Iran-Iraq War in the 1980s. More importantly, the war left the United States committed to enforcing "no-fly zones" in northern and southern Iraq.

But even worse, the Clinton administration entered office in 1993 and proceeded to adopt a strategy of "dual containment." Until that moment, the United States had acted as an "offshore balancer" in the Persian Gulf, and we had carefully refrained from deploying large air or ground force units there on a permanent basis.

I think that if we're going to pin the blame for a deepening U.S. role in the Middle East on anything, it wouldn't be the Gulf War but the Carter Doctrine - that was what put the U.S. on the path toward an interventionist posture in the region. The Gulf War and the dual containment that followed were in many ways the logical heirs to that doctrine.

But I'm not convinced that the Gulf War is really responsible, per se, for U.S. decline, mostly because "decline" is a relative phenomenon (although we certainly haven't helped ourselves of late). That being the case, I'd argue that Deng Xiaoping's market-oriented reforms in China, which kicked off three decades of economic growth, have probably played a much more significant role in narrowing the power gap than America's post-Gulf War blunders.
