The End of the West as We Know It

For the past 500 years or so, world politics has mostly been driven by the actions and priorities of the transatlantic powers (aka "the West"). This era began with the development of European colonial empires, which eventually carved up most of the globe, spread ideas like Christianity, nationalism, and democracy, and created many of the state boundaries that still exist today. (They also screwed a lot of things up in the process.) Although other actors (e.g., Japan) played significant roles too, especially after 1945, the transatlantic community (broadly defined) has been the most important set of players for centuries.

Europe's decline after World War II was immediately followed by the era of American liberal internationalism. With NATO and Japan as junior partners, the United States underwrote a variety of global institutions (mostly of its own making), maintained a vast array of military bases, waged and won a Cold War, and sought, with varying degrees of enthusiasm and success, to spread core "Western" values and institutions to different parts of the world.
