Europe Is Turning Into a Right-Wing Continent

For a certain kind of American liberal, it's almost a reflexive gesture to wish the United States were more like Europe. There, health care is provided on a more egalitarian basis and a university education is much cheaper, if not free; sexual mores are more relaxed and gun ownership is rare; religion is vestigial and militant nationalism is strictly taboo. Widespread European distress over the presidencies of George W. Bush and Donald Trump only confirmed what American liberals knew: that the Old Country was also the dreamland of their imagined liberal American future.
