Did 9/11 Change the United States?


On Sept. 12, 2001, Americans awoke to a world that appeared forever altered. The morning before, the United States had been attacked for the first time on its own soil since Pearl Harbor. Within days, U.S. President George W. Bush would declare a “war on terror.” Analysts quickly made dramatic predictions about how the United States would change as a result, from an expanded security state to radicalization within the country to the end of irony. Some pundits turned out to be correct; others, woefully off base.



The 9/11 era is in the rearview mirror: In the last 20 years, a generation has grown up with only a collective memory of the attacks, and the United States has now withdrawn from Afghanistan. But some shifts were permanent. Foreign Policy asked seven of our columnists and contributors to weigh in on how 9/11 reshaped U.S. foreign and domestic policy—and what it means for the future.


The U.S. relationship with the Arab and Muslim world will never be the same.


By Mina Al-Oraibi, FP columnist and the editor in chief of The National


The 9/11 attacks forever changed the U.S. relationship with the Arab and Muslim world and have defined it for the past two decades. The terrible events of Sept. 11, 2001, shifted relationships once based on energy security, bilateral interests, and the maintenance of Israel's military supremacy and made them largely about the goal of countering Islamist terrorism.


In the second half of the 20th century, U.S. alliances with Arab and Muslim-majority countries were based on whether they fell under U.S. or Soviet influence. After 9/11, U.S. policy toward the Arab and Muslim world became based on the principle of guilty until proven innocent.