QUESTION OF THE DAY: Did 9/11 change America for better or worse?
The terror attacks on Sept. 11, 2001, changed the United States forever.
In the aftermath of the attacks, the country seemed united, even though that unity wouldn't last. The attacks also ushered in security enhancements that have helped prevent another large-scale attack on U.S. soil.
But 9/11 also led to the War on Terror, including a 20-year conflict in Afghanistan that ended with the Taliban retaking the country. Many also blame the aftermath of 9/11 for sowing the seeds of our current political divisions and militarizing the police.
Did 9/11 change our country for better or worse? Vote in the poll below.