My first memory of that day is of the sky. It was clear and bright, and as I walked along de Terrebonne Street to Concordia University’s Loyola campus, I marveled at the deepness of the blue.
I could still see the sky outside the windows of the computer lab in the campus’s Central Building, where I taught what we then called Computer-Assisted Reporting. I was discussing the vast efflorescence of Internet news sites, and how to distinguish reliable from unreliable sources. My students dutifully followed along at their Blueberry iMacs.
About fifteen minutes into the class, one student in the third row, slightly to my left, looked up and said, “A plane just crashed into the World Trade Center.” All eyes turned to the screens, where we saw smoke billowing from the WTC’s North Tower in real time on every Internet news site.
It must have been an accident, I thought, like when USAAF Lieutenant Colonel William F. Smith crashed his B-25 Mitchell bomber into the Empire State Building in the waning days of the Second World War. The building stood so high, and Lt. Col. Smith had been disoriented by a thick fog. But that did not make sense; 11 September 2001 was a beautiful, clear, sunny day.
Eager to take advantage of a rare “teaching moment,” I tossed my lecture notes aside and directed my students to “get to the bottom” of the emerging news story. Minutes later, a second plane plunged into the South Tower. This was no accident; the full enormity of that morning suddenly became clear.
Time seemed suspended as I looked out at my class of eager, young journalism students. I saw fear, and horror, and incomprehension, and I realized, as the growing sounds of commotion in the corridor leaked through the computer lab doors, that I had to say something. I gathered myself and said, in what I remember to be an uncharacteristically soft voice (not my “teacher voice”): “Always remember this moment. The world has changed.” I dismissed my class.
I plunged into the chaos of the corridor, where students in every other class had been let go at almost the same moment. My colleague Linda Kay, a Pulitzer Prize-winning reporter who had been teaching the feature-writing workshop in the classroom next to mine, stood at the base of the steps leading out the building’s door into the bright sunshine. Linda was what they called a “tough cookie” when she had begun her journalistic career in the 1970s, but I had never seen her so shaken.
“What just happened?” she asked rhetorically, knowing as well as I did what had happened. I said nothing. Linda was a native New Yorker, and you could still hear the tones of Flatbush Avenue in her voice after two decades in Montreal. “It’s so unreal…” she said. I said something, doubtless insufficient to the moment, which I cannot recall today; we hugged and went our separate ways out the door into a changed world. There would be no more classes that day.
I headed home, where I watched the flames belching from the twin towers before they collapsed in smoke and debris, repeated in an endless video loop on cable news. I called Alan and we said very little, sharing a shocked silence for about 15 minutes. I called friends in New York, and my body turned cold when I could not reach Ron, my friend and editor at NetGuide magazine, for most of the day.
As I had walked home along de Terrebonne Street, all I could think about was that it was an excruciatingly beautiful late-summer day, and I marveled at the deep blue of the sky. It was, the performance artist Karen Finley would later reflect, “such a beautiful day for a tragedy.”
***
11 September 2001 is an abrupt caesura in history, a temporal rupture dividing all that came after from all that came before, even more profoundly than the assassination of President John F. Kennedy. The world changed; but the ways that it changed were unexpected and, it seems, irrevocable.
“How close were you?” the question repeated ad infinitum to anyone who was in the city that day. “Do we have to make this story sadder?” Finley asked, channeling the persona of the archetypal New Yorker in a performance of Make Love that I saw at a small theater just off Times Square seven years after the WTC attack. “How close were you?” The show marked a break in Finley’s work as jarring as the day it tries to make sense of, in its rubble-strewn aftermath:
Somewhere here is a message.
Somewhere here is a metaphor.
Somewhere here is a cosmic web that makes sense of a terrible time.
What sense there was, was fear. “The real true fear after the attack is repressed fear,” Finley intoned from the stage, “personal private fear emotionally reawakened or generational emotional fear legacies finally manifested with this horror.”
It can be difficult today to fully recall what it was like to live before 9/11, the casual, graceful sense of security that suffused the brief, bright summer of the decade following the end of the Cold War. It was all a myth, of course: peace and security both inaugurated and supported by the vast expansion of consumer capitalism. There was much to fear at the end of the 20th century, from dire warnings of impending environmental disaster, to rising drumbeats of white nationalist hate on the Internet, to the very neoliberal free market – what the economist Joseph Stiglitz would ruefully call “that sacred article of faith” – that sponsored our consumer paradise, but at a cost.
There were persistent warnings of violence throughout the decade leading up to 9/11. In 1993, Islamist terrorists detonated a car bomb in the parking garage of the World Trade Center, killing six and injuring hundreds; months later, 79 people died in the botched government siege of a Christian extremist compound in Waco, Texas, which, together with the debacle at Ruby Ridge the previous year, mobilized a generation of anti-government fanatics in the United States. One of them, Timothy McVeigh, murdered 168 people in Oklahoma City in 1995.
Less than a year before 9/11, two suicide bombers blew a hole in the hull of the USS Cole off the coast of Yemen. The New York Times reported that the FBI was keeping open an alternate possibility “that the attack may have involved powerful figures inside Yemen with close ties to Osama bin Laden,” the mastermind of the 1998 embassy bombings in Nairobi and Dar es Salaam. The name of Al-Qaeda had first entered American consciousness in the reporting of those attacks but, in 2000, it had still not become an incantation of terror.
These were discrete, isolated events in our shared consciousness, not threads woven together in a persistent tapestry of fear. The terrorists of our cultural imagination spoke more often in the Irish brogue of an IRA gunman (Patriot Games, The Devil’s Own), or in the Eastern European accent of former communist agents (Air Force One, Die Hard With a Vengeance) interrupting the peace of the post-Cold War, post-ideology world. They served to remind us of the blessings of late-20th-century free-market bounty and the myths of global geopolitical tranquility. When Muslims appeared at all they were, often enough, our heroic Kuwaiti allies or noble refugees grateful for our protection from Saddam Hussein’s henchmen (Three Kings).
Our racism has always been there – remember that McVeigh was able initially to slip through the law enforcement dragnet because the FBI and police automatically assumed that the Oklahoma City bomber was a Muslim terrorist – but we had not yet begun to live in a state of persistent, atavistic fear of the Islamic other. Security checkpoints, body scans, and 100 ml bottles had not yet become the constant reminder that we must be afraid – “generational emotional fear legacies finally manifested.”
***
The uncanny alterity of that time is reinforced in my memory by the absence of social media. We didn’t send tweets and Facebook messages on that beautiful sunny morning. Our connections to each other were point-to-point – frenzied telephone calls or, less often, emails and AOL instant messages that we knew would only reach a computer on someone’s desk. We obsessively watched the 24-hour news cycle, limited to whatever sources came through to our living rooms, carried by cable television providers.
After the initial, abrupt shock, the story unfolded at a pace that most of us today would find leisurely. CNN, CBC Newsworld, MSNBC, CTV Newsnet, Fox News, and all the rest filled the hours with that endless loop of fire and smoke, punctuated with pious pronouncements and President George W. Bush’s address to the nation and the world later that night. Riding my bike the following day to find some escape on the Lakeshore Road near Montreal’s Dorval International Airport, I found the absence of aircraft sounds oppressive. All North American air traffic had been grounded, and the world seemed suspended in nothingness.
The story came out slowly and, along with it, the true depth of the dark transformation of our lives. On 12 September, President Bush declared that his government would “make no distinction between the terrorists who committed these acts and those who harbor them.” Although the New York Times reported that “Afghanistan and administration officials insisted there was no hard evidence to connect Mr. bin Laden to today’s attacks,” or to Afghanistan, the President demanded that the Taliban turn over the Al-Qaeda leader to American justice.
It is easy to forget, after two decades of war, that the Taliban rulers of Afghanistan had said that they would be willing to extradite Osama bin Laden if the United States could provide evidence that he was responsible for the attacks or that, a few days later, the country’s one thousand mullahs issued a joint fatwa expressing grief for the deaths in the attacks and their determination to press the government to convince the Al-Qaeda leader to give himself up. We might speculate on their sincerity, but President Bush had issued an ultimatum, and Americans were in no mood to wait even days.
The president did not want to look weak; he could not afford to. Congress had passed the Authorization for Use of Military Force, with one dissenting vote, on 14 September, and it was signed into law on the very day that Afghanistan’s mullahs had met. He dared not squander the upswelling of support from the United States’ allies as he assembled a military coalition for the crusade against Afghanistan, the Taliban and, almost incidentally, Osama bin Laden and Al-Qaeda. By the last week of September, American and British forces had “boots on the ground”; the crusade began in earnest on Sunday, 7 October, with airstrikes and cruise missiles.
I had seen the rush to war before – in 1991, when the first President George Bush led a multinational coalition backed by the authority of the United Nations to defend the perquisites of the Kuwaiti royal family – and it made me queasy then. But I had never really seen war fever, “the wild intoxication that mingled alcohol with the joy of self-sacrifice, a desire for adventure and sheer credulity, the old magic of the banners and patriotic speeches,” that Stefan Zweig wrote of 1914, “an uncanny frenzy that eludes verbal description but is capable of affecting millions…”
Thousands rushed to the colors in the fall of 2001 amid rhetoric that seemed like the surreal cosplay of Zweig’s war. In the fall of 1914, weeks after sending her sons to the front, the German artist Käthe Kollwitz remarked on a patriotic article she had read in the newspaper. “She spoke of the joy of sacrificing – a phrase that struck me hard. Where do all the women who have watched so carefully over the lives of their loved ones get the heroism to face the cannon? I am afraid that this soaring of the spirit will be followed by the blackest despair and dejection.”
The Washington Post sounded an eerily similar note in its 13 September editorial. The War on Terror, the newspaper noted, “will require a major realignment of resources and priorities by the Bush administration, bipartisan support in Congress, and a national commitment by a society prepared to make sacrifices.” Four months later, in his State of the Union Address, President Bush made that vision official. “In the sacrifice of soldiers, the fierce brotherhood of firefighters, and the bravery and generosity of ordinary citizens,” he declared, “we have glimpsed what a new culture of responsibility could look like.”
The “new culture,” whether or not it was a culture of responsibility, was rhetorically reproduced millions of times over with each repetition of “thank you for your service” when a civilian encountered a serving soldier or veteran of pretty much any of America’s wars. Service in uniform became the only service worth celebrating; we pay lip service to the medical professionals at the front lines of the pandemic crisis, but we offer priority service and fawning tribute to men and women in uniform. “The soldier’s coat is represented as the most distinguished of all coats,” Karl Liebknecht wrote in 1917. “The soldier’s honor is lauded as being of special excellence, and the soldier’s status is trumpeted forth as the most important and distinguished and is indeed endowed with many privileges.”
Military service and, above all, military sacrifice have become the essential qualification to have a voice in the post-9/11 republic. It has been this way before; both Kennedy’s and Richard Nixon’s early political careers were built on the foundation of their service in the US Navy during World War II. But 14 million Americans served in that war and, even if service was not universal, that nonetheless represented 10 percent of the population. In contrast, since 2001, some 2.7 million Americans, or one percent of the population, have served on active deployment. A military record is comparatively rare and exotic and, in our new militaristic age, it has become a ticket to the front of the line.
It was not the incisiveness of their ideas, nor the content or quality of their words that brought Khizr and Ghazala Khan to the rostrum of the 2016 Democratic National Convention, but the death of their son while serving in the uniform of the United States Army. As different as they are, neither Pete Buttigieg nor Tom Cotton could have dreamed of national political prominence so soon – the former as a presidential candidate, and the latter as a senator widely tipped as a presidential possibility in the near future – without their military service. Americans have learned to respect the uniform above all, and the uniform opens doors.
Two decades later, our enthusiasm for all things military has not faded; camouflage is a fashion statement, the AR-15 is a symbol of goose-stepping patriotism, our sports teams celebrate Armed Forces Day events… and woe betide any public figure who fails to fall in line and “respect the troops.” Showing anything less than perfect deference to the flag – now explicitly signifying “the troops” and military glory – invites widespread condemnation, social death, and career ruin.
***
The American militarism that emerged and was subsequently institutionalized after 2001 relies on a culture of constant, inchoate, existential terror cultivated in our media and in every one of our interactions with state authority. I remember waiting in the Staten Island Ferry Terminal at the tip of Manhattan in 2005, looking up at a large billboard over the gate displaying the current “threat level.” The Bush administration had implemented the color-coded Homeland Security Advisory System three years earlier, but I could not understand why I needed to know, as I embarked for an Italian dinner, that the threat level was yellow: “Significant risk of terrorist attacks.”
We needed to know because the new America demanded that we remain afraid. “If you see something, say something,” the posters exhorted from the subway car walls, and the stations were invariably patrolled by grim-faced men and women in battle dress, armed with assault rifles. Passing through security checkpoints in airports, at the entrances of government buildings, theaters, museums, and libraries, and often even when boarding public transit became a daily fact of life.
The “shoe bomber,” the “underwear bomber,” and the 2006 liquid explosives plot at London’s Heathrow Airport justified increasingly restrictive and invasive procedures at airport security checkpoints, but they were hardly the reason for them. It might have been worth noting that all three attempts failed, and that implementing security measures after the fact could hardly have prevented them. But walking barefoot to a full-body security scanner where a TSA agent takes a close look at our genitals is a reminder of our collective insecurity and a potent catalyst of fear.
Constant, buzzing, pervasive fear has been the backdrop of American life for the last 20 years, perversely encouraging us to accept ever-greater intrusions by the security state’s surveillance apparatus, which in turn reminds us that we must be afraid. Walking around downtown Manhattan today or, for that matter, any major American city, we have grown accustomed to ignoring the police surveillance cameras and, increasingly, drones that have proliferated at every street corner and public gathering area. When we do notice them at all, they remind us to be afraid.
The expansion of the War on Terror to include an imperialist crusade in Iraq in 2003 met initially with less enthusiastic support. There were antiwar demonstrations around the world and in the United States, and even the usually bellicose New York Times opined that war with Iraq was “neither foreordained nor inevitable,” while at the same time noting that it was “a war for a legitimate international goal against an execrable enemy.”
Polls showed that only 57 percent of Americans favored war with Iraq in late January 2003, but then, in his State of the Union Address on 28 January, President Bush sounded a note of fear that won Americans over. “Today, the gravest danger in the war on terror, the gravest danger facing America and the world, is outlaw regimes that seek and possess nuclear, chemical and biological weapons,” he said. “These regimes could use such weapons for blackmail, terror and mass murder. They could also give or sell those weapons to terrorist allies, who would use them without the least hesitation.”
You could feel a chill of dread run through the American body politic. Administration officials had been flogging the myth of “weapons of mass destruction” to drum up international support for the war for many weeks, but President Bush had articulated it in language leveraging the still-fresh horror of the bright morning of 9/11. In the first polls after the president’s speech, 66 percent of Americans said that they approved of war with Iraq, and by the time the US Marine Expeditionary Force crossed the border from Kuwait, that number had risen to 75 percent.
Success bred enthusiasm, and the day after President Bush strode manfully across the flight deck of the USS Abraham Lincoln off the coast of California to announce “mission accomplished,” even the initially diffident Times editorial board had come around. “America’s armed forces performed courageously in Iraq, dislodging a brutal dictatorship in a swift, decisive campaign,” they wrote. “They deserve the nation’s thanks and a warm welcome home.”
The lesson many Americans took from Iraq was that success legitimizes all action and, ominously, that the ends always justify the means. Despite persistent doubts about weapons of mass destruction, and the threadbare justification of jus ad bellum that had deterred all but the closest US allies from joining President Bush’s “Coalition of the Willing,” the United States was the undisputed champion of the world. It is possible to note a dark shift in American culture from that point, a celebration of brutality and power for their own sakes.
We have always celebrated our military heroes in mass culture – indeed, Audie Murphy and Alvin York became the defining images of American courage following their wars. However, their heroism was always expressed – and received – in the language of honor and decency. Even the antiheroes of The Dirty Dozen were ultimately ennobled by their admittedly grudging commitment to a higher ideal. But what emerged from the endless War on Terror was something much different.
War has become mundane in our cultural imagination: survival devoid of ideals and values, a job that must be done for no other reason than the job itself. We praise the skill of a cold-blooded assassin in American Sniper, and celebrate torturers and war criminals in Zero Dark Thirty and Unthinkable because they get results and, at the end of the day, the murderers and criminals are our criminals. The morality of means has nothing to do with it.
In our brutalized culture, fear justifies everything; the threat is so terrifying that any means are reasonable. It is a lesson some of our political leaders have learned all too well.
***
It took a while – a few days, or a few weeks – for the keening bloodlust that came in the wake of 9/11 to make a full impression on my consciousness. There were no memes or ads filling my social media feed, only overheard snippets of comment and conversation at first, and then the boldfaced headlines of the news commentariat demanding retribution. “We should bomb those fuckers back to the stone age,” I overheard more than once, leading me to conclude that the phrase had initially appeared in the media. “They’re already in the stone age,” I heard someone reply, “let’s bomb them to hell.”
I was deeply troubled by the war but, even more than that, I was sickened by the shroud of unthinking hate and rage that had descended on the world. An editor at a major publication spiked a feature article that I had submitted on Palestinian human rights following the Durban Conference. “No one gives a shit about Arabs now,” he said. A few weeks after the Twin Towers fell, I attended a dinner party in the company of a well-known commentator for a major Canadian news organization whose bellicose posturing had burnished his celebrity.
I asked him, as a journalistic colleague, if he really meant all of that, or if it was all just a pose. “I have to tone it down for print,” he said, repeating the slogan about bombing Afghanistan back to the stone age. “The ragheads are fucking savages,” he said. “We should slaughter every last one of them.” The other guests nodded and voiced their approval; I protested weakly. The hosts never invited me back for dinner.
That kind of explicit, unfiltered bigotry became the keynote of the next two decades, resonating in the nativist rhetoric of populist politicians in the United States, Canada, Great Britain, France, and elsewhere, in our shameful collective disregard for refugees from the geopolitical instability the post-9/11 crusade unleashed, and in increasingly unrestrained public displays of hate.
The Eastern European and Irish terrorists mostly disappeared from popular culture, to be replaced by dark-eyed, dark-skinned Middle Easterners with names like Abu Nazir (Homeland), Raza (Iron Man), and Yusuf Atta Mohammed (Unthinkable). There have been other fanatics in popular culture over the last two decades, to be sure, including crazed North Koreans and even the occasional neo-Nazi. After 9/11, however, Mediterranean features and an Arabic accent have been deployed repeatedly, ad nauseam, as the preferred codes for “bad guy.” And those “bad guys” were most explicitly not us, but the ominous other external to the American body politic.
The day the World Trade Center towers fell, an aircraft mechanic in Mesa, Arizona, told his friends that he was “going to go out and shoot some towel-heads.” Four days later, he drove to a local gas station and murdered its owner, Balbir Singh Sodhi, a Sikh immigrant from India. Hate crimes against people of Middle Eastern and South Asian descent have become so common in our brutalized society that, when Isaiah Peoples drove his truck into a group of Indian teenagers on his way to Bible class in 2019, we were all shocked, but not surprised.
The narrative that emerged after 9/11 is that Muslims are not, and cannot be, Americans. Political demagogues and their enablers in the media have told us over and over, in the words of Oklahoma lawmaker John Bennett, that “Islam is not even a religion; it is a political system that uses a deity to advance its agenda of global conquest,” and many Americans have begun to believe it.
This kind of unthinking bigotry scuttled plans to build an Islamic cultural center at 51 Park Place in Manhattan a decade after 9/11. I remember the hordes of righteous protesters who flooded into New York, mobilized against what they called the “Ground Zero Mosque.” Muslim Americans, the racist firebrands and their acolytes claimed, had no right to build on “hallowed ground” – what was then, in fact, an economically stagnant neighborhood that was home to strip clubs and off-track betting locations. It would be comforting to believe that this denial of the legitimacy of Muslim citizenship is an artifact of a darker past, but one need only point to how Ilhan Omar’s headscarf and Rashida Tlaib’s name are today used to discredit their politics.
It has become almost a cliché to say that America lost its innocence when Lee Harvey Oswald murdered President Kennedy almost 60 years ago. The horrific events of that beautiful, sunny day in Dallas mark a decisive change in our collective memory, when the vulnerability of our political leaders and institutions became manifest, and the “brief, shining moment” of the Camelot myth was extinguished. After that, the narrative says, came Vietnam, Watts, and the tragic murders in Memphis and Los Angeles.
But we lost more on 9/11, and not just Americans but the world: we lost our humanity. What came out of that clear, blue sky was not merely a rupture in time, but the end of something that – after twenty years, unending violence, uncountable deaths, and a culture of normalized brutality – now seems irrevocable.