1.
During the spring and summer of 2004 some Americans, most but not all of them nominal Democrats, spoke of the November 2 presidential election as the most important, or “crucial,” of their lifetimes. They told not only acquaintances but reporters and political opinion researchers that they had never been more “concerned,” more “uneasy,” more “discouraged,” even more “frightened” about the future of the United States. They expressed apprehension that the fragile threads that bound the republic had been stretched to the breaking point; that the nation’s very Constitution had been subverted for political advantage; that the mechanisms its citizens had created over two centuries to protect themselves from one another and from others had been in the first instance systematically dismantled and in the second sacrificed to an enthusiasm for bellicose fantasy. They downloaded news reports that seemed to make these points. They e-mailed newsletters and Web logs and speeches and Doonesbury strips to multiple recipients.
These Americans had passed the point of denying themselves broad strokes. They kept one another posted on loosened regulations benefiting previously obscure areas of the economy, for example snowmobile manufacture. They knew how many ringneck pheasants Vice President Cheney and his party had brought down during a morning’s stocked shoot at the Rolling Rock Club in Ligonier Township, Pennsylvania: 417, of the 500 released that morning. They collected the vitae of Bush family associates named on the Web site of New Bridge Strategies, LLC, “a unique company that was created specifically with the aim of assisting clients to evaluate and take advantage of business opportunities in the Middle East following the conclusion of the US-led war in Iraq.” They made Michael Moore’s Fahrenheit 9/11 the most commercially successful documentary ever distributed in the United States, earning in its domestic theatrical release $117.5 million. (By comparison, Moore’s 2002 Academy Award–winning Bowling for Columbine had earned less than $22 million.) They were said to be “energized,” “worked up,” motivated in a way they had not been even by Bush v. Gore, which had occurred at a time, nine months before the mandate offered by September 11, when it had still been possible to imagine the clouded outcome of the 2000 election as its saving feature, an assured deterrent to any who would exercise undue reach.
The July week in Boston of the Democratic National Convention, then, was for these citizens a critical moment, a chance to press their concerns upon the electorate, which had seemed during the month preceding the convention to be at least incrementally moving in their direction. By late June a Washington Post–ABC News poll had shown the President’s approval rating on the management of “the war against terrorism,” previously considered his assured ace in the hole, down thirteen points since April. In the same poll, the percentage of those who believed the war with Iraq “worth fighting” had dropped to 47 percent. By the week before the convention, the Los Angeles Times was reporting that its own polling showed that 54 percent of those questioned “say the nation is moving in the wrong direction,” and that “nearly three-fifths say the country should not ‘continue in the direction he [the President] set out’ and ‘needs to move in a new direction.'”
The Democratic nominee for president was nonetheless not a candidate with whom every Democrat who came to Boston could be entirely comfortable. Many of those impatient with what they saw as a self-defeating timidity in the way the party was presenting itself took refuge across the river in Cambridge, at “alternative” events improvised as the week went on by Robert Borosage of the Campaign for America’s Future. “Kerry, Kerry, quite contrary,” a group of young women calling themselves “Radical Cheerleaders” chanted outside Faneuil Hall. “So far right it’s kinda scary.” What troubled them most was not exactly reducible to “right” or “left.” There was the question of Senator Kerry’s vote authorizing the President to use force in Iraq, and the unknowable ratio of conviction to convenience that prompted it. There was his apparent inability to say that, “knowing what we know now,” he would not cast that vote again.
In fact he said that he would, an astonishing moment of political miscalculation, since if there was any consensus forming in the country it was to the point of Iraq having been a bad idea. There was his fairly blank-check endorsement, again raising the convenience question, of whatever Israeli Prime Minister Ariel Sharon chose to do with the occupied territories. There was a predilection for taking cover in largely hypothetical distinctions (he had voted not “for” the war but for giving the President the authority to go to war) that struck many as uncomfortably close to what the Bush campaign had been saying about him all through the spring. Even on the basic question of his “electability,” or performance as a campaigner, which seemed to many in Boston the only reason he was being nominated, there had been a certain uneasiness from the outset, notably about the temperamental defensiveness that left him uniquely vulnerable to the kind of schoolyard bullying that was his opponent’s default tactic.
Yet his acceptance speech was a forthright demonstration of his intention to run for the presidency on his own terms, by no means the unconnected series of music cues that the DNC commercials later making use of it would tend to suggest. He had already put forth a number of detailed domestic proposals, the most expensive of which was a $650 billion health care plan that would offer protection to 27 million uncovered citizens and relieve businesses of the risk of increased premiums by having the government assume the cost of catastrophic care. He had said that he would pay for this by rolling back the Bush tax cuts for those making over $200,000 a year. He had said that Americans would be free to buy prescription drugs from outside the United States, not an insignificant commitment in light of the $2.5 million the pharmaceutical industry was spending as a sponsor of the Democratic convention, but not an entirely significant one either: for the pharmaceutical industry and the six hundred lobbyists it maintained, $2.5 million was fractional, only a skim of the $29 million it had spent in the 2002 election cycle and the $11.5 million it had spent to that point in 2004. In the 2004 cycle, more than twice the amount of pharmaceutical money paid directly to the Kerry campaign had gone to the Bush campaign.
Still, it was a stand not without substance, and it was not the only such stand Senator Kerry had taken: he had established so firmly that his campaign would not be hostage to the reliable wedge issues (“What if we had a president who believed in science?” he had asked both in the Fleet Center and at his morning-after appearance with John Edwards) that he could launch a challenge to the Bush administration and its base voters on the Christian right over the question of stem cell research. He had made clear that since neither he nor most other people in public life could claim any high ground on the question of whether America should have embarked on its Iraq venture (“People of good will disagree” had been the conciliatory formulation in the Democratic platform), our only practical recourse now was to let the argument go, regard past differences as moot, and turn to recognizing and repairing our alliances with the rest of the world.
He had been, in other words, as realistic and as specific as it was reasonable under the pressure of a campaign to expect a candidate to be. Yet on the evening he spoke in the Fleet Center and after, among those whose profession it was to talk about politics, the word had been that there was “no message,” “no substance,” “at the end of the day I don’t know what they stand for.” On MSNBC that evening, only Willie Brown had defended Kerry. “He could have hit a home run,” the others agreed, but hadn’t. “Missed Opportunity” was the headline on the lead editorial in The Washington Post the next morning. “We don’t know where they stand on free trade, or gay marriage,” Craig Crawford of Congressional Quarterly and CBS News complained on Imus. (Actually we did. According to their platform and Web site, they were for free trade with nations that recognized US labor and environmental regulation, i.e., “free and fair trade,” or “level playing field”; they were for equal legal rights for all domestic partners; they were against amending the Constitution to ban gay marriage.) “No substance at all,” David Broder of The Washington Post said on PBS that evening. “He wants to bring everyone together, so there’s nothing there.” “What an incoherent disaster,” David Brooks wrote in The New York Times a day or so later. “When you actually read for content, you see that the speech skirts almost every tough issue and comes out on both sides of every major concern.”
We heard that weekend about the Democratic candidate’s “aloofness,” about whether the electorate would be willing to overlook his “personality deficits,” about whether he had managed to “fill in the gap,” make “the human connection,” prompt an affirmative answer to the question long accepted as the most predictive in American electoral politics, the one about whether “you could imagine yourself sitting around the kitchen table with him sharing a beer.” We heard about the self-laid traps that awaited him and his running mate, mistakes already irreversible in a game in which the ideal candidate is seen to be one who has been prevented by unassailable duty from taking a stand on any issue.
It would be hard, Cokie Roberts said to this point onscreen, for John Kerry and John Edwards to run on their records in “some parts of the country”; she was referring not to their votes as senators to authorize the use of force in Iraq but to the votes they had cast in 1999 against banning the procedure referred to on the right as “partial-birth abortion” (in 2003, when the question came up again, neither had voted), an issue she seemed to consider so central to the nation at large that in a span of seconds she amended “some parts of the country” to “most parts of the country.” “It is almost impossible to go through a 20-year record in the Senate and not be able to find things that might embarrass a candidate,” the presidential historian Michael Beschloss told Jodi Wilgoren of The New York Times, who had raised the question of whether congressional experience could not backfire, “turning otherwise successful politicians into bumbling candidates forced to defend lengthy legislative records the average governor—or, better yet, general—can avoid.”
The Democratic candidate, then, was unelectable because he “skirts almost every tough issue,” or the Democratic candidate was unelectable because during twenty years in the Senate he had amassed a record (votes against some weapons systems, for example, and for 1997 deficit reduction legislation that resulted in the recent rise in Medicare premiums) the average general could have avoided. Or, an alternate scenario, the Democratic candidate could be electable, but only if he narrowed his appeal to the impulses and whims of the “swing voter,” for the last several election cycles the phantom of the process, someone presumed to have no interest in politics who is nonetheless mysteriously available to talk to Paula Zahn for a CNN special or free up an evening to sit in on a focus group. Swing voters, we had been told at the time Senator Kerry named John Edwards as his running mate, would respond well to the vice-presidential candidate’s “sunniness,” his “optimism,” his “small town roots.” Swing voters, we had been instructed, responded negatively to “pessimism.” “I’m optimistic about America because I believe in the people of America,” the President was heard to say in an early round of television advertisements, and then, a startlingly unabashed suggestion from a president in whose term more jobs had evaporated than in that of any other president since Herbert Hoover: “One thing’s sure, pessimism never created a job.”
“Hope beats anger,” Al From and Bruce Reed had advised in a March memo to John Kerry published in the Democratic Leadership Council’s Blueprint Magazine. “Hope will beat fear every time,” Senator Mary Landrieu of Louisiana dutifully said during her turn on the Fleet Center platform. (“Hope,” in these approved constructions, tended to be not “hope for” but just “hope,” strategically unattached to possibly entangling specifics about what the objects of the hope might be.) This nonspeak continued, a product of the “discipline” imposed on convention speakers by the DNC and the Kerry campaign: the Democratic candidates, it was said repeatedly on the Fleet Center platform, would bring hope and optimism back to America, build a stronger and more secure America, stand up for the values that Americans cared about. Hope and values, it was said, were what Americans believed in. Americans believed in the values of good-paying jobs, in the values of affordable health care, in protecting our security and our values. When Elizabeth Edwards was campaigning for the Democratic ticket in Tennessee, according to The New York Times, she cautioned supporters who had spoken harshly about the President not to be “too negative,” not to use the word “hypocritical.” “It’s not useful,” Mrs. Edwards said, “because that kind of language for swing voters—they are tired of partisanship.” These voters, she advised, “don’t want to hear how lousy the other guy is. Talk about how your values inform what you are doing.”
This belief in the existence of Americans who did not “want to hear how lousy the other guy is” but did want to hear a cuckoo-clock repetition of the word “values” was wisdom derived from focus groups, which made it tricky, as the lethal efficiency of the Bush campaign in basing its entire effort on “how lousy the other guy is” would demonstrate. Swing voters, Elizabeth Edwards had learned from this wisdom, “don’t want to hear how lousy the other guy is,” yet on the evening at the Republican convention when Senator Zell Miller of Georgia went negative on John Kerry (“more wrong, more weak and more wobbly than any other national figure…bowl of mush…. This politician wants to be leader of the free world—free for how long?”) the response among swing voters on whom MSNBC reported was strongly positive. “He was like the person next door,” one said. “Send a marine,” another said. (“And nothing makes this marine madder than someone calling American troops occupiers rather than liberators,” Senator Miller had said.) “Focus groups will tell you they hate negative ads and love positive ads,” the Democratic strategist Steve McMahon told Jim Rutenberg and Kate Zernike of The New York Times. “But call them back four days later and the only thing they can remember are the negative ones.”
Focus groups have long been routine in virtually every business that involves marketing, yet most people who use them recognize their inherent flaw, which is that the average person who turns out for one is at the moment of appearing or not appearing self-selected, and so either a little more or a little less interested (there to press an agenda, say, or there for the cold cuts) than he or she is supposed to be. The motion picture industry, for example, has used focus groups extensively since the early 1980s, when a former Gallup researcher named Joe Farrell introduced the technique first to the marketing and eventually to the conceptual stages of pictures. (Before Farrell and his National Research Group arrived on the scene, motion picture “research” had pretty much consisted of passing out cards at previews and asking for a rating of “Excellent,” “Good,” or “Fair.”) I recall a producer telling me how he came to have doubts about focus research: after showing an unreleased picture to a supposedly virgin audience at a mall in Thousand Oaks, he heard from one volunteer for the focus group that he “preferred the ending you tested in Torrance.”
On the Thursday night when John Kerry was delivering his acceptance speech in the Fleet Center in Boston, the Republican pollster Frank Luntz, at the request of MSNBC, had run such a group in a Cincinnati hotel. (Luntz is the pollster who advised California Republicans on how to win the recall of Governor Gray Davis. “While it is important to trash the governor,” he wrote in an internal memo reported in The San Francisco Chronicle, “it should be done in the context of regret, sadness, and balance.”) Luntz, it was widely reported, was “stunned” by his findings in Cincinnati. “It was one of the strongest positive reactions I’ve ever seen in a focus group,” Luntz said. “Kerry didn’t lose anybody. More importantly, he was able to convince [former Bush supporters in the group] that he is presidential, that he would be tough yet open-minded. They now see him as a credible commander-in-chief.”
The details of this session, during the days that followed the Democratic convention, were analyzed repeatedly, sifted and rubbed for meaning. Those present had ranged in age from nineteen to sixty-three. More than half had been men. A majority had voted for Bush in 2000, but only 40 percent were leaning toward voting for him in November. Disapproving reference to outsourcing had elicited strong positive responses. Outright criticism of the Bush administration’s conduct of either its “war against terror” or its actual war in Iraq had elicited negative responses: “Cheap shot,” “Pollyanna-ish,” “a vast oversimplification of what obviously is a very complicated problem,” “To look back and point fingers, any reasonable person would have done what Bush did.” Overall, however, the winner that July evening in Cincinnati was seen to be Kerry: “The biggest question mark for many of these swing voters was whether Kerry had the fortitude to fight terrorists,” The Cincinnati Enquirer reported. “After his speech Thursday, most decided he did. While a few still disparaged his war record, almost all of these Republican-leaning voters said it’s fair to consider Kerry a ‘war hero.'”
That this focus group had consisted of only twenty people seemed in no way—neither for Luntz nor for any of the weekend commentators who mentioned it—to lessen its perceived significance; the reduction of the American electorate to twenty people who lived in or near Cincinnati was in fact the elegance of the mechanism, the demonstration that the system was legible, the perfected codex of the entire political process. “This room,” Luntz had declared after asking only three questions at a similar group the evening before, “is George Bush’s greatest nightmare.” Again, the wisdom was tricky: thirty-four days later, on the second morning of the Republican National Convention in New York, The Washington Post reported that its own polling showed that during those thirty-four days, five weeks during which the words “swift boat” had been allowed to dominate the news cycles, Senator Kerry had lost, on issues having to do with “leadership in the war on terror,” fifteen points to President Bush. “That’s what it means to play offense with terrorism and not just defense,” as Rudy Giuliani said to another point on the first evening of the Republican convention in Madison Square Garden.
2.
“On one level nothing happens, but it is nothing at the very center of the world you are part of,” the Newsweek correspondent Howard Fineman said to The New York Times by way of explaining the apparently intractable enthusiasm of American reporters for covering political conventions. “You are immersed to the eyeballs in the concentrated form of the culture you cover.” A perception of nothing at the very center of the world you live in might suggest that a change is in order, yet no change was in sight: we had reached a point in our political life at which the selected among the 15,000 reporters who attended each of this summer’s conventions could dominate the national discourse by talking passionately to one another on air about, say, “strong women” (“There’s no reason to attack Heinz Kerry for it, in fact I admire strong women.” “I agree”), or about “women who could take very untraditional roles and yet transmit traditional values” (the subject here was Laura Bush, whose way of transmitting “traditional values” in the “untraditional role” of prime-time speaker was to address the questions “that I believe many people would ask me if we sat down for a cup of coffee or ran into each other at the store”), or about “missed opportunities,” for example “putting Jimmy Carter out there to talk about foreign policy missteps” during a Democratic convention at which, Joe Scarborough decreed on MSNBC, “he should have talked about values.”
We took for granted that we would learn nothing from these discussions that reflected the actual issues facing the country, nothing suggesting that in the world off camera “foreign policy missteps” might be understood as inextricable from “values.” We recognized that by tuning in we entered a world where actual information would vanish: a single Firing Line or Hardball was capable of wiping the average human hard drive. No one who had ever fallen asleep watching C-SPAN and woken to find, say, Cliff Kincaid of Accuracy in Media demanding to know why Michael Moore went after Halliburton (“but never tells you that the main competitor to Halliburton is Schlumberger—a French firm—do we really want a French firm? Why are we never told about that?”) could be unfamiliar with the obliterative effect of watching people shout at one another on a small lighted screen.
Cliff Kincaid of Accuracy in Media would give way on the small lighted screen to Lisa De Pasquale of the Clare Boothe Luce Policy Institute. Lori Waters of the Eagle Forum would be promised. Ann Coulter and Laura Ingraham would be in the green room. Dee Dee Myers would offer “the other side.” We knew that. We also knew that the election for its explicators would once again come down to “character,” the “human connection,” or what Laura Bush would tell you about her husband if you ran into her at the store. We were no longer even surprised that the ability of these explicators to read character seemed to have atrophied beyond conceivable repair: consider the way in which the raw fragility of Teresa Heinz Kerry was instantly metamorphosed into “strong woman,” or her husband’s pained shyness into “aloofness,” or the practiced courtroom affability of the plaintiff’s attorney who was his running mate into “sunniness.”
Or, most persistently, the calculated swagger of the President himself into “resolve.” I recall, shortly after September 11, at a time when the President was talking about “those folks,” “smoking them out,” “getting them running,” “dead or alive,” reading one morning in both The Washington Post and The New York Times about how his words were his own, the product of what the Post called his “unvarnished instincts.” “Friends and staffers,” the Post reported, lending a hand in what was already an ongoing effort, the creation of the President as commander-in-chief and intuitive manager of his war on terror, “promise that it is genuine Bush.” The Post headline on this story was “An Unvarnished President on Display.” The Times went with “A Nation Challenged: The President; In This Crisis, Bush Is Writing His Own Script.” Yet every morning, according to the same stories, the President met with senior advisers, including Vice President Cheney, Condoleezza Rice, Karen Hughes, and Andrew H. Card Jr., for a ten-minute “communication” session, the purpose of which was, in the Times’s words, “to strategize about the words, emotional cues, and information Mr. Bush should be conveying.”
There were two possibilities here: either the President was receiving his words, emotional cues, and information from his “unvarnished instincts,” or he was “strategizing” them with Vice President Cheney, Condoleezza Rice, Karen Hughes, and Andrew H. Card Jr. We no longer expected such contradictions to be explored, or even much mentioned. We accepted the fact that not only events but the language used to describe them had been reinvented, inflated, or otherwise devalued, stripped of meaning that did not serve a political purpose. The 2001 tax cut, we learned from former US Treasury Secretary Paul O’Neill, was described by its political beneficiaries in the White House as “the investment package.” The words “invasion” and “occupation,” previously neutral terms in the description of military actions, had each been replaced by the more educational “liberation,” to a point at which the administration’s most attentive and least wary student, Condoleezza Rice, could speak without irony to the Financial Times about “the devotion of the US in the liberation of Germany from Hitler.”
September 11, we were told repeatedly, had created a “new normal,” an altered condition in which we were supposed to be able to see, as The Christian Science Monitor explained a month after the events, “what is—and what is no longer—important.” “Government,” for example, was “important again,” and “all that chatter about lockboxes and such now seems like so much partisan noise.” The “new normal” required that we adopt a “new paradigm,” which in turn required, according to an internal White House memo signed by President Bush, “new thinking in the law of war,” in other words a reconsideration of the Geneva Convention’s prohibition against torture. “Torture” itself had become “extreme interrogation,” which under the “new paradigm” could be justified when the information obtained by interrogation failed to tally with the information required by policy. “We’re learning that Tariq Aziz still doesn’t know how to tell the truth,” the President told reporters in May 2003 about the interrogation sessions that were yielding, for reasons even then inconveniently clear, so little information about Iraq’s weapons of mass destruction. “He didn’t know how to tell the truth when he was in office. He doesn’t know how to tell the truth when he’s been—as a captive.”
As this suggests, the word “truth” itself had by then been redefined, the empirical method abandoned: “the truth” was now whatever we needed it to be, the confirmation of those propositions or policies in which we “believed in our hearts,” or had “faith.” “Belief” and “faith” had in turn become words used to drop a scrim, white out the possibility of decoding—let alone debating—what was being said. It was now possible to “believe” in one proposition or another on the basis of no evidence that it was so. The President had famously pioneered this tactic, from which derived his “resolve”: he “believed” in the weapons of mass destruction, for example, as if the existence of weapons was a doctrinal point on the order of transubstantiation, and in the same spirit he also believed, he told reporters in July 2003, that “the intelligence I get is darn good intelligence and the speeches I have given are backed by good intelligence.” The attraction of such assertions of conviction was the high road they offered for bypassing conventional reality testing, which could be dismissed as lack of resolve. “I do not believe we should change our course because I believe in it,” Tony Blair was saying by September 2003. “I carry on doing the job because I believe in what I’m doing.”
Similar use was found for the word “faith,” originally introduced as a way to placate Republican base voters while spending, since few elected officials are anxious to go on the line against faith, the minimum amount of political capital. The President could have “faith” in the Iraqi people, which in turn was how he could “believe” that “a free Iraq can be an example of reform and progress to all the Middle East,” which could even be (why not?) the reason we were there. Similarly, as he considered “problems like poverty and addiction, abandonment and abuse, illiteracy and homelessness,” the President could again have “faith,” in this case “faith that faith will work in solving the problems.” As for faith’s problem-solving role, or “compassionate conservatism,” the specific promise to the Christian right of the 2000 campaign, the administration now spoke not only of “faith-based” schools and “faith-based” charities and “faith-based” prisoner rehabilitation but also of “faith-based” national parks, which translated into authorizing the sale in the National Park Service’s bookstores of Grand Canyon: A Different View, the “different view” being that the canyon was created not by the continual movement of the Colorado River since the Tertiary Period but in the six days described in Genesis.
Peculiarities (faith-based national parks, say) that a few years before might have seemed scarcely possible now seemed scarcely worth remark. The more high-decibel political comment had become, the more blunted it had become, the more confined to arguments about “personality.” “What a difference these few months of extremism have made,” Jimmy Carter said in the Fleet Center on the opening night of the Democratic National Convention; on the cable shows that evening any potential discussion of what a former president of the United States might have meant by “extremism” got beaten back by the more pressing need to discuss his “cranky” refusal to allow his speech to be “scrubbed” of negativity by the Kerry campaign. We had seen the criticism of administration policy on Iraq doggedly offered by Senator Robert C. Byrd met with personal vilification, what Senator Byrd himself described in Losing America as “an ugly tone—’old man,’ ‘senile,’ ‘traitor,’ ‘KKK.'” We had seen, after the lead singer of the Dixie Chicks made a comment onstage in London that could only with imaginative interpretation pass for “political” (“Just so you know, we’re ashamed the president of the United States is from Texas”), widespread excoriation, radio bans against including the Dixie Chicks on playlists, and organized bonfires (at the time widely described as illustrations of market choice) in which their CDs were burned.
Rapid response, then, all barrels firing, would seem to have become the national political style, the manifestation of what was frequently called “polarization,” yet it was not. The notion of “polarization” itself had come to seem another manipulation, one more scrim: the 2001 USA Patriot Act, despite voiced reservations that crossed conventional ideological lines, had been passed by the House with a vote of 357 to 66 and in the Senate with a vote of 98 to 1, the “one” in the latter case being Senator Russell Feingold of Wisconsin. We had more recently seen, when a former longtime member of the House of Representatives, Lee Hamilton, suggested at a hearing of the Senate Governmental Affairs Committee that the recommendations of the 9/11 Commission could be put into effect by “executive order,” not only no polarization but virtually no response, no discussion of why someone who had long resisted the expansion of executive power now seemed willing to suggest that a major restructuring of the government proceed on the basis of the President’s signature alone. “And usually, given my background, you’d expect me to say that it’s better to have a statute in back of it,” he had added. Was he suggesting a way to shortcut the process on only minor points? Or, since he seemed to be talking about major changes, was he simply trying to alert the Senate to the urgency of the matter?
Such questions did not enter the discourse. There was only silence, general acquiescence, as if any lingering public memory of separation of powers had been obliterated in the unendable crisis the executive branch had appropriated for itself. “The battle in Iraq is one victory in a war on terror that began on September 11, 2001, and still goes on,” the President had said on the late afternoon he landed in the flight suit on the deck of the carrier Abraham Lincoln (“clearly reliving his days as a pilot in the Texas Air National Guard,” according to The New York Times), at once declaring combat operations complete on one front and laying a groundwork for whatever further fronts might be deemed expedient.
There had been many curious occurrences that might have earned our attention. There had been the reemergence of Elliott Abrams from the black hole of Iran-contra, this time around as the White House director of Middle Eastern affairs. “Whatever controversy there was in the past is in the past,” was how a senior administration official characterized, for The New York Times, the appointee’s 1987 guilty plea on a charge of withholding information from Congress and subsequent pardon by the President’s father. There had been the reemergence from the same black hole of Otto Juan Reich, who had once figured in questions about the Reagan administration’s covert campaign against the government of Nicaragua and was in 2001 given, after the Senate refused to confirm his appointment as assistant secretary of state for Western Hemisphere affairs, a “recess appointment” by the President. In 2002, when the recess appointment ran out, he was named the State Department’s “special envoy” to the Western Hemisphere, a post not requiring confirmation.
There had been, albeit briefly, the reemergence of Reagan national security adviser John M. Poindexter, whose 1990 conviction on five Iran-contra-related felony counts was later overturned and who returned to the public from the private sector in 2002 as director of the Defense Department’s “Information Awareness Office,” a division of its Defense Advanced Research Projects Agency, or DARPA. In his twenty months at the Information Awareness Office, Admiral Poindexter’s projects included an on-line futures market for betting on international developments and the prototype of an all-inclusive database for tracking pretty much anyone in the world. This prototype, the eventual point of which was to combine all government with all commercial data, involved, according to the DARPA Web site, the development of such suggested technologies as “story telling, change detection, and truth maintenance” and “biologically inspired algorithms for agent control.”
There had even been the reemergence of the Iran-contra arms dealer Manucher Ghorbanifar, who was reported to have had “several” meetings with two members of Douglas Feith’s Pentagon staff. Newsday had originally placed these meetings in Paris; The New York Times later placed them in Rome. One of the two men present from the Pentagon, according to the Times, was Lawrence Franklin, who was this summer reported to be under investigation by the FBI in a matter that allegedly involved providing classified documents to the American Israel Public Affairs Committee and ultimately to Israel. The other Pentagon representative at the Ghorbanifar meetings, according to Newsday, was Harold Rhode, who had “acted as a liaison between Feith’s office, which drafted much of the administration’s post-Iraq planning, and Ahmed Chalabi, a former Iraqi exile disdained by the CIA and State Department but groomed for leadership by the Pentagon.” Here the story, as reported by Newsday, took still another turn into time travel:
Rhode is a protege of Michael Ledeen, a neo-conservative who was a National Security Council consultant in the mid-1980s when he introduced Ghorbanifar to Oliver North, a National Security Council aide, and others in the opening stages of the Iran-Contra affair. A former CIA officer who himself was involved in some aspects of the Iran-Contra scandal said current intelligence officers told him it was Ledeen who reopened the Ghorbanifar channel with Feith’s staff. Ledeen, a scholar at the American Enterprise Institute and an ardent advocate for regime change in Iran, would neither confirm nor deny [note: according to The New York Times, he later confirmed] that he arranged for the Ghorbanifar meetings.
What were we doing here? What kind of profound amnesia had overtaken us? How had it taken hold, come to prevent the laying down of not only political but cultural long-term memory? Could we no longer hold a thought long enough to connect it to the events we were seeing and hearing and reading about? Did we not find it remarkable that the recommendation of the 9/11 Commission to concentrate our intelligence functions in the White House would have been met with general approval? That former members of Congress would urge action by executive order to enact a plan that would limit the congressional role to “oversight”? That the only reservations expressed would be those reflecting issues of agency turf?
Did we not remember the Nixon White House and the point to which its lust for collecting intelligence had taken it? The helicopter on the lawn, the weeping daughter, the felony indictments? Did we not remember what “congressional oversight” had recently meant? Did we have no memory that the Reagan administration had been operating under congressional oversight even as it gave us Iran-contra? Had we lost even the names of the players? Did “Manucher Ghorbanifar” no longer resonate? Had we lost all memory of Ronald Reagan except in the role assigned him by his creators and certified by the coverage in the week of his death, that of “sunny optimist”? Did we not remember that it was his administration, through its use of Islamic fundamentalists to wage our war against the Soviet Union in Afghanistan, that had underwritten the dream of unending jihad? Was no trace left of what we had learned about actions and their consequences?
3.
In March of 2003, before the war in Iraq had begun, Robert M. Berdahl, at that time chancellor of the University of California at Berkeley, wrote, in The San Francisco Chronicle, an Op-Ed piece critical of the Bush administration’s foreign policy. He later spoke to the Berkeley alumni magazine, California Monthly, about his reasons for writing the Op-Ed piece, which he had recognized might antagonize some supporters of the university. Given his position, he said, he believed it correct to speak out only on issues critical to the university’s future. He believed the Bush foreign policy to be such an issue. He believed, he said, that we were experiencing a fundamental change not just in foreign policy but in “the fabric of constitutional government as we have known it in this country.” He was troubled that the doctrine of preemption had been adopted with so little congressional discussion. He was troubled that sweeping war powers had been granted with so little dissent. He was troubled by the way in which the Patriot Act allowed the government to subpoena university library records, medical records, and student records generally while binding the university to secrecy. He was troubled, finally, by the tenor of the discourse, which he saw as forcing universities into a dichotomized way of thinking, one in which “the critical faculty of understanding and recognizing the validity of conflicting points of view” could diminish to the vanishing point.
These were not uncommon concerns, yet they were concerns, during that period, discussed only rarely in the daily and weekly forums from which most Americans derived their understanding of what the government of the United States was doing and why it was doing it. Such concerns, when they were discussed, tended to be dismissed as dated, the luxuries of less threatening times, no longer, in the “new normal,” relevant. Attention was drawn instead to what seemed increasingly to be strategic diversions, sophistic arguments about the possibility of proving the existence or nonexistence of weapons of mass destruction, say, or conveniently timed “Homeland Security” alerts that flared and vanished, or the encouragement of nativist impulses. (Our borders were porous, the world beyond them “hated our way of life,” the United Nations was in the words of Condoleezza Rice “playing into the hands” of Saddam Hussein, Senator Kerry “looked French.”)
On the question of what use the administration might be making of its alerts and of its “war on terror” in general, there was most notably a fastidious reticence, a disinclination to speak ill encouraged by both the political fearfulness of the President’s opponents and the readiness of his supporters to suggest that only traitors disagreed with him. “The middle part of the country—the great red zone that voted for Bush—is clearly ready for war,” Andrew Sullivan had written in the London Sunday Times shortly after September 11, sounding the note that would see the current president through his first term and provide the momentum for his second campaign. “The decadent left in its enclaves on the coasts is not dead—and may well mount what amounts to a fifth column.”
This association of the administration with what had become known as “the heartland,” alienated from and united against a tiny overentitled minority “in its enclaves on the coasts,” a notion made graphic by the red-blue maps illustrating the 2000 election, was on its face misleading; the popular vote was basically even, and just one of those “enclaves on the coasts,” California, represented that year not only 12 percent of the US population but the world’s fifth-largest economy. Again, however, the projection of a “decadent” coastal minority was useful, in the same way the perceived intransigence of the United Nations and France was useful: the introduced specter of movie stars and investment bankers making common cause in attractive West Los Angeles and eastern Long Island venues had come to constitute, as the issue of school prayer and the words “abortion on demand” constituted, a straight-to-the-bloodstream intravenous infusion of the kind of class resentment that powered the Republican vote.
4.
Winston could not definitely remember a time when his country had not been at war, but it was evident that there had been a fairly long interval of peace during his childhood, because one of his early memories was of an air raid which appeared to take everyone by surprise…. Since about that time, war had been literally continuous, although strictly speaking it had not always been the same war. For several months during his childhood there had been confused street fighting in London itself, some of which he remembered vividly. But to trace out the history of the whole period, to say who was fighting whom at any given moment, would have been utterly impossible, since no written record, and no spoken word, ever made mention of any other alignment than the existing one. At the moment, for example, in 1984 (if it was 1984), Oceania was at war with Eurasia and in alliance with Eastasia…. Actually, as Winston well knew, it was only four years since Oceania had been at war with Eastasia and in alliance with Eurasia. But this was merely a piece of furtive knowledge which he happened to possess because his memory was not satisfactorily under control. Officially the change of partners had never happened. Oceania was at war with Eurasia: therefore Oceania had always been at war with Eurasia. The enemy of the moment always represented absolute evil, and it followed that any past or future agreement with him was impossible.
—1984, by George Orwell, Part One, Chapter 3
Such was the state of mind in which many of us discovered ourselves at one point or another during the recent past: our memories were not satisfactorily under control. We still possessed “pieces of furtive knowledge” that were hard to reconcile with what we read and heard in the news. We saved entire newspapers, hoping that further study might yield their logic, but none emerged. Why, for example, on April 25, 2003, a day on which it was reported that the President had suggested for the first time that we might not “find” Saddam Hussein’s weapons of mass destruction (“but we know he had them”), did we seem officially unconcerned about the report in the same day’s papers that North Korea claimed to possess exactly the weapons we were failing to find in Iraq? The explanation, according to “administration sources” quoted in that morning’s Los Angeles Times, was that any reports of the North Korean claim were “leaks,” which had come “from administration insiders opposed to Bush’s efforts to negotiate a settlement with North Korea.” Did the assertion that the information had been leaked materially affect the credibility of the information? Were we at war in Iraq but not in North Korea because a decision had been made that we could afford Iraq? Had we not once supported Saddam Hussein as we were now trying to negotiate with Pyongyang? At what point would Iraq again become Eastasia, and North Korea Eurasia? Would we notice?
On September 6, 2003, The Washington Post published on its first page a story reporting that 69 percent of Americans, in a consensus broadly shared by Democrats, Republicans, and independents, at that time believed it “at least likely” that Saddam Hussein had been involved in the September 11 attacks on the World Trade Center and the Pentagon. “Bush’s defenders say the administration’s rhetoric was not responsible for the public perception of Hussein’s involvement,” Post reporters Dana Milbank and Claudia Deane wrote, and quoted Bush campaign strategist Matthew Dowd: “The intellectual argument is there is a war in Iraq and a war on terrorism and you have to separate them, but the public doesn’t do that. They see Middle Eastern terrorism, bad people in the Middle East, all as one big problem.”
The source of any misunderstanding, then, was “the public,” not the President (who said as recently as June 17 that “the reason I keep insisting that there was a relationship between Iraq and Saddam and al-Qaeda is that there was a relationship between Iraq and al-Qaeda”), not Richard N. Perle (who had called the evidence for the putative Iraqi involvement “overwhelming”), not even, it seemed, Paul D. Wolfowitz. “I’m not sure even now that I would say Iraq had something to do with it,” Wolfowitz had said a month before on The Laura Ingraham Show. Yet seven months before that, at the Council on Foreign Relations in New York, he had foregone the opportunity to make “the intellectual argument” that “there is a war in Iraq and a war on terrorism and you have to separate them” and instead said this:
Iraq’s weapons of mass terror and the terror networks to which the Iraqi regime are [sic] linked are not two separate themes—not two separate threats. They are part of the same threat. Disarming Iraq and the War on Terror are not merely related. Disarming Iraq of its chemical and biological weapons and dismantling its nuclear weapons program is a crucial part of winning the War on Terror.
The effort to shift responsibility for the wreckage that had been our Iraq policy had become, by this spring and summer, general, spreading from those who had most fervently made the war to those who had most ardently backed it. David Brooks, we learned on the Op-Ed page of The New York Times on April 17, 2004, had “never thought it would be this bad.” (Just seven days before, in the Times dated April 10, the same David Brooks had advised “Chicken Littles” on the subject of Iraq to “get a grip.”) He “didn’t expect,” he now allowed, that “a year after liberation, hostile militias would be taking over cities or that it would be unsafe to walk around Baghdad.” Most of all, he had “misunderstood” the way in which “normal Iraqis” might respond to the American occupation. Thomas L. Friedman, in May 2004, again on the Op-Ed page of the Times, admitted to having been “a little slow,” but admirably so: he had tried, he disclosed, “to think about something as deadly serious as Iraq, and the post-9/11 world, in a bipartisan fashion.” His only error, in this construct, had been a generous one, that of attributing the same approach to others: he had “assumed the Bush officials were doing the same.”
The potential for cover in having “tried to think in a bipartisan fashion” was immediately apparent. “Were We Wrong?” The New Republic asked itself on the cover of a special June 2004 issue. The answer inside was yes, no, mea culpa but not exactly, not in the larger framework. (“We hawks were wrong about many things,” David Brooks had written in the Times, an early responder to the larger-framework approach. “But in opening up the possibility for a slow trudge toward democracy, we were still right about the big thing.”) Peter Beinart of The New Republic had “worried,” but, like Thomas L. Friedman in the Times, he had also “assumed.” He “didn’t realize.” He “might have seen some of the war’s problems earlier than I did” had he “not tried so hard” to separate his thinking from “partisanship.”
Senator Joseph Biden, also in The New Republic, believed his vote for the war to have been “just,” but had “never imagined” the lack of wisdom with which the war would be pursued. “I am not embarrassed by my assumption that Saddam Hussein possessed the sort of arsenal that made him a clear and present danger,” Leon Wieseltier declared in the same New Republic. The cadences surged: “And so I was persuaded,” “Prudence and conscience brought me to the same conclusion,” “But I was deceived.” As for the collective “we” that represented the magazine’s editors, they could see “in retrospect” that there might have been “warning signs,” to which “we should have paid more attention.” “At the time,” however, “there seemed good reason not to,” and, in any case (the larger framework again), “if our strategic rationale for war has collapsed, our moral one has not.”
For Fouad Ajami it had been “an honorable and noble expedition.” Leon Wieseltier could “imagine no grander historical experiment in our time than the effort to bring a liberal order to an Arab society.” David Brooks could see the Iraq we had made as one in which “nationalism will work in our favor, as Iraqis seek to become the leading reformers in the Arab world.” For these early enthusiasts, then, the “expedition” was in the past, its “moral rationale” intact, its errors not their own. The “historical experiment” was over. That it might have already passed beyond the limits of our control was not, in the thousands of words of self-examination that appeared during this period, a consideration.
5.
There seemed in New York on the September Friday morning after the balloons finally fell in Madison Square Garden a relief so profound as to approach euphoria. The President was gone, spirited from the Garden and the city to campaign from the porches and yards of those “undecided” citizens who had become as familiar as our neighbors. The black SUVs with police escorts were gone (even obscure political figures had seemed to “need” motorcades, and not only motorcades but Secret Service protection); the whine of the helicopters was gone. The low dread that had afflicted the city was dissipating. There would be no further need to plot movements around town so as not to be caught in the orange netting of one or another police sweep. (“You can’t arrest 1,800 people without having somebody in the middle who shouldn’t have been arrested,” the mayor of New York said to this point, not reassuringly, on WABC-AM. “That’s what the courts are there to find out afterwards.”) There would be no further need to regard an official credential to enter the Garden (or “the perimeter,” as the sealed area was called) as our sole protection, our fragile laissez-passer in a city that might at any time close down around us. (Closing down the city around us was called “expanding the perimeter.”) “If you’re leaving the perimeter, hide that credential,” one convention aide warned me as I was leaving the Garden. “Because they are definitely out there.”
The Republican convention, then, had done what September 11 never did, rendered the city embattled, an armed camp, divided between “they” and “we.” This new mood had been reinforced by the convention itself, a stated theme of which was “A Safer World, A More Hopeful America” but the persistent message of which was that any notions of safety or hope we might have entertained were but reeds dependent for their survival on the reelection of the incumbent administration. “It’s absolutely essential that eight weeks from today, on November 2, we make the right choice,” Vice President Cheney would say a few days later, in Des Moines, and no one who had listened to what was said in Madison Square Garden could have been unaware that this had been its subtext: “Because if we make the wrong choice then the danger is that we’ll get hit again and we’ll be hit in a way that will be devastating from the standpoint of the United States.”
That entire week in New York had been, not unexpectedly, an exercise in the political usefulness of keeping a nation in a state of unending crisis. The events of September 11, we were told repeatedly in Madison Square Garden, had “changed everything.” We had entered “a new age of terrorism.” Republicans understood this. Democrats did not. “Even in this post-9/11 period,” Vice President Cheney said on the night he accepted his renomination, “Senator Kerry doesn’t appear to understand how the world has changed.” This changed world demanded “strong leadership,” “conviction,” above all “resolve,” a quality understood in the Garden to be the President’s long suit. “He has not wavered,” John McCain declared on the first night of the convention. “He has not flinched from the hard choices. He will not yield.” Rudy Giuliani, on the same night, claimed to have looked at the falling towers and said to Bernard Kerik, then the police commissioner: “Bernie, thank God George Bush is our president.”
The changed world also demanded that the President be allowed to demonstrate his unwavering resolve unhindered by possible disagreement from the nation’s citizens. Demonstrators were corralled outside his sight line, penned behind movable barricades, kept under the watch of closed-circuit video cameras and an NYPD surveillance blimp. Not only demonstrators but also members of the opposition party could be seen as enemies of the republic. “Where is the bipartisanship in this country when we need it most?” Senator Zell Miller demanded, winding up for his fairly unveiled suggestion that nominating an opposition candidate for the presidency fell into the category of treason. (Locating treason was not a new task for Senator Miller, who had in May said on the Senate floor that those discussing the abuses at Abu Ghraib were “rushing to give aid and comfort to the enemy.”) “Now, while young Americans are dying in the sands of Iraq and the mountains of Afghanistan,” he shouted from the podium to positive response, “our nation is being torn apart and made weaker because of the Democrats’ manic obsession to bring down our commander-in-chief.”
Yet this “changed world,” as presented in the Garden, also demanded assent to the President on certain fronts, notably domestic, that did not actually involve his role as “commander-in-chief.” This was the stealth part of those four days in the Garden. There were many evasions about what the President had actually done, or wanted to do, on those domestic fronts. There were misrepresentations. It was the President, we learned from Laura Bush, who had been “the first president to provide federal funding for stem cell research,” which was technically true but misleading: embryonic stem cells were first isolated in 1998; the Department of Health and Human Services ruled in 1999 that such research could receive federal funding; the National Institutes of Health set guidelines for such funding in 2000; the President, in 2001, limited this research to “existing lines,” which numbered no more than twenty-one and are now considered less safe than new lines. The President, then, was not “the first president to provide federal funding” but the first president to claim any control whatsoever over, and in real terms to cut off, the funding that already existed.
Nor, if there remained any doubt about the efficiency of the operation, was Madison Square Garden the first venue to which Mrs. Bush had been dispatched with this sly message for anyone who might have listened to Nancy Reagan or seen her son at the Democratic convention in Boston. There were other misrepresentations. We heard a good deal about how this president was, in the words of the vice-president, “making health care more affordable and accessible to all Americans,” as well as “reforming medical liability so the system serves patients and good doctors, not personal injury lawyers.” The President himself told us how he was “honoring America’s seniors” by “strengthening Medicare,” “creating jobs” by “reducing regulation and making tax relief permanent.” He was, we heard, “building an ownership society,” in which “people will own their own health plans and have the confidence of owning a piece of their retirement.”
We heard about “health savings accounts,” and about “reforming Social Security.” We heard, to the latter point, how we could “strengthen Social Security by allowing younger workers to save some of their taxes in a personal account—a nest egg you can call your own and government can never take away.” We did not hear what would happen when those “younger workers” reached retirement age and realized that the “individual marketplace decisions” they had made for their “personal nest eggs” had proved unwise, or when the “health savings account you own yourself” got emptied by unexpected illness. “The magnitude of the Bush proposals is only gradually dawning on members of Congress,” Robin Toner and Robert Pear had reported in The New York Times in February 2003, an assessment suggesting that members of Congress were less acutely aware of their vulnerabilities than the rest of us were. To read the Republican platform on this subject was in fact to enter a world in which no unexpected or catastrophic events could occur, a world in which we ourselves, not our employers, would pay insurers, but not exactly to “insure” us: one way the party would restore “choice” to health insurance, for example, was by overruling state laws requiring insurers “to provide benefits and treatments which many families do not want and do not need.”
In this “changed world,” then, one thing remained unchanged: the primacy, for this administration, of its domestic agenda, the relentless intention to dismantle or “reform” American society for the benefit, or “protection,” since the closest model here was a protection racket, of those segments of the business community that supported the President. Everything said in Madison Square Garden on domestic issues was predictable. We knew what the domestic agenda was about. We had known it since the 2000 campaign, when the same messages got sent. We had seen clear-cutting our national forests described as “wildfire control,” part of the “Healthy Forests Initiative.” We had seen the administration distract us with arguments about whether our national parks should be “faith-based,” even as that administration lifted the regulation of snowmobiles in the same national parks.
We had seen the President “right the wrong,” as Senator Bill Frist put it in the Garden, of “miracle medicines denied by Medicare,” and we had also seen who benefited from “righting this wrong”: there would first be, since the law as enacted banned Medicare from negotiating the price of drugs, the pharmaceutical industry. Then there would be, still more meaningfully, since the drug benefit was to be offered only through private insurers and health plans (despite the fact that it cost Medicare significantly more to cover recipients through private plans than directly), the insurance industry. Finally, in the case of those Medicare recipients currently covered under their retirement plans, there was the considerable benefit to their former employers, who, by the government’s own estimates, were expected to reduce or eliminate drug coverage for 3.8 million retirees. Those who lost coverage would then be forced, if they wanted a drug benefit at all, out of not only their retiree plan but also Medicare’s fee-for-service coverage, in other words into an HMO.
Such “improved benefits,” like “personal nest eggs” and “healthy forests,” had been since 2000 what was meant when the Bush administration talked about restoring “choices” to Americans. What made these misrepresentations seem more grave in 2004 was the larger misrepresentation: the fact that the administration had taken us, ineptly, with the aid and encouragement of those who had “never thought,” or who had “misunderstood,” or who “didn’t realize,” into a war, or a “noble expedition,” or a “grand historical experiment,” which was draining the lives and futures of our children and disrupting fragile arrangements throughout the world even as it provided the unending “crisis” required to perpetuate the administration and enact its agenda. “This is a great opportunity,” the President was reported by Bob Woodward to have said in an NSC meeting on the evening of September 11, 2001. That large numbers of Americans continued to support him could be construed as evidence of their generosity, but it was also evidence of how shallowly rooted our commitment to self-government had turned out to be.
—September 23, 2004