1.

Fifty years after its proclamation, the Universal Declaration of Human Rights has become the sacred text of what Elie Wiesel has called a “world-wide secular religion.”1 UN Secretary-General Kofi Annan has called the Declaration the “yardstick by which we measure human progress.” Nobel Laureate Nadine Gordimer has described it as “the essential document, the touchstone, the creed of humanity that surely sums up all other creeds directing human behaviour.”2

Human rights has become the major article of faith of a secular culture that fears it believes in nothing else. The military campaign in Kosovo depends for its legitimacy on what fifty years of human rights has done to our moral instincts, weakening the presumption in favor of state sovereignty, strengthening the presumption in favor of intervention when massacre and deportation become state policy. Yet what this new presumption commits us to is anything but clear. Do we intervene everywhere or only somewhere? And if we don’t intervene everywhere, does that make us hypocrites? And then what price are we prepared to pay? For some the question is how much collateral damage can moral internationalism sustain before defensible ends are tarnished by harrowing means. For others, the issue is whether moral universals are worth anything unless you are prepared to commit blood and treasure to their defense.

As the liberal conscience passes through an hour of trial in Kosovo, it is worth going back to the beginning again, to look hard at human rights and the moral universals we believe to be at stake. Article One of the Universal Declaration simply declares: “All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.” The Universal Declaration enunciates rights; it doesn’t explain why people have them. As Johannes Morsink makes clear in his revealing and useful history of the drafting of the Universal Declaration, this silence was deliberate. When Eleanor Roosevelt first convened a drafting committee in her Washington Square apartment in February 1947, a Chinese Confucian and a Lebanese Thomist got into such an argument about the philosophical and metaphysical bases of rights that Mrs. Roosevelt concluded that the only way forward lay in West and East agreeing to disagree.

It was also apparent that the less said the better about the gap between what the signers practiced and what they preached. Everyone had something to be ashamed of—the Americans their Jim Crow legislation in the South, the Canadians their treatment of native peoples, the Soviets the Red Terror. The embarrassing state of the “is” kept all eyes firmly focused on the “ought.” Agreement on high principles was also made easier by leaving the matter of their enforcement entirely unresolved. Nothing in the Declaration mandated the right of member states to intervene in another country’s affairs to stop human rights abuses. The UN Charter’s guarantee of state sovereignty was left untouched. Instead, the drafters put their hopes in the idea that by declaring rights as moral universals, they could foster a global rights consciousness among those they called “the common people.”

A cloak of silence was also thrown over the question of God. The Brazilian delegation proposed that Article One include the proposition that men are “created in the image and likeness of God,” and “endowed with reason and conscience.” Communist and non-Communist delegations joined in rejecting these totemic references on the grounds that they would detract from the Declaration’s universal appeal. The Brazilians tried again, replacing “created in the image and likeness of God” with “by nature,” but the Nationalist Chinese delegate prevailed on the Brazilian delegation to “spare the members of the Committee the task of deciding by vote on a principle which was in fact beyond the capacity of human judgment.”

This secularism has become the lingua franca of global human rights, as English has become the lingua franca of the global economy. Both serve as lowest common denominators, enabling people to pretend to share more than they actually do. Pragmatic silence on ultimate questions has made it easier for the world’s very different cultures to sign up. As the philosopher Charles Taylor puts it, the concept of human rights “could travel better if separated from some of its underlying justifications.”3 The Declaration’s vaunted universality is as much a testament to what the drafters kept out of it as to what they put in.

What they did put in was a comprehensive attempt to outlaw the kind of jurisprudence which the Nazis had used to pervert the rule of law in Germany. The Article Sixteen provisions for free marriage choice, which have aroused so much resistance in the Islamic world, were not actually directed at Islam at all, but at the Nuremberg Laws banning marriages between “Aryan” and Jewish Germans. The right to legal personality, enshrined in Article Six, was explicitly written with the memory of the German expropriations of Jewish property in mind. Beyond the specifics of Nazi jurisprudence hung the shadow of the Holocaust itself. The Declaration’s opening preamble evokes the memory of “barbarous acts which have outraged the conscience of mankind.” The Declaration may still be a child of the Enlightenment, but it was written when faith in the Enlightenment faced its deepest crisis of confidence.

The Holocaust made the Declaration possible, but its influence was also deeply paradoxical. The Declaration envisioned a world where, if human beings found their civil and political rights as citizens were taken away, they could still appeal for protection on the basis of their rights as human beings. Beneath the civil and political, in other words, stood the natural. But the Holocaust showed that once civil and political rights were taken away, human beings were defenseless. As Hannah Arendt argued in her Origins of Totalitarianism, published in 1951, when Jewish citizens of Europe were deprived of their national or civic rights, when, finally, they had been stripped naked and could only appeal to their captors as plain, bare human beings, they found that their nakedness did not even give them the claim of common human pity on their tormentors. “It seems,” Arendt wrote, “that a man who is nothing but a man has lost the very qualities which make it possible for other people to treat him as a fellow man.” The Universal Declaration set out to reestablish the idea of human rights at the precise historical moment in which they had been shown to have no moral purchase whatever.

This paradox defines the divided consciousness with which we have lived with the idea of human rights ever since. We defend human rights as moral universals in the full awareness that in a place like Kosovo moral universals are unlikely to stay the hands of those bent on massacre and deportation. But we have lived with this knowledge since the Holocaust.

The Holocaust laid bare what the world looked like when natural law was abrogated, when pure tyranny could accomplish its unbridled will. Without the Holocaust, then, no Declaration. Because of the Holocaust, no unconditional faith in the Declaration either.

Even so, the Declaration might never have been drafted had the times not conspired to postpone ideological arguments which might otherwise have wrecked it. In February 1947 the cold war was already underway but not yet so envenomed with nuclear paranoia as to make all headway impossible. While odious figures like Andrei Vyshinsky—Stalin’s prosecutor in the Red Terror of 1937 and 1938—participated in the deliberations and made sure that the Soviet bloc, including Yugoslavia, abstained in the final vote on the Declaration, they did not sabotage it altogether as they would have done before long. The Chinese seat on the drafting committee was held by a scholarly Confucian named Chang. Two years later, the Chinese delegate might have been a nominee of that great friend of human rights, Mao Zedong.

Likewise, decolonization was underway but as yet the hegemony of Western rights discourse had not come under challenge. With India and Pakistan already independent and the Dutch and the French starting to quit their Asian colonies, the waning imperial powers had to concede that the Declaration applied to their existing colonies. At the same time, the newly independent nations, most of whose leaders had received a Western education, did not yet feel impelled to insist upon the radical distinctiveness of their moral traditions. The descent of so many of these newly independent states into dictatorship or civil war had not yet occurred. It was still possible to believe that winning independence and freedom as a state would be enough to guarantee the freedoms of the individuals inside it. The emergence of the Asian Tiger economies and the rebirth of radical Islam were still decades away. The great philosophical conflict between “the West and the Rest,” which has called into question the universality of human rights, still lay in the future.

The other factor which made agreement possible in 1948 was that the West itself was still one. The Declaration belongs to the brief postwar moment when the drafters still shared a progressive cast of mind. Eleanor Roosevelt incarnated the New Deal. John Humphrey, the Canadian law professor who wrote the first draft of the Declaration, had links to his country’s socialist party, the CCF. The Chileans and Brazilians were strongly influenced by Latin American socialism. The French rights tradition of 1789 was represented by René Cassin, who had been General de Gaulle’s lawyer in wartime London. The progressive discourse of the victors of World War II provided the intellectual armature for the drafting committee.

Only five years later, the entire scene had changed. Progressive politics were on the defensive. The Soviet Union had tested a hydrogen bomb. Officials in Czechoslovakia had been killed on orders from Moscow. China had fallen to the Communists; McCarthy was persecuting the liberal internationalists of the previous era; Republican Senator John Bricker fulminated against UN human rights documents as “completely foreign to American law and tradition.” One of John Foster Dulles’s first acts as incoming secretary of state was to pull Mrs. Roosevelt off the human rights committee at the UN, proclaiming that the United States “would not become a party to any human rights treaty approved by the United Nations.” America effectively abandoned the effort to turn the Declaration into a binding covenant. Successive secretaries of state, from Dulles to Kissinger, regarded human rights as a tedious obstacle to the pursuit of great power politics.

From 1948 until the Helsinki Final Act in 1975, there were two human rights cultures in the world—socialist and capitalist—one giving primacy to social and economic rights, the other putting civil and political rights first. Sterile polemics between these two made a genuinely global human rights culture impossible.

The moment of opportunity in 1948 was, in retrospect, brief indeed. So brief that one might well ask how a global human rights movement managed to emerge. In his well-documented and thorough book, William Korey argues that the global spread of human rights owes much more to nongovernmental organizations, like Amnesty International and Human Rights Watch, than it does to the UN itself or to governments. Even before the Declaration was promulgated, the UN Commission on Human Rights ruled that it “had no power to take any action in regard to any complaints concerning human rights.” This capitulation by member states to the principle of state sovereignty did not stop an earlier group of NGOs such as the Anti-Slavery Society, B’nai B’rith, and the French Fédération pour les droits de l’homme from bringing individual rights cases before UN bodies. Even though these bodies could do nothing, member states were shamed by the publicity. But it was not until the late 1960s that the UN system began to authorize human rights reports critical of specific countries, among them South Africa, Haiti, and Greece under military dictatorship.

William Korey’s valuable study highlights the important role played by a new generation of nongovernmental organizations like Amnesty International, founded in 1961, in forcing the UN system to begin questioning the principle that human rights violations were an internal affair of member states. At first the targets were relatively easy—pariah states like South Africa; the harder targets, like the Soviet Union, remained untouchable until the 1980s. Once again, it was pressure from below, especially American Jewish groups demanding rights of free emigration for Soviet Jews, that impelled politicians to act—for example, by supporting the Jackson-Vanik Amendment of 1974—and gradually forced human rights onto the agenda of American-Soviet summit meetings.

Korey’s work on the pressure exerted by nongovernmental organizations is useful and thorough, but it ignores wider questions, especially that of the link between the global diffusion of human rights and economic globalization. As the global market economy pulverized traditional societies and moralities and drew every corner of the planet into a single economic machine, human rights emerged as the secular creed that the new global middle class needed in order to justify their domination of the new cosmopolitan order.

This is the case put by Kenneth Anderson, once a Human Rights Watch activist and now an apparently disillusioned law professor at American University. “Given the class interest of the internationalist class carrying out this agenda,” he writes, “the claim to universalism is a sham. The universalism is mere globalism and a globalism, moreover, whose key terms are established by capital.”4 This seems wrongheaded to me. The NGO activists who devote their lives to challenging the employment practices of global giants like Nike and Shell would be astonished to discover that they were serving the interests of global capital all along. Anderson conflates globalism and internationalism and mixes up two classes, the free market globalists and the human rights internationalists, whose interests and values are in conflict.

It isn’t necessary to share Anderson’s perspective to accept that the emergence of the global market has assisted the diffusion of human rights, since markets break down traditional social structures and encourage the emergence of assertive temperaments. But while markets do create individuals, as buyers and sellers of goods and labor, these individuals often want human rights precisely to protect them from the indignities and indecencies of the market. Moreover, the dignity such a person is seeking to protect is not necessarily derived from Western models. The women in Kabul who come to Western human rights agencies seeking their protection from the Taliban militias do not want to cease being Muslim wives and mothers; they want to combine respect for their traditions with certain “universal” prerogatives, like the right to an education or professional health care provided by a woman. They hope the agencies will defend them against being beaten and persecuted for claiming such rights.

Anderson writes as if human rights are always imposed from the top down by an international elite bent on “saving the world.” He ignores the extent to which the demand for human rights is issuing from the bottom up. In Pakistan, it is local human rights groups, not international agencies, who are leading the fight to defend poor country women from being burned alive when they disobey their husbands; it is local Islamic women who are criticizing the grotesque way in which Islam is being distorted to provide justification for such gross physical abuse.5 Human rights has gone global, but it has also gone local.

Non-Western critics of human rights language reproach it for individualism, for emphasizing individual entitlements at the expense of social duties, but this could be precisely what renders it so attractive, for example, to women trapped in societies in which oppression by men is sustained by custom, law, and religion. It is simply not the case, as Islamic and Asian critics contend, that human rights force the Western way of life upon their societies. For all their individualism, human rights do not require adherents to jettison their other cultural attachments. Jack Donnelly, one of the most respected human rights philosophers, argues that “a human rights approach assumes that people probably are best suited, and in any case are entitled, to choose the good life for themselves.”6 What the Declaration does mandate is the right to choose, and specifically the right to leave when choice is denied. The global diffusion of rights language would never have occurred had these not been authentically attractive propositions to millions of people, especially women, in theocratic, traditional, or patriarchal societies.

The same role of bottom-up pressure was at work in the rights revolution which swept through Eastern Europe in the 1970s and 1980s. By the early 1970s the foreign ministries of Europe had made their peace with the division of the continent. Indeed, the Helsinki Final Act of 1975 was explicitly designed to give a Western seal of approval to the Soviet sphere of interest. As a quid pro quo, Western governments pressed for a human rights “basket” in the final agreement. The content of that basket, as William Bundy explains in his book on Kissinger’s diplomacy, did not come from foreign ministries, “but in large part from private organizations in ‘civil society,’ with roots and international ties already developing on their own,” such organizations as the National Conference on Soviet Jewry, Freedom House, and US Helsinki Watch.7 A Soviet leadership desperate to secure Western acquiescence in the Yalta settlement conceded the right of Eastern Europeans to have human rights organizations, without realizing that this opened the door for Yuri Orlov’s Moscow Helsinki Watch Group, Polish Solidarity, the Czech Charter 77, and the other rights movements which eventually brought the Soviet system crashing down.8 The Helsinki story suggests that the bottom-up demand for human rights has had a political impact which neither Western governments nor Anderson’s global class has been able to control.

Looking back now, we can see that Helsinki also represented the capitulation of the socialist version of rights to the universalizing ambitions of its Western competitor. After Helsinki, there were no longer two rights cultures in the world, but one. Yet as human rights has passed from an insurgent creed to an official ideology, it has lost some of its moral power. Democratic leaders pretend to “advance” the human rights agenda; and the world’s many tyrants pretend to listen. As President Clinton has found to his cost, encouraging human rights activism in China on his last visit seems only to have resulted in the largest crackdown on dissent since Tiananmen Square in 1989.

At fifty, human rights is in what Morsink calls a “mid-life crisis.” The NGOs make up a large, amorphous movement, but many of its components are middle-aged and office-bound; their energies are dissipated in interagency competition for money and publicity. The coinage of public shame—the essential resource of the NGO movement—has been debased.

The coinage has also been inflated with demands that the West acknowledge a right to development which would mandate the transfer of resources from rich countries to poor ones. Debt relief is a good cause, and so are campaigns to increase the ridiculously low figures which rich countries devote to aid and development in poor ones. But good causes are not made better by confusing needs with rights. Rights inflation reduces the real value of rights language.

Human rights treaties, agencies, and instruments multiply and yet the volume and scale of human rights abuses keep pace. In part, this is a problem of success—abuses are now more visible—but it is also a sign of failure. No era has ever been so conscious of the gap between what it practices and what it preaches. Cambodia, Sudan, Bosnia, Chechnya, and now Kosovo show that the diffusion of a global human rights consciousness has not managed to halt the spread of what former UN Secretary-General Boutros Boutros-Ghali once called “the culture of death.”9

2.

The “mid-life crisis” of human rights is not just about the discrepancy between what states say and do. There is also a philosophical crisis: a sense that the silences in the Universal Declaration need to be confronted. The secularism of its premises is ever more open to doubt in a world of resurgent religious conviction.

Though the challenge to human rights from radical Islam and proponents of Asian values has attracted most of the attention, increasingly we hear challenges from within the Western tradition itself. The rights language of America’s Founding Fathers was religious, and it is from American philosophical thinkers that the challenge to the secular premises of the Universal Declaration has been most direct. Michael Perry, a legal philosopher at Wake Forest University, believes that the idea of human rights is “ineliminably religious.”10 Unless you think, he says, that each human being is sacred, there seems no persuasive reason to believe that their dignity should be protected with rights. Max Stackhouse, a Princeton theologian concerned with public affairs, argues that the idea of human rights has to be grounded in the idea of God, or at least the idea of “transcendent moral laws.” Human rights needs a theology in order to explain, in the first place, why human beings have “the right to have rights.”11

What seems to be bothering these thinkers is the suspicion that human rights are just another form of arrogant make-believe, putting Man on a pedestal when he should be down in the mud where he deserves to be. If human rights exist to define and uphold limits to the abuse of human beings, then their underlying philosophy, religiously inclined thinkers imply, ought to define man as a beast in need of restraint. Instead human rights make Man the measure of all things, and from a religious point of view this is a form of idolatry.

Yet it is not clear why human rights need the idea of the sacred at all. Why do we need an idea of God in order to believe that human beings should not be beaten, tortured, coerced, indoctrinated, or in any way sacrificed against their will? These intuitions derive from our own experience of pain and our capacity to imagine the pain of others. Believing that men are sacred does not necessarily strengthen these injunctions. The reverse is often true: acts of torture or persecution are frequently justified as serving some sacred purpose. Indeed the strength of a purely secular ethics is its insistence that there are no sacred purposes which can ever justify the inhuman use of human beings.

A secular defense of human rights depends on the idea of moral reciprocity: that we cannot conceive of any circumstances in which we or anyone we know would wish to be abused in mind or body. That we are capable of this thought experiment—i.e., that we possess the faculty of imagining the pain and degradation done to other human beings as if it were our own—is simply a fact about us as a species. Being capable of such empathy, we all possess a conscience, and because we do, we wish to be free to make up our own minds and express our own justifications for our views. The fact that there are many humans who remain indifferent to the pain of others does not imply they cannot imagine it or prove that they do not possess a conscience, only that this conscience is free to do both good and evil. Such natural facts about human beings provide the grounds for an entitlement to protection from physical and mental abuse and to the right to freedom of thought and speech.

While such a conception only provides the basis for a core of civil and political rights, the Nobel laureate Amartya Sen argues, such rights, if guaranteed, empower human beings to defend a wider range of entitlements. The right to freedom of speech is not, as Brecht and the Marxist tradition maintained, a lapidary bourgeois luxury, but may be the precondition for having any other rights at all, not to mention the very capacity to survive. “No substantial famine has ever occurred,” Sen observes, “in any country with a democratic form of government and a relatively free press.” The Great Leap Forward in China, in which between twenty-three and thirty million people perished as a result of irrational government policies implacably pursued in the face of their obvious failure, would never have been allowed to take place in a country with the self-correcting mechanisms of a free press and political opposition.12 So much for the argument so often heard in Asia that a people’s “right to development,” to economic progress, should come before their right to free speech and democratic government.

Such a secular defense of human rights—based on practical historical experience and a minimalist anthropology—will necessarily leave religious thinkers unsatisfied. For them secular humanism is the contingent product of late European civilization and is unlikely to command assent in non-European and nonsecular cultures. Accordingly, in this fiftieth anniversary year, a lot of effort has been expended in proving that the moral foundations of the Universal Declaration are derived from the tenets of all the world’s major religions. The Universal Declaration is then reinterpreted as the summing up of the accumulated moral wisdom of the ages. Paul Gordon Lauren begins his history of the idea of human rights with an inventory of the world’s religions, concluding with the claim that “the moral worth of each person is a belief that no single civilization, or people, or nation, or geographical area, or even century can claim uniquely as its own.”

This religious syncretism is innocuous as historical or inspirational rhetoric. But as Lauren himself concedes, only Western culture turned widely shared propositions about human dignity and equality into a working doctrine of rights. This doctrine didn’t originate in Jeddah or Beijing, but in Amsterdam, Siena, and London, wherever Europeans sought to defend the liberties and privileges of their cities and estates against the nobility and the emerging national state.

To point out the European origins of rights is not to endorse Western cultural imperialism. Historical priority doesn’t confer moral superiority. As Jack Donnelly points out, the Declaration’s historical function was not to universalize European values but to put certain of them—racism, sexism, and anti-Semitism, for example—under eternal ban. Non-Western foes of human rights take proclamations of “universality” as an example of Western arrogance and insensitivity. But universality properly means consistency: the West is obliged to practice what it preaches. This puts the West, no less than the rest of the world, on permanent trial. Genuinely “universal” human rights regimes might well arraign the death penalty statutes enacted by thirty-eight American states no less than the sharia law which prescribes death by stoning for adultery.

While the moral dispute between “the West” and “the Rest” occupies the most attention, the really interesting new development is how the moral unanimity of the West itself is beginning to fracture. American rights discourse once belonged to the common European natural law tradition, but this sense of a common anchorage now competes with a growing sense of moral and legal exceptionalism. Such exceptionalism can express itself as rights narcissism, a conviction that no international legal statute has anything to teach the land of Jefferson and Lincoln. This narcissism is accentuated by the heady experience of being the world’s most successful nation and the only superpower, quick to use the language of human rights in criticizing some countries, while ignoring human rights abuses in other countries where it sees its other interests at stake.

A further factor is the strong impact of evangelical religion on American politics. Where American policy now leads in the human rights field is in demanding religious freedom for Christian minorities in places like southern Sudan and China. Such demands are certainly justified in view of the very real persecution of Christians and other religious believers in both countries, a persecution described in Nina Shea’s In the Lion’s Den. But no other Western society lets its human rights policy be so strongly driven by its own religious minorities. The risk of such a moral position is that it may tend to limit its concerns to the fate of fellow believers.

American human rights policy, therefore, is increasingly distinctive and increasingly paradoxical: a nation with a great rights tradition that leads the world in denouncing human rights violations but which behaves like a rogue state in relation to international legal conventions. America was one of the last to ratify the Genocide Convention and, along with Somalia, one of only two countries still not to have ratified the Convention on the Rights of the Child. It is the only advanced Western country that maintains the death penalty, and the only country besides Libya, Saudi Arabia, Iran, and China that still executes adolescents, the mentally retarded, and the mentally ill. American indifference to international legal norms infuriates its allies. Both Canada and Paraguay have recently protested the refusal of American states to allow foreign nationals on death row access to embassy or consular representation as provided by the Vienna Convention. In the case of a Canadian awaiting execution for murder in Texas, it was claimed that such access might have enabled the defendant to establish an alibi.

Both Human Rights Watch and Amnesty have shown the extent to which international human rights monitors are denied access to American prisons and places of detention and how the US government ignores international reports on American rights violations at home, particularly in its prisons and in the brutality of local police, while championing universality of human rights norms abroad.

America has also led the opposition to establishing a permanent international criminal court to try crimes against humanity. At the UN conference in Rome which voted in July 1998 to create the court, the United States, along with Iran, Iraq, China, Libya, Algeria, and Sudan, voted against what might just prove the most important new human rights institution in the next century. American officials maintain that defendants such as William Calley could not hope to get a fair trial in an international tribunal. The problem, of course, is that this dooms the tribunal, since its effectiveness depends on universal jurisdiction. Even when the US demanded, and got, something close to immunity from prosecution of its citizens in return for support of the treaty, it voted against it.13

Opposition to the permanent international criminal court is not confined to isolationist American senators like Jesse Helms. It is also opposed by people who think of themselves as committed internationalists, like the writer David Rieff, who called for vigorous American intervention to stop the Bosnian war. In a recent article, he charges, with some justice, that the concessions made to the Americans have gutted the international criminal court.14 Rogue states that do not ratify the treaty, whether Iraq or the United States, will not accept the court’s jurisdiction or surrender their war criminals to it. More generally, Rieff objects to the very idea that legal recourse to courts and tribunals is an adequate response to the sheer horror of the human rights violations in Cambodia, Rwanda, Bosnia, and Sudan. In his view the whole premise of building an international legal order based on universal human rights norms and backed up by tribunals is flawed, since such norms have no impact whatever in dissuading dictators and ethnic gunmen from using terror to achieve their ends.

The only reliable dissuasion, according to Rieff, is force or the threat of force by the United States and its allies. More treaties, more tribunals, more human rights consciousness, more UN organizations mean little or nothing in the absence of a superpower clearly determined to stop ethnic cleansing, genocide, or territorial aggression. Indeed, Rieff argues, those who campaign for the court seem to believe that judicial dissuasion can substitute for the effective use of military force in stopping humanitarian abuses. “It is,” he writes, “the court which is the counsel of despair. Its real rationale derives from the hope that, somehow, the law can rescue us from situations from which politics and statecraft have failed to deliver us.”

Rieff is not just questioning the international court; he is casting into doubt the relevance of human rights norms and instruments in keeping barbarism at bay in the contemporary world. But his reasoning seems to me flawed. Even if we accept that human rights norms do not deter, we need not conclude that they are useless. We continue to believe in the rule of law within nation-states even when our domestic civil and criminal laws fail to deter. Between Vengeance and Forgiveness, Martha Minow’s nuanced, subtle, and well-written review of the work of international tribunals, from Nuremberg to Arusha, shows that law honored in the breach rather than in the observance is still worth having. Her conclusions are properly cautious: “I do not think it wise to claim that international and domestic prosecutions for war crimes and other horrors themselves create an international moral and legal order, prevent genocides, or forge the political transformation of previously oppressive regimes.”

Thus far she agrees with Rieff, and so would any experienced observer. But she goes on to defend international tribunals, despite such limitations. They are valuable, she argues, because when they punish criminals, they also affirm, condemn, purge, and purify. They also establish concrete truths, which make it more difficult for future regimes to falsify the historical record. The successful prosecutions of senior Rwandan officials for genocide at the Arusha Tribunal and the convictions of war criminals from all sides of the Bosnian conflict at The Hague did in fact break the “cycle of impunity,” at least for these particular barbarians.

In spite of what Rieff claims, no one who supports an international tribunal believes that it can be an effective substitute for political intervention. By itself, it can only try individuals, but over the longer term, successful prosecutions might alter the balance of customary international law against nonintervention in the internal affairs of states. The long-term historical significance of the rights revolution of the last fifty years is that it has begun to erode the sanctity of state sovereignty and to justify effective political and military intervention. Would there have been American intervention in Bosnia without nearly fifty years of accumulated international opinion to the effect that there are crimes against humanity and violations of human rights which must be punished wherever they arise? Would there be a safe haven for the Kurds in northern Iraq? Would we be in Kosovo?

Rieff is right to be skeptical about the internationalist rhetoric that talks about an “international community” and a “global conscience” based on human rights. Fifty years after the Universal Declaration, state sovereignty remains the main pillar of the international system. It also remains the case that human rights are best protected not by international treaty but by the constitutions of democratic states. International human rights monitoring, in states which have collapsed or in states with authoritarian governments, is a poor substitute for the human rights protection which comes when the people themselves can elect a government they trust. But poor as it may be, that substitute may be the only remedy available. Until legitimate authority can be consolidated in authoritarian or war-torn states, ordinary people will continue to depend for their lives and their liberties on such international protection as NGOs, the UN system, and the global human rights movement can provide.

Rieff’s disillusion with the UN and with human rights activism makes him nostalgic for a Westphalian order of unlimited state sovereignty controlled by American power. In such an order, if there are crimes against humanity to punish, it would be up to American cruise missiles, and, very occasionally, American Marines, to do so. This may be a comfortable thought for Americans, but it leaves even America’s friends uneasy. The challenge ahead is how to define a right of intervention in the affairs of another state which is not so broad as to license American imperialism and not so narrow as to require us to become bystanders to acts of horror.

Aryeh Neier, for many years one of the most respected American human rights activists, makes a strong case for a permanent tribunal in his wide-ranging and authoritative survey of the international response to war crimes. He is as aware as Rieff is of the weaknesses of the court, especially the power of the UN Security Council to bar it from conducting investigations and pursuing indictments. But if it were successful in establishing its prosecutorial independence of the great powers and bringing prosecutions, it would contribute to the emergence of an international system which denies safe havens to the future Pinochets and Pol Pots of this world.

Because of this curious alliance between right-wing isolationists like Senator Helms and disillusioned activists like Rieff, America finds itself standing alone in an emerging international order based on universal human rights norms and international tribunals. This puts American human rights activists, like Aryeh Neier and Kenneth Roth, as much at odds with their own society as they are with rights violators abroad. At home, Human Rights Watch has joined with Amnesty International and the ACLU in opposing regionally inconsistent enforcement of the death penalty—enforcement that disproportionately affects racial minorities—knowing only too well that, for purposes of this argument at least, they are joining in an essentially European consensus against the moral intuitions of their own democracy. Of all the ironies in the history of human rights since the Declaration, the one which would most astonish Mrs. Roosevelt is the extent to which her own country is now the odd one out.

In the next fifty years, we can expect to see the moral consensus which sustained the Universal Declaration in 1948 splintering still further. For all the rhetoric about common values, the distance between America and Europe on rights questions is growing, just as the distance between the West and the Rest is bound to grow, too. This does not mean the end of the human rights movement, but its belated coming of age, its recognition that we live in a plural world of cultures which have a right to equal consideration in the argument about what we can and cannot, should and should not do to other human beings.

In this argument, the ground we share may actually be quite limited: not much more than the basic intuition that what is pain and humiliation for you is bound to be pain and humiliation for me. But this is already something. In such a future, shared among equals, rights are not the universal credo of a global society, not a secular religion, but something much more limited and yet just as valuable: the common ground on which our arguments can begin. The chief argument is over which means we choose to pursue our agreed ends. The weakness of human rights as a language is that it moralizes political ends while hobbling us in our choice of means. There are times, and Kosovo is one, when we need to be as ruthless and determined in our choice of means as we have been high-minded in our choice of ends.
