When nations find themselves in trouble, their difficulties have usually been a long time in the making. In the case of the terrorism that now afflicts the nations of the West, there is a long intellectual history behind it -- one which is rather unflattering to those who see themselves as the main victims of terrorism. The intellectual roots of terrorism lie in three philosophical ideas which, ironically, are peculiarly Western: popular sovereignty, self-determination and ethical consequentialism. The diffusion of political responsibility that results from popular sovereignty, the belief that every group has a right to its own state, and the decline in the belief in absolute human rights have together fostered a hospitable intellectual climate for terrorism. Even opponents of terrorism may feel a certain moral ambivalence when faced with acts of terror.
One reason academics, journalists and politicians have had difficulty in responding to terrorism is that it is hard to define terrorism in such a way that it refers only to one’s opponents’ activities and not also to one’s own. As a result, condemnations of terrorism are often seen by neutral observers as hypocritical. This does not mean that moral denunciations of terrorism are inappropriate; on the contrary, they are mandatory. Terrorist acts are profoundly immoral. In addition, they are not as politically effective as their practitioners claim. One has only to look at the areas of the world where terror has held sway to see that the violence there is typically prolonged by terrorism, sometimes indefinitely, as the opposing sides come to perceive each other as "criminal" and thus as beyond the pale of civilized negotiation.
But while it is correct for the Reagan administration, for example, to condemn terrorism as a means of effecting political and social change, such a denunciation makes sense only in the context of a moral stance that (1) rigidly distinguishes between combatants and noncombatants and (2) rigidly adheres to the principle that innocent people have an absolute right not to be murdered for any reason whatever. Both of these tenets have been steadily eroding since 1940, in the West as much as elsewhere. Despite repeated commitments to a plethora of declarations of human rights, few if any governments are scrupulous in their military policies regarding such rights. In what follows, I shall try to show how we got ourselves into this predicament.
Popular sovereignty. The doctrine of popular sovereignty developed as the profoundly moral idea that human beings are born free and equal and, as such, have a right to an equal share of political power. The slogan "one man, one vote" perfectly expresses the idea that democracy is the fairest of all political systems because it correctly reflects the natural human condition of freedom and equality. However, it has long been observed that popular sovereignty tends to diffuse responsibility for political acts, particularly acts of war. Everything from conscription to the saturation bombing of cities can find a rationale in popular sovereignty. If the people are the state, then is it not their responsibility both to defend it and to bear the burden of attacks upon it? This question has never been satisfactorily answered.
Despite efforts in international law to distinguish between degrees of culpability with regard to politicians, generals and ordinary citizens, policies of direct attacks upon civilians continue to find a rationale in the identification of the citizen with the state -- even if the ordinary citizen is both ignorant of and indifferent to affairs of state. Thus, the principle of popular sovereignty has provided modern states with the moral leverage to nationalize the lives of their citizens in a way that puts them at risk. Terrorists of all stripes use this principle for their own purposes, and they capitalize on the moral ambivalence reflected in the remark: "One man’s terrorist is another man’s freedom fighter."
Self-determination. Self-determination is one of those 19th-century liberal ideas that have worked their way into the primary documents of 20th-century international law, including the United Nations Charter. The principle claims that "a people" has the right to determine its destiny and the disposition of the land upon which it lives without the intervention of outside parties. The principle of self-determination came to the fore after 1945 as a rubric for decolonization. Yet the same principle cuts both ways: where a group’s claim to peoplehood or statehood is denied, self-determination lends an apparent legitimacy to violence undertaken in the group’s name.
Ethical consequentialism. The moral tradition that shaped the West is an amalgam of classical and Christian sources. This ethical confluence has been possible despite considerable differences between the two sources because both agree that the good life involves strict adherence to categorical moral principles.
Both Plato and Aristotle insisted that injustice was not permitted as a means of producing good consequences. In the Republic, Plato makes this point in many diverse and intellectually subtle ways. He argued (as did Aristotle) that there are certain basic human values which are simply worth having for their own sake, and that the ultimate consequence of immoral behavior is self-destruction. Plato, in one of the most powerful passages in Western philosophy, describes the decline of the unjust man into the tyrant, the most unhappy of all men.
The main thrust of these classical arguments, then, is that the man of good character is also the only truly happy man. Maintaining such a character will involve avoiding injustice and, in particular, the pitfall of thinking one can do evil in order that good may come of it. Plato understood that such a life is difficult to achieve, and he was extremely pessimistic about the possibility of the masses ever becoming just. The best they could hope for would be to live in a society governed by a just ruler. Nevertheless, he insisted that there are objectively discernible goods, the participation in which constitutes the good life, and that such a life is irretrievably damaged by acts of injustice, even if undertaken for the "best" of reasons.
Plato and Aristotle initiated what was later to be called the natural-law tradition. Central to natural-law thinking is the Platonic insight that it is possible to define objectively what it means to be good at being a person. Just as there are standards of excellence for being a doctor and a teacher, so there are knowable standards of excellence for being human. The good society is one in which people are allowed to conform to these standards.
The Judeo-Christian idea of a transcendent source of all value is consonant with these classical insights. The commandments that govern the life of the Jew and the Christian are strictly categorical in nature, as indeed are most ethical codes based on theistic sources. Friendship with God is closely linked to walking the path of justice; it is understood that to damage any basic human value is to attack the very source of value and being. What Plato understood to be the consequence of injustice -- self-destruction -- the Judeo-Christian tradition understands as the cutting off of oneself from the very source of being.
The absolutist conception of justice was reflected in the medieval theory of the just war. The notion that in war noncombatants must never be made the object of direct attacks is but one instance of the application of the categorical prohibition of murder to the realm of war. As provisions of the just-war theory passed into the developing corpus of international law in the 17th century, they retained their categorical or absolutist character. And, needless to say, the Christian churches continued to promulgate a similar view of justice.
With Machiavelli, this absolutist conception began to erode, for he held that the preservation of the political order outweighs any other known good. He does not make it entirely clear why, but we may understand his thinking as a response to the rise of the modern, centralized state. In a world of absolute sovereign states, no structure exists to which appeal can be made over the heads of the princes. The state, therefore, becomes the only hope for the survival of any conception of the good life. A transitional figure, Machiavelli reflected the tension between the old and the new ways of thinking about justice. On the one hand, he recognized the good in the traditional sense -- that there are certain qualities of character that are worth having for their own sake, and goods that are self-evident in the sense that no argument or further justification is necessary for them. On the other hand, he believed that necessities of state require the sacrifice of some of these principles (in particular, the prohibition against murder) for a greater good.
In Machiavelli’s account of the prince, we begin to see the outline of a certain type of modern human who rejects the classical warning that acting against the good will irretrievably damage one’s own character, eventually causing one to lose a knowledge of the good altogether. The prince, according to Machiavelli, is a technician in statecraft and, to that extent, beyond good and evil in the conventional sense. Furthermore, the prince rejects the Christian notion of divine providence. The prince must make his own future, even when this involves doing evil; the prince must play God in order to secure the desired outcome. All of this, of course, is "tragically necessary."
Machiavelli’s thought was brought to completion in the 19th century by philosophers such as Jeremy Bentham and John Stuart Mill, whose work faced up to the pure consequentialism of much modern politics. In its mature 19th-century formulations, consequentialism was a theory devised, in part, to deal with the perceived disappearance of generally agreed-upon moral standards. The skepticism brought on in some quarters by the rise of empiricism, Darwinism and various forms of atheism led to the search for some standard that would unite radically heterogeneous values. Mill and others fixed upon certain subjective ends, styled variously as "happiness" or "pleasure." As the aforementioned belief in divine providence continued to decline, the terrible burden of completely securing the future seemed to fall entirely upon human shoulders. In principle, no possible course of action could be ruled out as wrong or impermissible in itself, and no sacrifice of known goods could be regarded as too great if it would secure greater happiness in the future. Thus, in the search for a means of maximizing the good, moral rules lost their categorical force.
Given the pervasiveness of this moral theory and its impact upon the common person, it is no accident that our own century is replete with political movements that require or threaten the destruction of known values in order to create a future of unlimited happiness. The belief in the mutability of moral obligations is one of the main arguments for terrorism. If there are no absolute human rights, the innocent are in danger. "Calculations" about whether or not to kill an innocent person become no more than arguments of advocacy based on hypothetical scenarios of the future. But can we reasonably be expected to decide whether other people live or die by projecting their life prospects over an indeterminate future?
Terrorists the world over have appropriated concepts and military strategies (consider the nuclear bombing of Hiroshima and the fire bombing of Dresden) that originated in the West. This fact should not, however, in any way debilitate us in our fight against terrorism. No government, no matter what its own past transgressions, should fail to protect its own citizens. If anything positive can be said about this grim and ironic situation, it is that as victims of terrorism we may be forced to rethink our own policies on the use of force (including nuclear force) in order to bring them into line with our moral denunciations of terrorism.
Author:
Dr. Phillips is director of the Program for War and Ethics at the University of Connecticut at Hartford.
Source:
http://www.religion-online.org/showarticle.asp?title=1032