Journal of Social Issues, Fall 1995, v51, n3, p55(11)

When is 'obedience' obedience? Conceptual and historical commentary.

Neil Lutsky

Author's Abstract: COPYRIGHT Society for the Psychological Study of Social Issues 1995

This article reassesses the role of obedience to authority in the Milgram experiment and in the Holocaust. I argue that the term "obedience" can be used both to describe and to explain the behavior of subjects in Milgram's experiment, and that a failure to distinguish the two uses conceptually has led to an inflated sense of the extent to which Milgram's experiment demonstrates underlying obedience to authority. The article also reviews empirical evidence and alternative analyses to show that when authority influences behavior in the experiment, it may do so for reasons other than the subjects' felt obligation to obey. Finally, I suggest that contemporary history presents a more complex and problematic view of the Holocaust than that implied by social psychology's application of obedience to authority.

Full Text: COPYRIGHT Society for the Psychological Study of Social Issues 1995

Those who are far away cannot imagine our bitter situation. They will not understand and will not believe that day after day thousands of men, women, and children, innocent of any crime, were taken to their death. . . . Why did this happen? And why is the whole world deaf to our screams? (Abraham Lewin, A Cup of Tears: A Diary of the Warsaw Ghetto, 1988, p. 53)

Introduction

These questions, posed by Abraham Lewin in his diary of life and death in the Warsaw Ghetto (posthumously published in 1988), haunt a social psychology of our times. To its enduring credit, the work of Stanley Milgram on what we commonly call obedience to authority stands out as one of social psychology's signal attempts to address Lewin's pleas, as recent scholarship on the Holocaust demonstrates (see, e.g., Bauman, 1989; Blass, 1993; Browning, 1992; Katz, 1993).

Social scientists often take Milgram's research to suggest how strongly individuals adhere to a felt duty to obey, so much so that normal people can commit monstrously inhumane acts at the direction of authority. For example, Holocaust historian Christopher Browning (1992) cites Milgram's conception of obedience as "deference" in his attempt to explain the murders of Jews in Polish villages by the "ordinary men" of a German Order Police unit. And sociologist Zygmunt Bauman argues that Milgram demonstrated that "cruelty . . . correlates very strongly indeed with the relationship of authority and subordination" (1989, p. 153). Milgram himself attributed "the extreme willingness of adults to go to almost any lengths on the command of an authority" (1974, p. 5) to their "sense of obligation" (1974, p. 6).

In what follows, I want to suggest that this emphasis on obligatory obedience in the Milgram experiment has led us astray in understanding behavior in the experiment, obedience outside the laboratory, and destructive behavior in the Holocaust. Specifically, I will advance the following four related claims:

1. "Obedience" as a description must be carefully distinguished from obedience as an explanation. Although the behavior of subjects in the Milgram experiment can be described as obedient in the specific sense that subjects ultimately tended to act in a manner consistent with the experimenter's instructions, that does not mean that obedience to authority explains why subjects acted as they did.

2. Behavior described as obedient may result from various influences of an authority.
These include respect for the authority's presumed expertise, awareness of the authority's potential exercise of reward and coercive power, or felt obligation to obey a legitimate authority. Milgram emphasizes the last of these, but empirical evidence challenges his account. I argue that obedience derives from obligation only under highly unusual circumstances, although obedience is often justified after the fact in terms of obligation.

3. Behavior described as obedient may also result from common social influences that are not specific to the presence of authority. I highlight two examples in the Milgram experiment: (a) the impact of subjects' inferences from the behavior of others, and (b) the influence of the constraints and vagaries of social interaction with the experimenter.

4. The model of obligation-based obedience may convey a misleading implicit history and explicit psychology of the Holocaust. Although obligatory obedience to commands from above played an undeniable role in the Nazi regime, historical scholarship also documents the extent to which the direction and implementation of the Holocaust were forged from below.

Although space limitations make it impossible to treat these points as thoroughly as they deserve, I hope what follows calls into question prevalent assumptions about the nature and role of obedience to authority in the laboratory and in life.

Obedience as a Description Distinguished from Obedience as an Explanation

What does it mean to label studies in the tradition of Milgram's as "obedience" studies? This is a fundamental and persistent question in the literature on the Milgram experiments (e.g., Milgram, 1965a; Miller, 1986), one that echoes through discussions in the present issue (e.g., Hamilton & Sanders; Miller; Miller, Collins, & Brief). Most often, treatments of this labeling question cast obedience as a form of social influence and contrast it with alternatives such as conformity or compliance. However, I want to draw attention first to a more basic conceptual contrast: between the use of "obedience" as a description of behavior in the Milgram experiment and its use as an explanation of that behavior.

When an individual acts in a manner that is consistent with a properly authorized command, instruction, or rule, we commonly describe that behavior as obedient to authority. Milgram justified his use of the label similarly when he argued that obedience occurs when a person in a subordinate position in a hierarchical social relationship "does what another person tells him to do" (1965a, p. 58). In the context of the Milgram experiment, a subject's obedient behavior, operationalized as the number of serially increasing shocks that subject administers, represents an outcome consistent with the study's specified procedures (in most conditions) and the experimenter's ongoing instructions and prods. Most subjects in Milgram's experiments and in basic replications of those experiments (reviewed by Miller, 1986) were obedient in this descriptive sense.

What we call the influences responsible for why people come to act in a manner that matches the specifications of an authority's order or instruction represents a different kind of labeling problem from the descriptive one just considered. That labeling requires a close analysis of whether individuals encountering authority act as they do because of the authority's influence and, if so, whether that influence reflects the distinctly authoritative features characteristic of obedience.
In 1965, Milgram differentiated such descriptive and explanatory senses when he wrote, "to obey and to disobey, as used here, refer to the subject's overt action only, and carry no implication for the motive or experiential states accompanying the action" (1965a, p. 58). Nonetheless, the term "obedience" easily lends itself to assumptions about why individuals act as they do in the face of authority.

"Obedience" Explained by Obedience

By 1974, Milgram not only described as obedient the way subjects acted when instructed to shock a protesting victim in his experiment; he also explained that behavior as deriving, fundamentally, from a human disposition to be obedient under appropriate circumstances. To be sure, Milgram continued to highlight other social and situational influences as well (e.g., the sequential nature of the action, interpersonal etiquette), but he often characterized these as binding individuals to the stipulations of their obedient role. For Milgram (1974), behavior is due to obedience to authority when it derives from an obligation to obey that is associated with occupancy of a particular type of role, that of an agent in a legitimate authority-agent relationship (see also Kelman & Hamilton, 1989). This state of perceived obligation, which Milgram termed the "agentic state," theoretically results in psychological attitudes (e.g., a heightened attention to the authority, a focused drive to fulfill instructions competently, and a shift of personal responsibility to the authority) that lead individuals to act as they have been ordered by an authority. According to Kelman and Hamilton (1989) and Milgram (1974), such obligatory obedience takes precedence over personal preference or, in Milgram's terms, normal autonomous functioning.

Although it is unlikely that subjects in Milgram's experiments held a previously instilled obligation to obey research scientists, subjects may have generalized an obligation to obey, learned in more common authority-agent role relationships, to the laboratory setting. There are reasons to hold that such a generalization took place. What Milgram called the "immediate antecedent conditions" of the research situation (e.g., the institutional setting, the demeanor of the experimenter) provided appropriate stimuli for generalization. In addition, descriptive details of deferent behavior during the experiment - subjects' careful attention to the experimenter, extreme politeness, comments about the experimenter's responsibility (see also Sabini & Silver, 1982), and passive disobedience (see also Zimbardo, 1974) - are consistent with Milgram's agency account. Moreover, subjects were willing to administer shocks as a primary function of the presence and behavior of the experimenter (e.g., Milgram, 1965a) but not in response to equivalent actions of an "ordinary man" in quasi-control conditions (Milgram, 1974).

However, individuals may follow authority for reasons other than their belief that they are obligated to do so. They may be respecting the expertise of authority, what Kelman and Hamilton (1989, pp. 128-129) call professional authority. Or obedience may be due to a common source of influence: the authority's potential exercise of the concrete power to punish transgressors and to reward the obedient. Although Milgram's experimenter was an expert of sorts, he lacked concrete coercive power.
This latter fact (but not the former) reinforces Milgram's argument that authority influence was due to subjects' felt obligation to obey.

Milgram's casting of authority has been broadly influential in social psychology, even though his "agentic state" terminology has not been widely adopted. When we invoke the notion of "blind obedience," for example, we are most often implicitly attributing behavior to a sense of role-based duty. Authority itself may promote the idea that people obey out of strong obligation, an ideological claim social psychological accounts may unwittingly reinforce. And individuals accused of wrongdoing may seek protection in assertions that they were just following orders or only doing their jobs - claims often presented as elements of obedience made blind by obligation.

In practice, obligations to obey may generally be quite difficult to instill and maintain, although they may be stronger and made highly routine in prototypic roles (e.g., military ranks). Obligations often require individuals to take unpleasant, difficult, or dangerous actions. As a result, the development of effective authority obligations typically requires long periods of socialization to clearly defined roles, exposure to well-established patterns of behavior, and ongoing displays of institutional legitimacy - all reinforced by the potential exercise of reward and coercive power. Interestingly, these antecedent conditions are lacking in the Milgram experiment.

Does empirical evidence support the assumption that obligatory obedience is centrally responsible for subjects' behavior in the Milgram studies? An important indirect finding challenges Milgram's analysis. Milgram believed agentic subjects would attribute responsibility for their actions to the authority they obeyed. In Milgram's view, "Unable to defy the authority of the experimenter, they [obedient subjects] attribute all responsibility to him" (1974, p. 8). However, Milgram's own responsibility clock data (1974, pp. 203-204) showed that obedient and disobedient subjects attributed the same amount of responsibility to the experimenter. Moreover, Mantell and Panzarella (1976) found no correlation between the amount of responsibility attributed to the experimenter and whether and when subjects ended their participation in a Milgram-like study.

What also remains unclear within the purview of a parsimonious obligation account is why authority influence is so strongly attenuated by peer behavior, why even those subjects thought to be agentic resist authority influence so persistently (Modigliani & Rochat, this issue), and why authority influence depends so strongly on the immediate presence of the experimenter and on other binding factors. Moreover, although Milgram asserted that obedience "is maintained through the simple assertion by authority that it has the right to exercise control over the person" (1974, p. xiii), interactions represented in Milgram's film of his study (Obedience, 1965b) and in transcripts in his book (e.g., 1974, pp. 48, 51) suggest the reverse, namely that when the experimenter explicitly reminds subjects of their agentic obligations ("You have no other choice"), subjects are most likely to terminate their participation in the study.

Milgram also argues that individuals act contrary to their convictions and values in his experiments due to the disengagement of normal psychological functioning specifically brought about by subjects' recognition of their duty to obey.
However, social psychological research on numerous influence phenomena (e.g., attitude-action relationships, helping behavior) demonstrates that people may act contrary to their professed values and individual behavioral tendencies under broad circumstances (e.g., Latane & Darley, 1970; Ross & Nisbett, 1991), none of which necessarily involve authority. Given that this is so, perhaps the influence of authority, when it occurs, can also be explained using more general social psychological vocabularies.

Toward a Broad Social Psychology of "Obedience"

What beyond influences specific to authority may account for obedient behavior in the Milgram experiment and beyond? I find social learning theory (e.g., Mischel, 1973) useful as a framework for organizing answers to this question. That theory points to both past and present factors that may affect how individuals construct and evaluate understandings of the social conditions in which they find themselves, identify and plan possible courses of action, anticipate the consequences of those actions, and assess the difficulties of enacting particular actions. And the theory provides a context within which a wide variety of influences (e.g., Blass, 1991), including many Milgram first identified (e.g., distancing from the victim, embarrassment, the sequential nature of the situation), may make sense of obedient behavior. I will briefly discuss two such influences.

In the Milgram experiment, subjects find themselves in unusual, highly stressful, and quickly evolving circumstances (Gilbert, 1981; Nissani, 1990), and they may look to the behavior of others to help them define and evaluate a deeply ambiguous situation (Darley, this issue; Mixon, 1976). The experimenter is a particularly rich model in this setting. He has the same information as the subject, presumed previous experience in the experimental situation, and training with the procedures and equipment. But the experimenter is not acting as if something has gone horribly wrong this time, and even though he bears substantial responsibility for the fate of the learner, the experimenter is not anxious about the learner's condition. Moreover, the experimenter shows no malevolence. Thus, even though subjects have sufficient reason to believe the learner is being harmed by the procedure, the absence of both concern and malevolence on the part of the experimenter undermines a clear definition and ethical evaluation of the situation.

Nissani (1990) argued that subjects in the Milgram experiment demonstrated a more general human reluctance to overturn a structure of beliefs that have been challenged by strong evidence. Given that subjects faced internally inconsistent evidence, their difficulty acting upon a clear cognitive and ethical reconceptualization of the situation is even more understandable (see also Darley, this issue; Mixon, 1973; Ross & Nisbett, 1991). Equivalent forms of incredulity are well known in social psychological research and echo in Abraham Lewin's lament, presented at the beginning of this article. Apparently, we find it difficult to recognize that what we take for granted could be otherwise and to believe that evil can have a benign face.

A second general social influence on subjects in the Milgram experiment was the norms of social interaction. Subjects were constrained not only by interpersonal etiquette (Milgram, 1974, pp. 149-152) but also by the adaptive interpersonal strategies of the experimenter.
As Modigliani and Rochat (this issue) stress, the interactions that subjects had with the experimenter varied considerably, because what subjects said was not under experimental control and how the experimenter responded conformed only loosely to Milgram's prototypic script (Milgram, 1974, pp. 21-22). When the experimenter was at his interpersonal best, he outwitted a wavering subject by acknowledging the subject's concern and then redirecting the immediate focus of the interaction back to the experimental procedure. Norms then constrained the responding subject to address the experimenter's last statements before possibly bringing attention back to the general state of the learner. Thus, in order to end the experiment, subjects not only had to be willing to reject the experimenter's implicit claims of expertise and decency, they had to have the verbal skills and other resources necessary to overcome the experimenter in decisive but socially acceptable conversation. When left to their own devices, however, most subjects lacked the psychological means and opportunities to realize their goals (see also Ross & Nisbett, 1991). Unable to negotiate their release, subjects were trapped into continuing.

Modigliani and Rochat (this issue) suggest how equivalent interaction dynamics may also account for defiance. If a subject happened to utter a concern that the experimenter ignored or responded to rudely (perhaps due to the dictates of the script), then that subject gained a reciprocal opportunity to violate interaction norms and end his or her participation in the study. This amounts to a random conjunction of events; the subject did not anticipate that the experimenter would respond in this way and did not strive to bring that response about. To Jones's observation about the Milgram experiment that "the degree of compliance could be readily understood once the extremely active role of the experimenter was fully detailed" (1985, p. 80), Modigliani and Rochat add that we can begin to understand noncompliance in the same terms.

The two sets of influences discussed above illustrate that authority may substantially affect individual behavior for reasons other than an agent's sense of duty, via processes quite common to social behavior. In contrast, when we treat Milgram's studies primarily as obedience experiments, we may miss social psychology's larger relevance to this work and the findings' larger significance for social psychology.

History and the Social Psychology of Obedience in the Holocaust

When the president of the American Historical Association issued a controversial call to his colleagues to employ the tools of psychology - by which he largely meant psychoanalysis - in the discipline of history, he was acknowledging that historical accounts often contain implicit or explicit claims about human psychology (Langer, 1958). Similarly, when social scientists view historical events through the lenses of psychological studies and concepts, we advance implicit and explicit claims about history (see a related argument by Cherry, 1995). Unfortunately, the history psychology presents may offer undergraduates the only learned exposure to the Holocaust they receive. What I want to suggest below is that when our references rely on the model of obligatory obedience in the Milgram experiment, they do not adequately represent the behavior of perpetrators in the Holocaust, nor do they prompt new social psychological understandings of that behavior.

What is it that psychology teaches about the Holocaust?
In 1986, Miller reviewed then-current commentary relating the Milgram experiment to the Holocaust in psychology and cited excerpts from undergraduate texts on the topic. Overall, both sources showed psychologists emphasizing the banality of evil (in Hannah Arendt's oft-cited phrase), the role of "just following orders" or obligation-perpetuated obedience, and the normality, as opposed to potentially pathological character, of the perpetrators of evil. A review of current textbooks (e.g., Gleitman, 1995; Gray, 1994; Matlin, 1995; cf. Miller, this issue) reveals a similar stress on the role of "blind obedience" in the Holocaust, where the Milgram experiment is applied to make sense of why "good citizens obeyed the commands of Nazi authorities" (Matlin, 1995, p. 606).

Regardless of whether the Milgram experiment remains an apt symbol for duty-based obedience, there is little doubt that obedience to authority in all of its forms was a significant and potent force in the Nazi dictatorship. For example, obligations to obey were implicated in why German Order Police followed commands to massacre Jewish civilians (Browning, 1989), how Hitler Youth were indoctrinated (Schirach, 1981), and even, poignantly, why some Italian Jews reported to a detention camp (Levi, 1960). And the Nazis were ruthless in the exercise of coercive power, even if that power was not sharply directed against German soldiers or doctors refusing orders (e.g., Browning, 1989; Lifton, 1986).

What an emphasis on obedience slights, however, are voluntary individual and group contributions to Nazi ideology, policy, bureaucracy, technology, and ultimately, inhumanity. Historical scholarship recognizes this problem in the controversy between intentionalists, who view the Holocaust as the product of Hitler's plans and orders, and functionalists, who see the Holocaust as evolving from bureaucratic developments and rivalries, improvisation, individual and group initiatives, and other external conditions and forces (see Marrus, 1987, chap. 3, for a review). The top-down orientation of the intentionalist perspective melds well with social psychology's emphasis on obedience to authority; the functionalist perspective, which is prominent in historical scholarship, does not.

Recent scholarship on the actions of psychologists in Nazi Germany illustrates functionalist themes. Geuter (1992), for example, traces how psychologists worked to enhance their profession under Nazi rule. Proctor (1988) reviews the important role that psychiatrists and psychologists played, first, in developing and promoting the racial hygiene movement in Germany prior to the 1930s, and, later, in studying supposed "defects" of Jewish thinking and in implementing sterilization and euthanasia programs for the "genetically defective." It was this euthanasia campaign, in turn, which provided the general model and medical ideology, technology, and actual personnel for the Nazis' first extermination camps (Lifton, 1986; Proctor, 1988). Staub (1989) commented that "Milgram's dramatic demonstration of the power of authority, although of great importance, may have slowed the development of a psychology of genocide, as others came to view obedience as the main source of human destructiveness" (p. 29). It is surprising, certainly, that social psychology, which was so strongly affected by the Holocaust (Cartwright, 1979), does not address the Holocaust more explicitly and richly than it now does.
For example, two volumes of The Handbook of Social Psychology fail to index the topics of genocide or the Holocaust, even though the interested reader can find references to countless other "Special Fields and Applications" (but see Lindzey & Aronson's general disclaimer, 1985, p. v). Perhaps if we recognize that obligatory obedience addresses only a part of the tragedy of the Holocaust, we will consider anew why people were attracted to authority (Staub, 1989), how they perceived their circumstances and why they were unwilling or unable to alter them, how individuals responded to peer behavior and the social dilemmas they faced (e.g., Browning, 1989), and why so many people used the opportunity provided by the Nazis to further their own interests, even if that meant tolerating or contributing to genocide. Raul Hilberg concluded that "In the final analysis the destruction of the Jews was not so much a product of laws and commands as it was a matter of spirit, of shared comprehension, of consonance and synchronization" (1985, p. 55). Certainly social psychology, including Milgram's research, has more to contribute to understanding this and to meeting our obligation to Abraham Lewin and those who suffered with him.

References

Bauman, Z. (1989). Modernity and the Holocaust. Ithaca, NY: Cornell University Press.
Blass, T. (1991). Understanding behavior in the Milgram obedience experiment: The role of personality, situations, and their interactions. Journal of Personality and Social Psychology, 60, 398-413.
Blass, T. (1993). Psychological perspectives on the perpetrators of the Holocaust: The role of situational pressures, personal dispositions, and their interactions. Holocaust and Genocide Studies, 7, 30-50.
Browning, C. R. (1992). Ordinary men: Reserve Police Battalion 101 and the final solution in Poland. New York: HarperCollins.
Cartwright, D. (1979). Contemporary social psychology in historical perspective. Social Psychology Quarterly, 42, 82-93.
Cherry, F. (1995). The 'stubborn particulars' of social psychology. New York: Routledge.
Geuter, U. (1992). The professionalization of psychology in Nazi Germany. Cambridge: Cambridge University Press.
Gilbert, S. J. (1981). Another look at the Milgram obedience studies: The role of the gradated series of shocks. Personality and Social Psychology Bulletin, 7, 690-695.
Gleitman, H. (1995). Psychology. New York: Norton.
Gray, H. (1994). Psychology. New York: Worth.
Hilberg, R. (1985). The destruction of the European Jews. New York: Holmes & Meier Publishers.
Jones, E. E. (1985). Major developments in social psychology during the past five decades. In G. Lindzey & E. Aronson (Eds.), The handbook of social psychology (Vol. I, pp. 47-107). New York: Random House.
Katz, F. E. (1993). Ordinary people and extraordinary evil: A report on the beguilings of evil. New York: State University of New York Press.
Kelman, H. C., & Hamilton, V. L. (1989). Crimes of obedience: Toward a social psychology of authority and responsibility. New Haven, CT: Yale University Press.
Langer, W. L. (1958). The next assignment. The American Historical Review, 58, 283-304.
Latane, B., & Darley, J. M. (1970). The unresponsive bystander: Why doesn't he help? New York: Appleton-Century-Crofts.
Levi, P. (1960). If this is a man. London: Abacus.
Lewin, A. (1988). A cup of tears: A diary of the Warsaw Ghetto. Oxford: Basil Blackwell.
Lifton, R. J. (1986). The Nazi doctors: Medical killing and the psychology of genocide. New York: Basic.
Lindzey, G., & Aronson, E. (Eds.). (1985). The handbook of social psychology (Vols. I and II). New York: Random House.
Mantell, D. M., & Panzarella, R. (1976). Obedience and responsibility. British Journal of Social and Clinical Psychology, 15, 239-245.
Marrus, M. R. (1987). The Holocaust in history. New York: Meridian.
Matlin, M. W. (1995). Psychology. Fort Worth, TX: Harcourt Brace.
Milgram, S. (1965a). Some conditions of obedience and disobedience to authority. Human Relations, 18, 57-76.
Milgram, S. (1965b). Obedience [Film]. (Available from New York University Film Library.)
Milgram, S. (1974). Obedience to authority: An experimental view. New York: Harper & Row.
Miller, A. G. (1986). The obedience experiments: A case study of controversy in social science. New York: Praeger.
Mixon, D. (1973). Instead of deception. Journal for the Theory of Social Behavior, 2, 145-177.
Mixon, D. (1976). Studying feignable behavior. Representative Research in Social Psychology, 7, 89-104.
Nissani, M. (1990). A cognitive reinterpretation of Stanley Milgram's observations on obedience to authority. American Psychologist, 45, 1384-1385.
Proctor, R. N. (1988). Racial hygiene: Medicine under the Nazis. Cambridge, MA: Harvard University Press.
Ross, L., & Nisbett, R. E. (1991). The person and the situation. New York: McGraw-Hill.
Sabini, J., & Silver, M. (1982). Moralities of everyday life. Oxford: Oxford University Press.
Schirach, B. von (1981). The Hitler Youth. In G. L. Mosse (Ed.), Nazi culture (pp. 294-303). New York: Schocken.
Staub, E. (1989). The roots of evil: The origins of genocide and other group violence. Cambridge: Cambridge University Press.
Zimbardo, P. G. (1974). On "Obedience to authority." American Psychologist, 29, 566-567.

NEIL LUTSKY completed his Ph.D. degree in social psychology at Harvard University. He is Professor of Psychology at Carleton College, where he has taught since 1974. He has written on attitudes toward elderly persons and old age, stigma and stereotyping, personality processes, and the teaching of social psychology. Lutsky currently serves as a consulting editor to the journal Teaching of Psychology.