Wednesday, May 15, 2013

Enter the Automaton: the phenomenon of hyper-obedience

   


Over a century and a half ago, Henry David Thoreau wrote the famous essay “Civil Disobedience,” in which he expounded his views on obedience to authority. Simply stated, the essay holds that each citizen has a moral and ethical obligation to disobey civil authorities if he perceives that the laws or dictates set forth by the government are unjust or immoral.


Thoreau was an idealistic man, imbued with a profound sense of conscience and social responsibility, and his position derives a measure of support from his own conduct: he spent a night in jail for refusing to pay his poll tax, on the grounds that the right to vote should not be a taxable “commodity.” Thoreau appears to have been a man who practiced what he preached.


Today, Thoreau is required reading in many American Literature courses. Presumably, certain elements within the American educational establishment believe that Thoreau’s ideas may be useful for instilling a sense of ethics in students, thus creating the morally refined citizens of tomorrow – citizens who continually question and evaluate their government. Such a person is thought by many to be the ‘ideal citizen’ – the perfect expression of democracy at work.


This sounds good on paper, but is it a realistic viewpoint? Can we really expect the “average” citizen to defer to conscience and morality if the heavy hand of government – a government turned malevolent - should come to rest upon him?


Consider the work of two prominent research psychologists at Yale University – the late Dr. Stanley Milgram and Dr. Irving Janis. Both men studied the phenomenon of obedience to authority, and what they discovered stands in stark contrast to the idealism of Thoreau. To summarize their basic conclusions: both found that ordinary people will readily abandon their sense of moral and ethical responsibility when a superior authority, or authorities, requires them to do so. Milgram found that such authority need only be vested in a single authority figure; Janis’ work suggests that it may also be vested in a cohesive group of individuals, where peer pressure comes to bear. Despite the subtle differences in their ideas, both men shed light on the same disturbing phenomenon – obedience to authority even when compliance conflicts with common moral values.


Dr. Milgram derived his results on the phenomenon of “hyper-obedience” by designing a series of controlled experiments to test a paid volunteer’s willingness to administer painful shocks to another subject when ordered to do so by an authority figure (in this case, the experimenter). In the original experiment, documented in Milgram’s “The Perils of Obedience,” he solicited volunteers from among the students at Yale, ostensibly to participate in a study on learning and punishment. Two volunteers at a time came into Milgram’s laboratory. One was given the role of “teacher,” the other the role of “learner.” The teacher’s job was to administer an electric shock, in steadily increasing voltages, with each failure of the learner to respond correctly. The shocks ranged from 15 to 450 volts.
   
Unbeknownst to the teacher, the “learner” strapped into the electric chair was a counterfeit – a paid actor. He was instructed to act out his part quite convincingly, screaming horribly as the voltage increased and pleading for the termination of the experiment. In reality, no shock was being administered. The real test was this: how far would the teacher go at the prodding of the experimenter?*
   
In the essay, Milgram recounts how he sought predictions about the outcome of the experiment before undertaking it. Among the psychiatrists he surveyed, the predictions largely favored “man the moral entity” – the majority predicted that most subjects would refuse to obey the commands of the experimenter.
   
So Milgram put his colleagues’ predictions to the test, discovering, much to his surprise, that they were wrong. In the first experiments, nearly sixty percent of the subjects “…obeyed the orders of the experimenter to the end, punishing the victim until they reached the most potent shock available on the generator.”
   
Needless to say, these results were quite “shocking” to both Milgram and his colleagues. The experiment was repeated in different settings, with volunteers drawn from differing social strata. But the disturbing results did not go away. As Milgram stated: “…the experiment’s total outcome was the same as we had observed among the students.”
   
This, then, is the essence of Milgram’s work. It is very similar, as stated previously, to the ideas of Dr. Janis on the phenomenon of groupthink, which Janis outlined in his book “Groupthink: Psychological Studies of Policy Decisions and Fiascoes.” In Janis’ own words, groupthink involves
   
“…social pressures that develop in cohesive groups: in infantry platoons, air crews, therapy groups, seminars, and self study or encounter groups.  Members tend to evolve informal objectives to preserve friendly intra-group relations, and this becomes part of the hidden agenda at their meetings.”
   
Janis cites several historic events where groupthink may have played a considerable role in disastrous outcomes – disasters such as Pearl Harbor and the Bay of Pigs fiasco. I have chosen another example that provides an excellent illustration of the groupthink phenomenon in action: the top-level meeting between NASA and Morton-Thiokol managers on the eve of the space shuttle Challenger’s ill-fated flight in 1986.
   
At that meeting, Allan McDonald, along with several other Morton-Thiokol engineers, voiced strong objections to the proposed launch because of concerns over an approaching mass of Arctic air. These objections were summarily overruled after a NASA manager exclaimed: “My God, Thiokol, when do you want me to launch? Next April?” Regarding that comment, the March 3, 1986 issue of Newsweek stated


    “As Allan McDonald told it, that exasperated protest from NASA was the key moment.  Under the gun, the managers of Morton-Thiokol Inc. overruled their engineers and signed approval for the Challenger to blast off.”
   
Janis stated, as one of the criteria for groupthink in the Bay of Pigs fiasco, that “selective bias was shown in the way the members reacted to information and judgments from experts…they were only interested in facts and opinions that supported their preferred policy.” In the case of the Challenger meeting, the NASA managers’ “preferred policy” was to launch the space shuttle the following morning at 11:30 A.M. Eastern Standard Time. Dissenting opinions were at odds with this policy, and so the “upstarts” were brought into line through group pressure and induced to sign off on the launch, despite strong objections (which later proved to be valid).
   
Thus we can see that this succumbing to group pressure, this “groupthink,” is not merely an esoteric theory for philosophic debate but a very real phenomenon with often disastrous results. And hyper-obedience seems to find its truest expression wherever the superior/subordinate relationship is very clear-cut. The NASA managers, on their own turf at the Kennedy Space Center, clearly had the upper hand over the Morton-Thiokol engineers, who were only employees of one of NASA’s many contractors. Likewise, in Milgram’s experiments, the teacher is clearly subordinate to the authority of the experimenter: the experimenter is a PhD affiliated with a major university, while the teacher is merely a volunteer being paid a modest sum for his time. Finally, in Janis’ theory, the individual is clearly subordinate to the authority of the group at large. One opinion is hard pressed to prevail against ten dissenting ones.
   
Milgram may give us some insight into this potent superior/subordinate relationship when he says, in regard to the teacher’s role in his experiment:  “To extricate himself from his plight, the subject must make a clear break with authority.” In Milgram’s experiment this would translate into the adamant, even angry, refusal to continue shocking the learner.  Likewise, in Janis’ scenario, the “deviant” would have to refuse to back down, despite the disdain of the group at large.  Only in this way would it be possible to alter the directive of the group, or at least free oneself from the burden of responsibility by withdrawing support from the group’s decisions.
   
Of all the rigid superior/subordinate relationships, none are better established than those in the nation’s military. The armed forces are founded on hyper-obedience: when an enlisted man is given an order by a superior officer, it is generally accepted that he must obey. It is only reasonable, then, to expect the problems outlined by Milgram and Janis to manifest in a military setting.** Indeed, the massacre at My Lai, Vietnam, in which one Lt. Calley directed the murders of hundreds of Vietnamese civilians, is a profound example of the power of hyper-obedience.
   
As Stanley Resor’s official report on the massacre documents, the entire company of soldiers under Lt. Calley acted pursuant to his orders, unquestioningly gunning down the men, women and children of My Lai hamlet. Milgram’s essay “Obedience in Vietnam” includes a CBS interview with one of the My Lai participants, conducted by Mike Wallace, in which the soldier recounts the following gruesome aspect of the massacre: “He said (Lt. Calley), ‘I want them dead.’…so I started shooting…I poured about four clips into the group.”
   
Various other aspects of the massacre are recounted in similarly gruesome detail – it’s not essential to cover them all. But one significant quote from the soldier cannot be omitted because it concerns the rationalizations of the soldier in regard to his immoral actions:  “Why did I do it?  Because I felt like I was ordered to do it….”
   
This rationalization was commonly invoked by the teachers in Milgram’s study.  The volunteers tried to free themselves from guilt by stating that they had only done what was required of them – they were simply following instructions.  Indeed, one teacher who was tottering on the edge of disobedience was reassured when the experimenter claimed complete responsibility, and obediently proceeded to shock the learner repeatedly at 450 volts.
   
For some reason, many humans have the capacity to psychologically disengage themselves from a sense of responsibility when committing an immoral act, as long as they are acting pursuant to the orders of an authority figure.  As Milgram states in “Obedience in Vietnam”:
    
 “A substantial proportion of people do what they are told to do, irrespective of the content of the act and without limitations of conscience, so long as they perceive that the command comes from a legitimate authority.”
   
Adolf Eichmann, who engineered the mass deportation of millions of innocents to the Nazi extermination camps, is another example that Milgram invokes in “Perils.” When on trial in Israel for his war crimes, Eichmann repeatedly claimed that he had only been carrying out orders.
   
But like all others who invoke this rationalization, Eichmann was guilty. The key point that these individuals overlook is that the option of disobedience does exist. Granted, in Eichmann’s case the result of such disobedience would probably have been an appointment with the firing squad. But the subjects in Milgram’s experiment faced no such life-or-death imperative – they administered the shocks simply in an effort to be polite and cooperative with the experimenter. Milgram considers this strange displacement of morality in “Perils”:


    “What is extraordinary is his apparent total indifference to the learner; he hardly takes cognizance of him as a human being.  Meanwhile, he relates to the experimenter in a submissive and courteous fashion….Morality does not disappear – it acquires a radically different focus: the subordinate person feels shame or pride depending on how adequately he has performed the actions called for by authority.”
   
Similarly, in Janis’ theory, the deviant succumbs to the will of the group often just to “…preserve friendly intra-group relations….” Again, the odd displacement of morality is obvious.     


In the case of the My Lai massacre, I submit that both Milgram’s and Janis’ versions of hyper-obedience played important roles. While engaging in unquestioning obedience to a single authority figure, the soldiers were also acting as part of a highly cohesive group. For any single soldier to disobey orders would have constituted a break with the group as well as with Lt. Calley.
   
Hyper-obedience is an intriguing but terrifying phenomenon. It suggests that seemingly moral persons have the built-in capacity to commit truly vile acts under certain circumstances. In “Obedience in Vietnam” Milgram concludes:


    “The results…raise the possibility that human nature, or – more specifically – the kind of character produced in American democratic society, cannot be counted on to insulate its citizens from brutality and inhumane treatment at the direction of malevolent authority….”
   
Enter the automaton.
-------
*Milgram was severely criticized by many of his colleagues, among others, on the grounds that his experiments were unethical, with the potential for causing psychological harm to the volunteers. The author agrees with these criticisms, and the discussion of Milgram’s work in this essay should not be taken as tacit approval of his methods.


**In the context of military law, there are situations in which a subordinate may rightfully disobey a direct order from a superior; however, taking the overall nature of war into account, along with the harsh realities of the battlefield, the line between “moral” and “immoral” orders is a complex issue. For a good, brief discussion of this subject see this article.
