by Davi Barker

Table of Contents

The Problem
Previous Research
Power and Obedience
Power and Deception
Power and Compassion
Power and Hypocrisy
Purpose Of Further Research
Hypotheses
Methods and Procedures
Ethical Concerns
Weaknesses
Data Analysis
Conclusion

This is the draft design for a renegade psychological experiment on obedience to authority, specifically on police brutality. I presented the idea at PorcFest X and took first place in the Agorist Pitch contest. Now I am consulting with numerous like-minded experts. If you are reading this, I'd appreciate your help. I am a writer, merchant, and speaker. I am the editor of, the Campaign Navigator of, the proprietor of, but more importantly I am an advocate of peace, independence, and liberation from corrupt authority. Unfortunately, I am not much of a scientist, psychologist, or ethicist. I'm publishing this rough draft because I'm hoping to solicit general feedback to help me perfect the design. After reading this, if you are interested in supporting this important work, please get in contact with me.

Statement by the author:

This is open source material. Feel free to copy and distribute at will.


The Problem

We are living in an increasingly militarized society, and I would argue that this has a primarily psychological cause, not merely a political one. If allowed to continue, this could have disastrous consequences, as it has throughout history. Further, I would argue that this problem stems not only from the psychology of authority, but also from the psychology of obedience, specifically the tendency not to intervene when authority is corrupt. This sentiment was perhaps most eloquently expressed by Thomas Jefferson in this seldom quoted passage of the Declaration of Independence: "All experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed." The militarization of society cannot be fought only with votes, or with cameras, or even with rifles, if the underlying impulses toward compliance are not first addressed in the mind of every subject who slavishly accepts their subjugation. That is why the psychology of obedience is not merely a tool; it is a map of the problem itself. No scientific research has done more to expand public understanding of the problem of obedience than the Milgram Experiment and the Stanford Prison Experiment. But ethical concerns raised about their methods led to changes in the APA guidelines, which have made it almost impossible to study the psychology of obedience and authority. There are a few more recent studies, which are far less dramatic because of the new limitations, but the implications of their results are no less startling.

Previous Research

Power and Obedience

After World War II the horrifying details of the Holocaust came to light. Jews, Gypsies, homosexuals, and anyone deemed an enemy of the State had been murdered by the Nazis. The robotic refrain from soldiers at the Nuremberg Trials was, "I was just following orders." Yale University psychologist Stanley Milgram designed an experiment to measure the willingness of psychologically healthy people to obey unethical orders from an authority figure, to discover how such atrocities were possible. His shocking results were published in his book, Obedience to Authority: An Experimental View. In the Milgram Experiment participants were divided into "teachers" and "learners" and placed in separate rooms. They could communicate, but could not see each other. The experimenter instructed the "teachers" to read questions to the "learners" and, if they answered incorrectly, to administer an electro-shock of ever increasing voltage. The "teachers" were unaware that the "learners" were actually confederates of the experimenter and that the electro-shocks were fake. The "teachers" were the actual subjects of the experiment. After a few voltage increases the "learner" began to object, to bang on the walls, and to complain about a heart condition. After some time the "learner" would go silent. If the subject asked to stop the experiment for any reason, they were given a succession of verbal prods by the experimenter to continue: "Please continue." "You must continue." "The experiment requires that you continue." Most continued after being told that they would be absolved of responsibility. Milgram's contemporaries predicted only 1% of subjects would administer a lethal shock, and were astonished when 65% administered the experiment's maximum 450-volt shock, even though every subject expressed some level of objection to doing so. Some began to laugh nervously. Others offered to refund the money they were paid for participating in the experiment. Some exhibited signs of extreme stress once they heard the screams of pain coming from the learner. But the vast majority were willing to administer a lethal jolt of electricity to a complete stranger based upon nothing but the verbal prodding of a scientist in a lab coat. None of those who refused to administer the deadly shock insisted that the experiment itself be terminated.

The Stanford Prison Experiment was conducted by psychologist Philip Zimbardo to study the prison environment. Participants were screened to be psychologically healthy and randomly assigned to the role of "prisoner" or "guard" in a two-week prison simulation. Guards were given uniforms, mirrored glasses, and wooden batons meant only to establish status. Prisoners were dressed in smocks and addressed only by the numbers they were issued. Guards were instructed only to keep a fixed schedule and to attempt to make the prisoners feel powerless, but they could not physically harm them. The experiment was halted after only six days, after guards began to display cruel, even sadistic behavior, including spraying disobedient prisoners with fire extinguishers, depriving them of bedding or restroom privileges, forcing them to go nude, and locking them in "solitary confinement" in a dark closet. After an initial revolt and a brief hunger strike, prisoners developed submissive attitudes, accepting physical abuse and readily following orders from the "guards" to inflict punishments on each other. They even engaged in horizontal discipline to keep each other in line. One prisoner began showing signs of mental breakdown after only 36 hours, yet the prisoners stayed even though they were all made aware that they could stop the experiment at any time. As Zimbardo explained, both prisoners and guards had fully internalized their new identities. Zimbardo ultimately halted the experiment when he realized that his judgment had been compromised by being sucked into his role as "Prison Superintendent," and that he had allowed abuse to continue that could be considered torture. In his book, The Lucifer Effect: Understanding How Good People Turn Evil, he details his findings and how they relate to the torture and prisoner abuse at Abu Ghraib.

Power and Deception

Columbia University professor Dana Carney conducted an experiment to discover whether "leaders" and "subordinates" experience the same physiological stress while lying. She found that power not only made lying easier, but pleasurable. Participants took personality tests identifying them as "leaders" or "subordinates." In reality the selection was random, but the fake test created a sense of legitimacy to their assignment. "Leaders" were given an hour of busy work in a large executive office. "Subordinates" were given an hour of busy work in small windowless cubicles. Then they engaged in a ten-minute mock negotiation over pay. Afterwards half the participants were given an opportunity to steal $100 if they lied and convinced the lead experimenter that they didn't have it. The experimenter did not know who had the money. For most people lying elicits negative emotions, cognitive impairment, physiological stress, and nonverbal behavioral cues, all of which can be measured. Video of the interviews was reviewed to identify behavioral cues. Saliva samples were tested for increases in the stress hormone cortisol. Tests of reaction time were conducted on a computer to demonstrate cognitive impairment. And a mood survey assessed participants' emotional states during the experiment. By every measure "subordinates" who lied exhibited all the indicators of deception, but liars in the "leader" class exhibited the exact opposite. By every measure "leaders" who lied were indistinguishable from truth-tellers. In fact, they enjoyed reduced stress levels and increased cognitive function, and reported positive emotions. Only "subordinates" reported feeling bad about lying. Professor Carney concluded, "Power will lead to increases in intensity and frequency of lying." Lying comes easier, and is inherently more pleasurable, to those in authority, even fake authority. In other words, power rewards dishonesty with pleasure.

Power and Compassion

University of Amsterdam psychologist Gerben A. Van Kleef conducted an experiment to identify how power influences emotional reactions to the suffering of others. Participants filled out a questionnaire about their own sense of power in their actual lives and were identified as "high-power" and "low-power" individuals. Then they were randomly paired off to take turns sharing personal stories of great pain or emotional suffering. During the exchange the stress levels of both participants were measured by electrocardiogram (ECG) machines, and afterward they filled out a second questionnaire describing their own emotional experience, and what they perceived of their partner's emotional experience. You guessed it. Increased stress in the storyteller correlated with increased stress in the listener for low-power subjects, but not for high-power subjects. In other words, low-power individuals experienced the suffering of others, but high-power individuals experienced greater detachment. After the experiment high-power listeners correctly identified the emotions of their partners, but self-reported being unmotivated to empathize with them. In other words, they saw the emotions of others, but they just didn't care. After the experiment, researchers inquired about whether participants would like to stay in touch with their partners. As you might expect, the low-power subjects liked the idea, but the high-power subjects didn't.

Power and Hypocrisy

It has become a cliché that the most outspoken anti-gay politicians are in fact closeted homosexuals themselves, and that the champions of "traditional family values" are engaged in extramarital affairs. Nothing is more common than the fiscal conservative who demands ridiculous luxuries at the taxpayer's expense, or the anti-war progressive who takes campaign donations from the military-industrial complex. Well, now it seems there's some science behind the hypocrisy of those in power. Joris Lammers, from Tilburg University, and Adam Galinsky, of the Kellogg School of Management, conducted a battery of five experiments to test how power influences a person's moral standards, specifically whether the powerful were likely to behave immorally while espousing intolerance for the same behavior in others. In each of the first four experiments the results were about what you'd expect: powerful people judge others more harshly but cheat more themselves. But in the last experiment the researchers distinguished between legitimate power and illegitimate power and got the opposite results.

In the first experiment subjects were randomly assigned as "high-power" or "low-power." To induce these feelings, "high-power" subjects were asked to recall an experience where they felt powerful, and "low-power" subjects were asked to recall an experience where they felt powerless. They were asked to rate how immoral they considered cheating, and then they were given an opportunity to cheat at dice. The high-power subjects considered cheating a higher moral infraction than low-power subjects did, but were also more likely to cheat themselves. In the second experiment participants conducted a mock government. Half were randomly assigned "high-power" roles, which gave orders to the half randomly given "low-power" roles. Then each group was asked about minor traffic violations, such as speeding or rolling through stop signs. As expected, high-power subjects were more likely to bend the rules themselves, but less likely to afford others the same leniency. In the third experiment participants were divided by recalling a personal experience, as in the first. Each group was asked about their feelings about common tax evasions, such as not declaring freelance income. As expected, high-power subjects were more willing to bend the rules themselves, but less likely to afford others the same leniency. In the fourth experiment participants were asked to complete a series of word puzzles. Half the participants were randomly given puzzles containing high-power words, and the other half were given puzzles containing low-power words. Then all participants were asked what they'd do if they found an abandoned bike on the side of the road. As in all the experiments, even with such an insignificant power disparity, those in the high-power group were more likely to say they would keep the bike, but also that others had an obligation to seek out the rightful owner or turn the bike over to the police.

The fifth and final experiment yielded by far the most interesting results. The feeling of power was induced as in the first and third experiments, where participants described their own experience of power in their lives, with one important distinction. This time the "high-power" class was divided in two. One group was asked to describe an experience of legitimate power, and the other was asked to describe an experience of illegitimate power. The legitimate high-power group showed the same hypocrisy as in the previous four experiments. But those who viewed their power as illegitimate actually gave the opposite results. Researchers dubbed it "hypercrisy": they were harsher about their own transgressions, and more lenient toward others. This discovery could be a silver bullet against corrupt authority. The researchers speculated that the vicious cycle of power and hypocrisy could be broken by attacking the legitimacy of power, rather than the power itself. As they write in their conclusion: "A question that lies at the heart of the social sciences is how this status-quo (power inequality) is defended and how the powerless come to accept their disadvantaged position. The typical answer is that the state and its rules, regulations, and monopoly on violence coerce the powerless to do so. But this cannot be the whole answer… Our last experiment found that the spiral of inequality can be broken, if the illegitimacy of the power-distribution is revealed. One way to undermine the legitimacy of authority is open revolt, but a more subtle way in which the powerless might curb self enrichment by the powerful is by tainting their reputation, for example by gossiping. If the powerful sense that their unrestrained self enrichment leads to gossiping, derision, and the undermining of their reputation as conscientious leaders."

Purpose Of Further Research

This final experiment offers some hope that corrupt authority can not only be stopped, but driven into reverse, not by violence or revolution, but by undermining its legitimacy. Previous experiments have shown that those in authority are more likely to lie, cheat, and steal while also being harsher in their judgments of others for doing these things. They feel less compassion for the suffering of others, and are even capable of the torture and murder of innocent people. What is perhaps most disturbing is that this research shows the problem is not that corrupt people are drawn to positions of authority, but that positions of power breed corruption in people. Human nature is essentially adaptive. If you take an otherwise good person and put them in a role that incentivizes evil, they will adapt to the new role. And if you deeply internalize "obedience to authority" as a core personality trait, you will become capable of the worst forms of murder, and tolerant of the worst forms of abuse. All of these experiments have been studies of what people are willing to do, or willing to endure. What has not been studied is what people are willing to passively witness, and when people are willing to intervene. This is potentially more important data, because when atrocities are committed by militarized societies the perpetrators are usually a minority of the population, and the victims are usually a minority of the population, but the passive witnesses are the majority, and thereby the most capable of meaningful intervention. Our purpose will be to create a psychological profile of those willing to intervene against corrupt authority.

Hypotheses

1) Given the opportunity, a significant portion of the general population will not intervene in a clear incident of unprovoked police brutality.
2) There will be a statistically significant difference between the percentage of people who will intervene in an incident of police brutality and the percentage who will intervene in an identical incident of brutality by someone in civilian clothes.
3) Demographic information can be discovered which correlates with higher rates of intervention in an incident of police brutality.
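Hypothesis 2 comes down to detecting a difference between two intervention rates, which means we need enough participants in each condition. As a minimal planning sketch (the assumed rates of 15% and 30% are placeholders for illustration, not predictions or results), the per-group sample size for a two-sided test at 5% significance and 80% power can be estimated with the standard normal approximation for two proportions:

```python
import math
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size to detect p1 vs p2 with a two-sided z-test,
    using the normal approximation for a difference of two proportions."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # critical value for two-sided alpha
    z_b = z.inv_cdf(power)           # quantile for the desired power
    p_bar = (p1 + p2) / 2            # pooled proportion under the null
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# Placeholder assumption: 15% intervene against a police assailant,
# 30% against a civilian assailant.
n = sample_size_two_proportions(0.15, 0.30)
print(n)  # participants needed in EACH condition
```

If the true disparity is smaller than assumed, the required sample grows quickly, which matters for how long the mall room must be rented.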




[Diagram of the experiment layout: back room, plant couch, waiting area]



Methods and Procedures

We will use a convenience sample of the general population by renting a meeting room in a shopping mall commonly used for consumer surveys. Volunteers will offer shoppers some small reward if they agree to watch a new movie trailer. Participants will be led down a hall with a visible camera to a waiting area. Volunteers should gesture to the camera to bring attention to it. In the waiting area there will be a couch, a door back to the hallway, and a door to a back room, both labeled "exit." There will also be a surveillance screen displaying footage of the hall, but no sound. The surveyor will emerge from the back room, at which point the volunteer will exit back into the hall and be visible on the surveillance screen. The surveyor will ask the participant to fill out a short questionnaire while the trailer is set up, and will return to the back room while the participant fills out the clipboard. The questionnaire will ask for relevant demographic data: age, sex, ethnicity, income, education, political affiliation, etc. It will also contain questions about movies: How often do they go? What genres do they enjoy? What movies have they seen recently? Embedded in that list must be the question, "Are you comfortable watching violent footage?" When the surveyor returns he will also thank what appears to be a previous participant, but is actually a confederate of the experiment. The surveyor asks the participant to wait just a few more minutes while he sets up the video, and returns to the back room. The confederate exits through the hallway, leaving the participant alone in the waiting area. When the confederate exits the room, the surveillance screen begins playing a recorded incident of unprovoked assault. The sound of the incident will be played in the hall, to create the illusion that the confederate is being attacked right outside. The brutality will begin with shouting, escalate to shoving, then to a beating which could reasonably result in serious injury, until finally the assailant drags the confederate off camera. Half of the participants will see a video of an assailant in a police uniform. The other half will see the same scene except that the assailant is in civilian clothes. The clothing should be changed digitally so there is no disparity in the performance of each scene. If the participant opens the door to the hallway, that will be counted as an intervention. If the participant either takes the exit toward the back room or stays in the waiting area until the end of the footage, that will be counted as not intervening. Regardless of the outcome, once the participant has made their choice, the illusion will be revealed and the entire scope and purpose of the experiment will be explained. The surveyor will conduct an exit interview. All participants will be asked to complete an emotional survey describing how they felt and what they were thinking during the experience. They will be asked what motivated them to make the choice they did. Participants who intervened will be asked what they intended to do once they entered the scene. Were they going to yell at the assailant? Would they physically intervene? Or record the incident? Participants who took the other exit will be asked where they were going. Were they searching for an exit, or seeking help from the surveyor?

Ethical Concerns

Preventing physical harm: There was concern that involving subjects in staged violence could put the confederates at risk of injury if the subject decided to intervene physically. The illusion created by the prerecorded footage, and the use of the door as the measure of intervention, was devised as a way to protect everyone from physical harm. Preventing forced witness: There was concern that forcing someone into a situation where their only options were to witness or to intervene could be unethical. To mitigate this risk, the level of brutality in the footage should not exceed what may reasonably be seen in mainstream news, and the waiting area should have a second exit sign above the back door to create a third option of leaving. Preventing trauma: There was concern that in spite of the violent footage waiver in the questionnaire and the other precautions, a subject may still find the experience traumatic or emotionally distressing, especially if they have a personal history of brutality. To remedy this unfortunate result, if it occurs, someone will be on hand to offer private exit counseling after completion of the experiment. If subjects take this offer, these sessions are not considered part of the exit interview, and will not be part

of any analysis. In addition, every subject will be given the contact information of local counseling services, in case they experience distress after leaving, or even days later. Deception: There was concern that it may be unethical to deceive subjects, both by offering to show them a nonexistent movie trailer and by creating the illusion of the assault. Subjects will be fully informed of the scope and purpose of the experiment afterward; however, there is no way to avoid deceiving the participants without biasing the results. We regard this as an acceptable risk given the potential value of the study.

Weaknesses

Sample Quality: A convenience sample from a shopping mall is not a perfect random sample of the general population. It will be weighted by socioeconomic status, age, and lifestyle factors that make one likely to shop in a mall. Victim association: Any apparent demographic or lifestyle information that can be gleaned from the appearance of the victim could impact the decision to intervene. Future studies should include variations where the victim and assailant are of variable race, gender, creed, sexual orientation, age, etc. Risk assessment: A police uniform does not only indicate authority; it also indicates weaponry. The civilian assailant should be similarly armed, but it may not be as obvious. Disparity in rates of intervention may be influenced by disparity in perceived risk. The exit interview should be crafted to account for this. Subject Isolation: Subjects may be reluctant to intervene if they are alone, which does not reflect real-world incidents of police brutality. This may bias the data toward non-intervention. Future studies should include variations accounting for group dynamics. Contrived Scenario: Knowledge that this is a contrived scenario, and that they may be being observed, even believing it to be a consumer survey, may bias the data toward intervention. Further, if the performances of the actors, or the oddities of the layout, cause them to suspect a setup, they may doubt the reality of the footage.

Data Analysis

The relevant statistical data will be the rate of intervention in the group which saw a police assailant, the disparity with the rate of intervention in the group which saw a civilian assailant, and any demographic information which shows a statistically significant difference within each group. Data on emotion and motivation gleaned from the exit survey will be used to identify potential motives, avenues for future research, and general discussion of the issue, but is not directly relevant to the hypotheses.
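The headline comparison is the disparity between the two intervention rates. As a sketch of how that disparity could be tested (all counts below are hypothetical placeholders, not collected data), a two-proportion z-test with a pooled estimate under the null hypothesis would look like this:

```python
from statistics import NormalDist

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test of H0: p1 == p2 for two independent samples,
    where x is the count of intervenors and n the group size.
    Returns (z statistic, p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)   # pooled proportion under the null
    se = (p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: 18 of 120 intervened against the police assailant,
# 36 of 120 against the civilian assailant.
z, p = two_proportion_z_test(18, 120, 36, 120)
print(round(z, 2), round(p, 4))
```

A p-value below the chosen significance level would support hypothesis 2; the demographic correlations of hypothesis 3 would need a separate within-group analysis.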

If the rate of intervention in an incident of unprovoked police brutality is significantly low, that will give us some indication of the severity of the problem of obedience in society, and lend weight to the argument that police militarization is made possible in part by the complacency of civilians. If the rate is surprisingly high, it will invalidate this hypothesis and indicate that other causes of militarization should be explored. If the rate of intervention in police brutality is significantly lower than the rate in civilian brutality, that will confirm that aggression from authority figures is more tolerated than aggression in general, indicating that authority itself may increase or even incentivize aggression. If no significant disparity between the rates is discovered, that will invalidate this hypothesis and also indicate that other causes of militarization should be explored. If specific demographic information is found to correlate strongly with intervention, that will indicate that obedience to authority is learned behavior, and not innate. That should guide avenues for future research aimed at discovering the root of the learned behavior, and how to teach the opposite behavior. If no correlation is discovered, that will invalidate this hypothesis and indicate either that obedience to authority is innate, or that weaknesses in the experimental design failed to screen for the causal factors. The raw data will be made public online, and the video of subjects who allow us to publish their image will be made into a documentary, which will be available online. We will also solicit interviews with experts in the field to provide commentary and analysis.

Conclusion

Even those deeply familiar with the Stanford Prison Experiment and the Milgram Experiment have usually never heard of the less dramatic recent studies. Devoid of shock value, this research does not impact the culture, and so it fails to safeguard society from the dangers of obedience. Changes to the ethical guidelines have essentially neutered research on authority and obedience. It has been relegated to water cooler banter among academics. If discovering the psychological cause of obedience to authority is the key to preventing the militarization of society, then research such as this is the key to avoiding the disastrous consequences of militarization. APA ethical guidelines are not law. They are essentially a criterion for public funding. So, if these ethical guidelines hamstring meaningful research on obedience to authority, then for the sake of safeguarding against the disastrous consequences of militarizing society, it is time for us to cast off these restrictions and devise new ethical guidelines. Further, we must be willing to fund research such as this privately, and to trumpet the results publicly so as to influence the culture directly. It is time to conduct our own renegade psychological experiments, to show the world beyond doubt that power corrupts absolutely, and that corrupt power deserves no obedience.

