In 2007, the U.S. military launched a controversial social science program aimed at increasing the military's understanding of local populations and applying this new cultural understanding to the military decision-making process.

The program embedded social scientists in combat units to study the local people and explain cultural and social practices, beliefs, and behaviors—the "human terrain"—to military leaders.

The program, which was shut down in 2014, remained controversial throughout its existence.

The American Anthropological Association passed a resolution against the program, arguing that it violated the association's code of ethics because the U.S. military could use the information gathered on local people to harm them (Glenn, 2007, 2009). A debate in the pages of American Anthropologist was heated, with some arguing that the Human Terrain System represented the militarization of anthropology and put all anthropologists at risk because people would suspect them of being spies or of working for the U.S. government (Gonzalez, 2007).

Is it ethical for social scientists to study civilians in wartime and provide that information to the military? How would you decide?

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 69). W. W. Norton & Company. Kindle Edition.

During World War II, the Nazis conducted horrible experiments on prisoners in concentration camps. Denying the humanity of the prisoners, doctors deliberately inflicted pain, suffering, disease, and death on people in gruesome scientific experiments. Many doctors directed these experiments, some of which involved vaccines, pregnancy, and organ transplants. One Nazi doctor, Sigmund Rascher, conducted "freezing experiments" with 300 prisoners at the Dachau concentration camp in order to learn the most effective ways to treat German pilots suffering from hypothermia. Some prisoners were submerged in tubs of ice water for up to five hours while others were taken outside and strapped down naked. Rascher then monitored their heart rate, muscle control, core temperature, and when they lost consciousness. Eighty prisoners died in the experiment (Palmer, 2010). After the war ended, many of those conducting experiments at the concentration camps were prosecuted as war criminals at trials in Nuremberg, Germany.

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 72). W. W. Norton & Company. Kindle Edition.

From 1932 to 1972, the U.S. Public Health Service conducted a study of syphilis among poor black men in Tuskegee, Alabama, that involved deceiving the subjects about their condition and withholding lifesaving treatment.

Several other unethical biomedical experiments came to light in the postwar period. Perhaps the most notorious is the Tuskegee syphilis experiment. The U.S. Public Health Service conducted a long-running study of syphilis among poor African American men in Tuskegee, Alabama, beginning in 1932. Nearly 400 men with advanced syphilis were recruited for the study, along with a comparison group of 201 non-infected men. When the study began, there was no cure for syphilis, and the purpose of the study was to watch the disease unfold "until autopsy" (Jones, 1993, p. 32).

In the 1940s, antibiotics were discovered that could cure the disease, but none of the men was informed about or treated with the drugs. The men were not told that they had syphilis; instead, they were told they had "bad blood." They were deliberately deceived about the nature of the study, and they were subjected to painful procedures such as spinal taps. To prevent the subjects from being treated, the researchers worked with the local draft board to ensure that they were not drafted during World War II, because the military would have diagnosed the condition. The researchers also arranged with local physicians to withhold treatment from the men. The study continued through 1972, when national media attention caused the Public Health Service to end the study and offer treatment to the few surviving men (Jones, 1993).

While the Tuskegee syphilis experiment involved withholding lifesaving medicine from subjects in order to study the natural course of disease, two other high-profile medical studies involved deliberately infecting people with disease. In the Willowbrook hepatitis study in the early 1950s, doctors infected children residing in a state facility, who had been classified as "mentally retarded," with hepatitis by feeding them the feces of people who had the disease. In the Jewish Chronic Disease Hospital study in the 1960s, senile elderly patients were deliberately injected with live cancer cells in order to study whether they would contract the disease. All of these cases involved deliberate harm to vulnerable populations: the illiterate poor men in Tuskegee, children with diminished mental capacity at Willowbrook, and older adults with dementia at the Jewish Chronic Disease Hospital (Amdur & Bankert, 2007).

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 73). W. W. Norton & Company. Kindle Edition.

Stanley Milgram was a psychology professor at Yale University.
In the 1950s, Milgram wondered how ordinary people could have taken part in Nazi atrocities. He wanted to understand how people could follow the orders of authority figures even when they thought what they were doing was wrong. Milgram designed a laboratory study that involved deception. Subjects were told that the study was about the effects of punishment (in this case, electric shocks) on learning. The subjects were to be the "teachers," and they would have to administer electric shocks to "learners" who did not answer questions correctly. Unbeknownst to the subjects, the learners were laboratory personnel who were pretending to be study participants. The subjects believed they were actually administering shocks of higher and higher voltage each time the learner answered incorrectly. They could not see the learners but heard their screams of pain and their pleas that the experiment end. Many of the subjects became very distressed and tried to stop administering shocks, but the experimenter told them in a calm yet very authoritative voice that "the experiment must continue." Surprisingly, more than half the research subjects administered shocks they thought were high intensity and potentially lethal to the learners. After the experiment was completed, the subjects were told that they had not hurt anyone and that they had been deceived about the study's real purpose (Milgram, 1977).

There is a great deal of debate about whether Milgram's study crossed ethical boundaries. Subjects who participated may have had to live with the knowledge that they could have harmed or even killed people. Milgram conducted a 12-month follow-up study of participants, and less than 1% reported that they regretted taking part in the study (Milgram, 1977). Nevertheless, most scholars agree that no IRB would approve Milgram's study today.

Another widely known psychological experiment is Philip Zimbardo's Stanford prison experiment. This 1971 experiment involved 24 male undergraduates recruited through an advertisement in a campus newspaper. Zimbardo built a simulated prison in the basement of Stanford University's psychology building and assigned half the subjects to be "prisoners" and the other half to be "guards." The prisoners were picked up at their homes by police, handcuffed, and brought to the prison. The guards wore uniforms and were given minimal instructions about how to run the prison. Participants knew they were taking part in a simulation, and they signed informed consent forms warning them that some of their basic civil rights would be violated if they were selected to be prisoners. (To see the consent form that the volunteer participants signed, refer to Figure 3.1.) The research quickly spiraled out of control. The guards became increasingly arbitrary, cruel, and sadistic toward the prisoners, and the prisoners became increasingly passive, depressed, and hopeless. Five student prisoners were released early because they developed emotional problems. The experiment was called off before it was due to conclude because a visiting psychologist intervened. Zimbardo later admitted that the research was "unethical because people suffered and others were allowed to inflict pain and humiliation" and that he had become so involved in the experiment that he did not recognize the harm happening to the students. He concluded that he made a mistake and should have stopped the experiment earlier (Zimbardo, Maslach, & Haney, 1999).

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 74). W. W. Norton & Company. Kindle Edition.

In 1970, Humphreys published a book about his study, Tearoom Trade: A Study of Homosexual Encounters in Public Places, to a very mixed reaction. On the positive side, it had some public policy effects. Tearoom arrests at that time accounted for the majority of arrests for homosexuality in the United States, and as a result of the book, some police departments ceased their raids of public restrooms (Sieber, 1992, p. 8). The book was awarded the prestigious C. Wright Mills Award by the Society for the Study of Social Problems, but it also was roundly criticized for its deceptive methods. Some faculty members even went so far as to try to rescind Humphreys's PhD, and one eminent sociologist, Alvin Gouldner, hit Humphreys in the head in anger about the study, leading Humphreys to be hospitalized (Galliher, Brekhus, & Keys, 2004).

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 76). W. W. Norton & Company. Kindle Edition.

The protocol also must describe the method that the researchers will use to obtain informed consent.
Usually this involves describing the research to participants: outlining what will happen to them, explaining the possible risks and benefits, and stating what the researcher will offer participants to deal with any adverse effects.
The informed consent also needs to tell the subjects that they can withdraw from the study at any time with no penalty, and it must provide the name and phone number of an IRB representative whom the subject can call for more information about the study and to verify that it has been approved. Some researchers also provide a list of phone numbers of counselors or websites where study participants can get help if the research questions trigger a negative emotional response. The consent form must be written in jargon-free language to ensure that all subjects can understand it.
Figure 3.2 reproduces a sample consent form. In some cases, the research protocol can be approved without a complete committee discussion. Expedited review is possible when the research does not include vulnerable populations, the data are already collected and de-identified, the research does not touch on sensitive subjects that might cause psychological or physical harm to subjects, and the informed consent process is straightforward and well designed. In such cases, staff members can green-light the research.

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 78). W. W. Norton & Company. Kindle Edition.

Sometimes we can understand the potential risks and benefits of research only in retrospect.
Consider the Tearoom Trade study. Humphreys believed that he was protecting the people he interviewed by making sure they could not be identified when the results were published. However, he did keep their names, addresses, and license plate numbers for more than a year before he showed up at their homes and interviewed them.
He thought the benefits of his research were
(1) to make clear that not every man who engages in same-sex sexual relations thinks of himself as "homosexual,"
and (2) to describe the relationship between social class and homosexual behaviors (Humphreys, 1970).
Humphreys conducted his research before the onset of the AIDS epidemic. When AIDS first hit the United States, it was considered a "gay disease" because it primarily infected gay men during the first years of the epidemic in the 1980s. Imagine if the AIDS epidemic had hit right when Humphreys published his study. The data that he gathered informed the research community that many men who have sex with men are married to women and do not consider themselves gay. Humphreys's research therefore suggested that the disease could easily travel into the heterosexual community and that public health campaigns should not target only people who identified as gay or bisexual. No one could have easily foreseen these benefits of the research.

Similarly, nobody could have predicted the risks to the men whom Humphreys studied. Imagine the pressure Humphreys would have been under to identify the men if the public had become aware of the AIDS epidemic soon after he completed his study. The wives who had no idea what their husbands were doing in public restrooms on their way home from work would have been at risk of contracting a deadly disease. Humphreys would likely have been pressured to release information to public health professionals, and if he had chosen to withhold that data, he could have been accused of putting the women at great risk.

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 82). W. W. Norton & Company. Kindle Edition.

Katz argues that IRBs should not review ethnographic studies before they go forward, not only because good research may be prevented, but also because it is impossible for an ethnographer to anticipate all risks. He argues that rather than reviewing studies before they are conducted, IRBs should conduct their reviews after the work is completed and before it is published, when the ethical issues can be confronted in detail and concretely.

In short, researchers should never fall into the trap of believing that because they obtained permission to undertake a particular study, they don't have to continue to make ethical decisions about their research. In the end, researchers are human beings, and studying other human beings will always bring up unanticipated issues and dilemmas. It is important that researchers make ethical decisions, not just legally defined, IRB-sanctioned permissible decisions.

Rik Scarce, a sociology graduate student at Washington State University, was not so fortunate. He spent 159 days in jail for contempt of court because he refused to share details of his research on radical animal rights organizations with a grand jury (Scarce, 1999). Roberto Gonzales, a sociologist at Harvard University, has to think about legal risks a lot; his work has involved interviewing young undocumented immigrants at risk of deportation (see "Conversations from the Front Lines").

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 83). W. W. Norton & Company. Kindle Edition.

1. First, in some instances the ethical requirement, and often the legal requirement, is not to maintain confidentiality. Laws vary from state to state, but in many places the law requires researchers to alert the authorities if they learn about a child who is suffering abuse or if the subjects they are studying threaten to harm themselves or others. Thus, researchers cannot give subjects a consent form that promises complete confidentiality. Instead, they must tell research subjects that if they learn specific types of information in the course of the study, they are required to report it.

2. Second, unlike priests and lawyers, researchers do not have the legal right to refuse to cooperate with law enforcement in order to protect their research subjects. Suppose a researcher learns about illegal behavior during the course of a study. A court of law can subpoena that information, and law enforcement can seize it with a warrant.

3. Third, social scientists can be called to testify in criminal or civil cases. So, if researchers learn about the sale or use of illegal drugs, a person's status as an undocumented immigrant, or other illegal activities or behaviors, they risk contempt of court and jail time if they refuse the legal system's requests for the data. Steven Picou, a sociologist who studies disasters, did a study from 1989 to 1992 on the impact of the Exxon Valdez oil spill on small Alaskan villages. Lawyers for Exxon subpoenaed his files. Picou had to persuade the court to allow Exxon access only to files that had been used for publication purposes, and the court arranged for the files to be released only to an Exxon sociologist, whom the court prohibited from releasing data that would identify individuals. This arrangement allowed Picou to protect the people to whom he had promised confidentiality and to avoid being held in contempt of court and facing jail time or fines (Israel & Hay, 2006).

Carr, Deborah. The Art and Science of Social Research (First Edition) (p. 85). W. W. Norton & Company. Kindle Edition.

Which of the following principles is included in the American Sociological Association (ASA) Code of Ethics?

The five general principles of the ASA Code of Ethics are: (1) professional competence, (2) integrity, (3) professional and scientific responsibility, (4) respect for people's rights, dignity, and diversity, and (5) social responsibility.

What are the six principles of the ASA code of ethics?

The six guiding principles are: (1) professional competence, (2) integrity, (3) professional and scientific responsibility, (4) respect for people's rights, dignity, and diversity, (5) social responsibility, and (6) human rights (ASA 2019).

How many ethical principles are included in the ASA code of ethics?

ASA's Code of Ethics consists of an Introduction, a Preamble, five General Principles, and specific Ethical Standards.

What are the principles of the code of ethics sociology?

There are six principles within the code of ethics: Professional Competence; Integrity; Professional and Scientific Responsibility; Respect for People's Rights, Dignity, and Diversity; Social Responsibility; and Human Rights.