Sociologists at Work
The Sociologists at Work feature exposes students to the importance and application of social science research.
Ordinary People and Cruel Acts
If a being from another planet were to learn the history of human civilization, it would probably conclude that we are tremendously cruel, vicious, and evil creatures. From ethnic genocides to backwoods lynchings to war crimes to school bullying, humans have always shown a powerful tendency to viciously turn on their fellow humans.
The curious thing is that people involved in such acts often show a profound capacity to deny responsibility for their behavior by pointing to the influence of others: “My friend made me do it” or “I was only following orders.” That leaves us with a very disturbing question: Can an ordinary, decent person be pressured by another to commit an act of extreme cruelty? Or, conversely, do cruel actions require inherently cruel people?
In a classic piece of social research, social psychologist Stanley Milgram (1974) set out to answer these questions. He wanted to know how far people would go in obeying the commands of an authority. He set up an experimental situation in which a subject, on orders from an authority figure, flips a switch, apparently sending a 450-volt shock to an innocent victim.
The subjects responded to an advertisement seeking participants in a study on memory and learning. On a specified day, each subject arrived at the laboratory and was introduced to a stern-looking experimenter wearing a white lab coat. The subject was also introduced to another person who, unknown to the subject, was actually an accomplice of the experimenter.
Each subject was told he or she would play the role of “teacher” in an experiment examining the effects of punishment on learning; the other person would play the role of the “learner.” The teacher was taken to a separate room that held an ominous-looking machine the researcher called a “shock generator.” The learner was seated in another room out of the sight of the teacher and was supposedly strapped to an electrode from the shock generator.
The teacher read a series of word pairs (e.g., blue–sky, nice–day, wild–duck) to the learner. After reading the entire list, the teacher read the first word of a pair (e.g., blue) and four alternatives for the second word (e.g., sky, ink, box, lamp). The learner had to select the correct alternative. Following directions from the experimenter, who was present in the room, the teacher flipped a switch and shocked the learner whenever he or she gave an incorrect answer. The shocks began at the lowest level, 15 volts, and increased with each subsequent incorrect answer all the way up to the 450-volt maximum.
As instructed, all the subjects shocked the learner for each incorrect response. (Remember, the learner was an accomplice of the experimenter and was not actually being shocked.) As the experiment proceeded and the shocks became stronger, the teacher could hear cries from the learner. Most of the teachers, believing they were inflicting serious injury, became visibly upset and wanted to stop. The experimenter, however, ordered them to continue—and many did. Despite the tortured reactions of the victim, 65% of the subjects complied with the experimenter’s demands and proceeded to the maximum, 450 volts.
Milgram repeated the study with a variety of subjects and even conducted it in different countries, including Germany and Australia. In each case, about two thirds of the subjects were willing, under orders from the experimenter, to shock to the limit. Milgram didn’t just show that people defer to authority from time to time. He showed just how powerful that tendency is (Blass, 2004). As we saw with the Rwandan genocide, given the “right” circumstances, ordinarily nice people can be compelled to do terrible things they wouldn’t have done otherwise.
Milgram’s research raises questions not only about why people would obey an unreasonable authority but also about what the rest of us think of those who do. A study of destructive obedience in the workplace—investigating actions such as dumping toxic waste in a river and manufacturing a defective automobile—found that the public is more likely to forgive those who are responsible when they are believed to be conforming to company policy or obeying the orders of a supervisor than when they are thought to be acting on their own (V. L. Hamilton & Sanders, 1995).
Milgram’s study has generated a tremendous amount of controversy. For over four decades, this pivotal piece of research has been replicated, discussed, and debated by social scientists (Burger, 2009). It has made its way into popular culture, turning up in novels, plays, films, and songs (Blass, 2004). Since the original study, other researchers have found that in small groups, people sometimes collectively rebel against what they perceive to be unjust authority (Gamson, Fireman, & Rytina, 1982). Nevertheless, Milgram’s findings are discomforting. It would be much easier to conclude that the acts of inhumanity we read about in our daily newspapers (such as soldiers raping civilians or killing unarmed noncombatants) are the products of defective or inherently evil individuals—a few “bad apples.” All society would have to do then is identify, capture, and separate these psychopaths from the rest of us. But if Milgram is right—if most of us could become evil given the “right” combination of situational circumstances—then the only thing that distinguishes us from evildoers is our good fortune and our social environment.
The Economics and Politics of Food
Institutional influence is sometimes not so obvious. For instance, we usually think of nutrition as an inherent property of the foods we eat. Either something is good for us or it isn't, right? And we trust that the nutritional value of certain foods emerges from scientific discovery. We rarely consider the economic and political role that food companies play in shaping our tastes and our dietary standards (Pollan, 2007).
Marion Nestle (2002), a professor of nutrition and food studies, wanted to examine the institutional underpinnings of our ideas about health and nutrition. She faced an interesting data-gathering dilemma, however. No one involved in the food industry was willing to talk to her “on the record.” So she compiled information from government reports, newspapers, magazines, speeches, advocacy materials, conference exhibits, and supermarkets. She also used information that she’d previously received from lobbying groups and trade associations representing diverse interests such as the salt, sugar, vitamin, wheat, soybean, flaxseed, and blueberry industries.
Despite alarming levels of hunger among the world's population (see Chapter 10), the United States has so much food that we could feed our citizens twice over. Many Americans regularly buy or prepare more food than they actually need (hence the popularity of "doggie bags" and leftovers). The food industry is therefore highly competitive. But as in all major industries, food companies are beholden to their stockholders rather than to the consuming public. Marketing foods that are healthy and nutritious is a company's goal only if it can increase sales.
Food marketers have long identified children as their most attractive targets. According to Nestle, the attention paid to children has escalated in recent years because of their increasing responsibility for purchasing decisions. Children between the ages of 6 and 19 are estimated to influence upward of $500 billion in food purchases each year (cited in Nestle, 2002). By age seven, most children can shop independently, ask for information about what they want, and show off their purchases to other children.
Soft drink companies have become especially adept at targeting young people with diverse marketing strategies. Soft drinks have replaced milk as the primary beverage in the diets of American children as well as adults (Nestle, 2002). Vending machines, which tend to be stocked with high-calorie soft drinks and sports drinks as well as other “junk” foods, exist in 17% of elementary schools, 82% of middle schools, and 97% of high schools (cited in Kalb, 2010). Nearly three-quarters of students who use campus vending machines buy sugar-sweetened beverages (Wiecha, Finkelstein, Troped, Fragala, & Peterson, 2006). The typical American teenage boy gets about 9% of his daily caloric intake from soft drinks, and about 20% of one- and two-year-olds regularly drink soda (Schlosser, 2001).
One of the most controversial marketing strategies in the soft drink industry is the "pouring rights" agreement, in which a company buys the exclusive right to sell its products in all schools in a particular district. For instance, Coca-Cola paid the Rockford, Illinois, school district $4 million up front and an additional $350,000 a year for the next 10 years to sell its beverages in the schools (cited in Philpott, 2012). In financially strapped districts, a pouring rights contract often supplies a significant part of the district's annual funding. It may be the only thing that allows a school system to buy much-needed resources like computers and textbooks. It's estimated that about 80% of American public schools have pouring rights contracts with either Coca-Cola or Pepsi (Philpott, 2012). And it's not just in schools. Coca-Cola and Pepsi continue to compete against each other for multimillion-dollar pouring rights contracts at youth sport complexes…and even at the Little League World Series (Cook, 2013).
Besides the lump sum agreed to in the contract, companies frequently offer school districts cash bonuses if they exceed certain sales targets. Hence, it is in the district’s financial interest to encourage students to consume more soft drinks. In light of such incentives, ethical implications and health concerns become secondary. Indeed, many school districts justify these agreements by saying that soft drinks pervade the culture and students will drink them anyway, so why not get some benefit?
Beyond the long-term health effects of heavy soft drink consumption, Nestle points out that students learn a somewhat cynical lesson: that school officials are sometimes willing to compromise nutritional principles (and students' physical well-being) for financial gain. Pouring rights contracts can also have a serious impact on long-term school funding. While they may solve short-term financial needs, they may also hamper efforts to secure adequate federal, state, and local funding for public education. Taxpayers may conclude that raising taxes to support public schools is unnecessary if the bulk of a district's operating budget comes from these commercial contracts.
In 2014, the United States Department of Agriculture released nutritional guidelines for snack foods sold in schools. The guidelines set limits on the calories and fat allowed, encourage schools to offer low-fat and whole-grain snack foods, and restrict the availability of sugary drinks. They don't, however, apply to after-school sporting events or fundraisers, where candy and soft drinks can still be sold (Nixon, 2013). In 2015, the Food and Drug Administration began taking steps to remove artificial trans fat from processed foods, and General Mills removed artificial colors and flavors from its breakfast cereals.
No matter what the outcome of these actions, soft drink and food companies will continue to play a significant role in school district budgets. Through that role, a child's food choices in school are linked deeply and profoundly to broader educational, political, and economic needs, often with less attention paid to nutrition and individual health.