University of California Press

No Safe Place

Toxic Waste, Leukemia, and Community Action

by Phil Brown (Author), Edwin J. Mikkelsen (Author), Jonathan Harr (Foreword by), Phil Brown (Preface by)
Price: $29.95 / £25.00
Publication Date: Oct 1997
Edition: 1st Edition
Title Details:
Rights: World
Pages: 284
ISBN: 9780520212480
Trim Size: 6 x 9

Read an Excerpt


Chapter 4. Taking Control: Popular Epidemiology


The public's knowledge of the Woburn problem stems solely from the residents' actions in discovering the leukemia cluster and pursuing the subsequent investigations. In researching this book, we read countless articles in newspapers, popular magazines, scientific journals, and health publications and followed television and radio coverage of the Woburn events. Uniformly, reporters and commentators view the Woburn citizens' efforts as the most powerful instance to date of a lay epidemiological approach to toxic wastes and disease. Although one of us (Phil Brown) some time ago coined the term "popular epidemiology" to describe Love Canal residents' organizing efforts, Woburn actually furnishes the first example of popular epidemiology strong enough to allow for the detailed formulation of the concept.

Traditional epidemiology studies the distribution of a disease or a physiological condition and the factors that influence this distribution. Those data are used to explain the causation of the condition and to point toward preventive public health and clinical practices. 1 In contrast, popular epidemiology is the process by which laypersons gather scientific data and other information and direct and marshal the knowledge and resources of experts to understand the epidemiology of disease. In some respects, popular epidemiology parallels scientific epidemiology, although the two may proceed in different forms and at different tempos. In some cases, such as the discovery of Lyme disease in the mid-1970s, laypersons solved an epidemiological mystery before trained scientists. Despite similarities to traditional epidemiology, however, popular epidemiology is more than a matter of public participation in traditional epidemiology. Popular epidemiology goes further in emphasizing social structural factors as part of the causative chain of disease, in involving social movements, in utilizing political and judicial remedies, and in challenging basic assumptions of traditional epidemiology, risk assessment, and public health regulation. Still, the starting point for popular epidemiology is the search for rates and causes of disease.

We shall restrict the label of adherent or practitioner of popular epidemiology to residents who develop and apply the work of popular epidemiology in their communities. Sympathetic scientists may become supporters of popular epidemiology, but lay involvement in the discovery and pursuit of disease in such cases as Woburn is so significant that we shall apply the term only to laypersons. Adherents believe strongly that science, like government, must serve the needs of the people. Just as they question the political apparatus that typically discourages lay investigations into toxic hazards, so do they question the detached attitude of many in the scientific community who champion supposedly value-neutral scientific methods.

Popular epidemiology is a pursuit of truth and justice on behalf of the public that involves both laypersons and professionals. Popular epidemiology is not merely a system of folk beliefs, although such beliefs certainly deserve attention from professionals. Most centrally, popular epidemiology unites lay and scientific perspectives in an effort to link science and politics. 2

Although our discussion of popular epidemiology focuses on toxic waste contamination, the approach is valid for many other phenomena such as nuclear plants, pesticide spraying, and occupational disease. Popular epidemiology is an extremely significant advance for both public health and popular democratic participation.

Defining the Problem

The Quality of Lay Observation

Popular epidemiology is important for medicine and society because people often have access to data about themselves and their environment that are inaccessible to scientists. In fact, public knowledge of community toxic hazards in the last two decades has largely stemmed from the observations of ordinary people. Similarly, most cancer clusters in the workplace are detected by employees. 3

Even before observable health problems crop up, lay observations may bring to light a wealth of important data. Pittsfield, Massachusetts, residents knew before any authorities did about polychlorinated biphenyls (PCBs) that leaked from storage tanks at a General Electric power transformer plant and polluted the Housatonic River and the local groundwater. 4 Yellow Creek, Kentucky, residents were the first to notice fish kills, disappearances of small animals, and corrosion of screens and other materials. As one resident put it, in discussing a successful struggle to clean up a PCB site in Marlboro, New Jersey: "You didn't have to be a scientist. Trees were down, grass wasn't growing. You'd think you were on the moon or something." 5 Helene Brathwaite, leader of a struggle to remove asbestos from schools in Harlem, made a similar claim: "Nobody knew any more about this than I did. If you assume you're going to get experts to help you, you're in trouble. Most of the time on environmental issues there are no experts, and if there were we wouldn't have these problems." 6

This "street-wise or creek-side environmental monitoring" 7 occurred in Woburn, where residents noticed water stains on dishwashers and a bad odor long before they knew of adverse health effects. Love Canal residents remembered persistent bad odors, rocks that exploded when dropped or thrown, leakage of sludge into basements, chemical residues on the ground after rainfall, and irritations on children's feet from playing in fields where wastes were dumped. 8 Residents of South Brunswick, New Jersey, noticed foul-tasting water and saw barrels labeled "toxic chemicals" dumped, bulldozed, and ruptured. 9

Judith Broderick, from Reading (next to Woburn), had previously noticed that she became ill for three months after exposure to chlorine leaks from a nearby factory. Later, she smelled the rotten-egg odor of hydrogen sulfide from decaying animal hides at a former glue factory. She remembers that "People felt nauseated. We had headaches, our eyes would burn, we had difficulty breathing, sleeping, eating." 10 Broderick knew that among the eight women of childbearing age on her block, there were six miscarriages and three stillbirths. At a nearby school, three of five pregnant teachers miscarried. When she looked out of her window she saw three special education buses coming to pick up children with various learning disabilities and handicaps. She thought, "Too many lost babies, too many damaged children." 11

Out of such observations, people develop "common sense epidemiology," 12 whereby they hypothesize that a higher than expected incidence of disease is due to pollution. In some cases, laypersons carry out their own study or initiate a study for experts to carry out. For example, in Pittsfield, a retired engineer was concerned about elevated cancer rates and known PCB contamination. He initiated a study which showed a high correlation between working for General Electric and having PCBs in the blood. Other residents then linked those blood levels with their knowledge of elevated cancer rates. 13 In Yellow Creek, Kentucky, a woman who helped organize a health survey remembered: "Every family told of kidney troubles, vomiting, diarrhea, rashes. One family showed us big welts right after they showered. And there were huge numbers of miscarriages. I cried every night. We gave our data to Vanderbilt University and they found high rates of these diseases. The Centers for Disease Control found some leukemia but said it wasn't statistically significant. Statistics don't tell you. People do. I've walked this creek and I've seen the sick people." 14

In 1973 a Michigan farmer, Rick Halbert, noticed that his cattle were becoming hunchbacked, bald, sterile, and crippled by overgrown hoofs before dying. He conjectured that those symptoms were caused by the cattle feed and carried out an experiment to test the idea. He fed twelve calves on that feed alone. Five died within six weeks, and most of the rest died during the next two months. Halbert reported these data to the Michigan Department of Agriculture, but they were not willing to repeat the experiment with cows. They gave the suspect feed to mice, all of which died, but the supply company argued that the animals died because they had eaten cattle feed rather than mouse food. Halbert then hired scientists who, employing a mass spectrograph, found bromine in the feed. Eight months after Halbert's first observations, investigators learned that Michigan Chemical Corporation had accidentally supplied the Michigan Farm Bureau with sacks of the flame-retardant chemical PBB (polybrominated biphenyl), which is known to cause cancer, genetic mutation, and birth defects. During the crucial eight-month period between the farmer's first observations and the discovery of the accident, a great deal of contamination had already occurred. Human breast milk was found to contain PBB; many farm animals were poisoned too. Tens of thousands of livestock and millions of chickens were slaughtered as a result. 15

Another example of lay detection is offered by the dioxin contamination in Moscow Mills, Missouri, one of several Missouri dioxin sites besides the well-known Times Beach. In 1971, horse rancher Judy Piatt noticed a strong smell after a waste hauler sprayed road oil to keep dust down in the stable area. The next day she saw dying sparrows; in the following weeks cats and dogs lost hair, grew thin, and died. Forty-three of eighty-five horses in the exposed area died within one year, and of forty-one newborn horses, only one survived. Three months later Judy Piatt's daughter was hospitalized with internal bleeding. Based on her supposition that the waste oil was responsible, Piatt followed the route of the salvage oil dealer for over a year, noting sites where waste oil and chemicals were dumped. She sent her information to state and federal officials, but no action resulted. It was three more years until the CDC found dioxin in the oil, at 30,000 parts per billion; anything over one part per billion is considered dangerous. 16

Patricia Nonnon of the Bronx provides an additional illustration of creative case finding. When her nine-year-old daughter contracted leukemia, she remembered hearing of other cases; in fact, there were four in the three-block area bordering on the Pelham Bay dump. Prior complaints to the state environmental agency had brought no results, so Nonnon tried a different approach. She set up a telephone hot line in 1988 and received more than 300 calls reporting many diseases: twenty-five cases of childhood leukemia, sixty-one cases of multiple sclerosis, ten lupus, nine Hodgkin's disease, and six rare blood diseases. All respondents lived less than a mile from the dump. Residents knew the landfill was hazardous, because a few years before several firms had been convicted of illegally dumping hundreds of thousands of gallons of toxic waste over a decade at five landfills, including Pelham Bay. 17

In addition to collecting information, laypersons employ logical tests of the relationship between location and health. One Love Canal resident reported: "As far as the relationship of this to the chemicals, let me put it this way, when we go away from here, we feel fine. We just spent a month out west, no eye problems, no nerve problems, felt good. I slept like a log. We're back home, we have the same problems again. Headaches, eyes, nerves, not sleeping." 18 A number of other Love Canal residents reported changes in their family's health after their official relocation. 19

In Woburn, residents were the first to notice the leukemia cluster, through both formal and informal methods of identification. Then they framed a hypothesis linking pollution to disease and pressed local, state, and federal agencies to investigate the cluster. In particular, they asked authorities to test the water that they suspected of being a cause. After state environmental officials found high concentrations of TCE and PCE in wells G and H, residents argued that those known carcinogens were the cause of the cluster. To bolster their hypothesis, Woburn residents joined with biostatisticians from the Harvard School of Public Health (SPH) to carry out the community health survey.

Without community involvement, this study would not have been possible because of the lack of money and personnel. The very fact of lay involvement led professionals and government to charge bias. Nevertheless, extensive analyses by the researchers demonstrated that the data were not biased, especially with regard to the use of community volunteers as interviewers. Resistance to the idea of lay participation is harmful, since professional and governmental distrust of the public can delay amelioration and cause additional disease and death.


The Myth of Value-Neutrality

Popular epidemiology opposes the widely held belief that epidemiology is a value-neutral scientific enterprise that can be conducted in a sociopolitical vacuum. It also challenges the belief that epidemiological work is properly conducted only by experts. Critics of the Harvard/FACE (For A Cleaner Environment) Woburn health study—among them the CDC, the American Cancer Society, the EPA, and even the SPH's Department of Epidemiology—argued that the study was biased by the use of volunteer interviewers and by prior political goals. The possibility of volunteer bias is a real concern, but on a deeper level the criticisms assumed a value-free science of epidemiology in which knowledge, theories, techniques, and actual and potential applications are themselves devoid of self-interest or bias.

As was the case in Woburn, popular epidemiology can include methodological and statistical controls for bias. Indeed, without skewing any evidence it can overcome some fundamental limitations of scientific endeavors. In practice science is limited by such factors as finances and personnel. Without popular participation it would be impossible to carry out much of the research needed to document health hazards. Science is also limited in its conceptualization of what problems are legitimate and how they should be addressed. As we have pointed out, physicians are largely untrained in environmental and occupational health matters, and even when they observe environmentally caused disease, they are unlikely to blame the disease on the environment. Similarly, epidemiologists and public health researchers are not sufficiently attuned to problems of toxic waste contamination. Funding agencies are reluctant to support the kinds of investigations needed at toxic waste sites. And, most fundamentally, scientific approaches to toxic waste contamination are directed by an old paradigm that no longer fits reality.

Environmental health activists are by definition acting to correct problems not adequately addressed by the corporate, political, and scientific establishments. Popular involvement is usually necessary for professionals to target the appropriate questions, as is clear from the history of the women's health movement, 20 the occupational health and safety movement, 21 and the environmental health movement. 22 These movements have significantly advanced public health and safety by pointing out otherwise unidentified problems and showing how to approach them, by organizing to abolish the conditions that give rise to them, and by educating citizens, public agencies, health care providers, officials, and institutions. Popular participation brought to the national spotlight such phenomena as DES, Agent Orange, asbestos, pesticides, unnecessary hysterectomies, abuse of sterilization, black lung disease, and brown lung disease.


Issues of Scientific Method

Standards of Proof

Despite the successes of popular epidemiology, we must take a closer look at critics' concerns about breaches of scientific method. Does popular epidemiology adhere to the appropriate standards of proof? Authorities in fact disagree on the level of statistical significance required for intervention in environmental hazard settings. Many communities that believe they have uncovered environmental health risks find themselves challenged because they lack enough cases to achieve statistical significance. Some professionals who work with community organizations stick closely to accepted standards of statistical significance, 23 while others argue that such levels are as inappropriate to environmental risk as they are to other issues of public health and safety, such as bomb threats and possible epidemics. 24

We believe it is imperative to follow Ozonoff and Boden, who distinguish between statistical significance and public health significance. An increased rate of disease may be of great public health significance even if conventional thresholds of statistical significance are not reached. Further, clinical medicine tends to err on the safe side of false positives (claiming a relationship when there is none), and epidemiology should mirror clinical medicine rather than laboratory science. 25 Some researchers have noted that recent epidemiological research has a tendency to accept an increasingly lower level of false positives. 26
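Ozonoff and Boden's distinction can be made concrete with a small, hypothetical calculation (a sketch only; the case counts are invented for illustration and are not Woburn's actual figures). Suppose background rates predict 2 childhood leukemia cases in a town over a study period, but 5 are observed—a 2.5-fold excess. The exact Poisson upper-tail probability of seeing that many cases by chance is about 0.053, just missing the conventional 0.05 cutoff:

```python
import math

def poisson_tail(k, lam):
    """P(X >= k) for a Poisson-distributed count with mean lam."""
    # Sum the lower tail exactly, then take the complement.
    lower = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - lower

# Hypothetical town: 2 cases expected, 5 observed -- a 2.5x excess.
expected = 2.0
observed = 5
p = poisson_tail(observed, expected)
print(f"P(>= {observed} cases | {expected} expected) = {p:.4f}")
```

A cluster like this would fail a conventional significance test, yet in Ozonoff and Boden's terms it may still be of obvious public health significance—which is precisely the gap that leaves small communities unable to "prove" what they observe.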

Traditional epidemiologists prefer false negatives (type II error) to false positives (type I error); that is, they would prefer falsely to deny an association between variables when there is one than to claim an association when there is none. 27 To achieve scientific statements of probability requires more evidence than is necessary to state that something should be done to eliminate or minimize a health threat. In the view of one observer,

The degree of risk to human health does not need to be at statistically significant levels to require political action. The degree of risk does have to be such that a reasonable person would avoid it. Consequently, the important political test is not the findings of epidemiologists on the probability of nonrandomness of an incidence of illness but the likelihood that a reasonable person, including members of the community of calculation [epidemiologists], would take up residence with the community at risk and drink from and bathe in water from the Yellow Creek area or buy a house along Love Canal. 28
Indeed, those are the kinds of questions presented to public health officials, researchers, and government members in every setting where there is dispute between the perceptions of citizens and officials.

For residents near toxic waste sites, proved toxicity is not required for alarm and action. Someone in Love Canal put it well:

I think the most important question that people ask, they always ask, "Well, how do the chemicals affect your family?" That really has nothing to do with it, because having two children, and living in that neighborhood, you had no choice. You had to get out of there whether the chemicals affected us or not. You cannot live a good, happy life always wondering. 29
In a further excursion into the politics of epidemiology, Beverly Paigen, a scientist with the New York State Department of Health who was instrumental in aiding the Love Canal residents, discusses a conversation with an epidemiologist from her office:

We both agreed that we should take the conservative approach only to find out that in every case we disagreed on what the conservative approach was. To him, "conservative" meant that we must be very cautious about concluding that Love Canal was an unsafe place to live. The evidence had to be compelling because substantial financial resources were needed to correct the problem. To me, "conservative" meant that we must be very cautious about concluding that Love Canal was a safe place to live. The evidence had to be compelling because the public health consequences of an error were considerable. 30
Paigen offers valuable insight into the scientist's choice between type I error (being more willing to accept false positives) and type II error (being more willing to accept false negatives).

The degree to which one is willing to make one or the other kind of error is a value judgment and depends on what one perceives to be the consequences of making the error. To conclude that something is real when it is not means that a scientist has followed a false lead or published a paper that later turns out to be incorrect. This may be embarrassing and harmful to a scientist's reputation. In contrast, to ignore the existence of something real means that a scientist fails to make a discovery. This may be disappointing but it does not harm the scientist's reputation, so the scientist is more willing to make type II errors. However those charged with protecting public health and safety should be much more concerned about the second type of error, for a hypothesis that is not recognized drops out of sight. 31
Of her own scientific experience, she writes:

Before Love Canal, I also needed a 95 percent certainty before I was convinced of a result. But seeing this rigorously applied in a situation where the consequences of an error meant that pregnancies were resulting in miscarriages, stillbirths, and children with medical problems, I realized I was making a value judgment. 32
Paigen argues that the value judgment involves deciding "whether to make errors on the side of protecting human health or on the side of conserving state resources." The same logic applies to the choices between protecting public health and accepting "environmental blackmail" and the primacy of corporate development and profit. 33

Thus the competing paradigms of risk are not merely clinical and epidemiological, but also intensely political. On the one hand we have what David Dickson terms the "technocratic paradigm," in which the desire to protect the business community shapes regulation. In contrast, the "democratic paradigm" starts from the victims' perspective, values safety over profit, requires less than conclusive proof in order to take action, and provides those likely to be affected with an active voice in determining risk and making decisions. 34


How Good Are Official Data?

Even when citizens accept standard significance levels, they may suspect that the collection and analysis of official toxic hazard data are erroneous. Massive public complaints about Massachusetts' response to excess cancer rates in twenty communities (including Woburn) led to evaluations by the state senate and the University of Massachusetts Medical School, which found that the studies of those excess rates by the Massachusetts Department of Public Health (DPH) were poorly conceived and methodologically weak. 35 The DPH studies were often unclear about what problem concerned the community or the DPH. Most studies had no adequate hypothesis, failed to mention potential exposure routes, and as a result rarely defined the geographic or temporal limits of the population at risk. Methods were presented inconsistently, statistical terminology was confused, case definitions were weak, and environmental data were rarely presented. Further, statistical tests were inappropriately used to explain away problems. In the case of elevated cancer in towns of Upper Cape Cod, initial analysis found no towns with excess rates. When a later analysis combined two adjacent towns, the data were compelling for the elevation of several cancer types, yet the DPH claimed that excess lung, colon, and rectal cancers had life-style rather than environmental causation. 36 Activists and sympathetic scientists find such arguments unacceptable; they view them as blaming the victims while denying possible external causes.

The Massachusetts commissioner of public health appointed a Study Commission on Environmental Health Issues in 1983 to report on problems such as these. The Study Commission's complaints mirrored those of the citizens:

Despite its public mandate, DPH is seen by such members of the community as an obstacle and an adversary, not as an agency that is helpful and sensitive to their needs. The Environmental Health Services Bureau routinely responds to a citizen concerned with local chemical contamination by minimizing the problem, stating, in effect, "We can't bother to investigate every unsubstantiated claim or concern," or, "When you can provide some hard scientific or statistical evidence, please call us again." 37
The commission found that despite its mandate to err on the conservative side of health protection, the DPH imagined the political risks to be too great and took the opposite tack to avoid antagonizing industry. 38

Massachusetts is a particularly interesting example because the damaging effects of the poor studies and nonresponsiveness to the community included the resignation of the public health commissioner. Elsewhere, researchers often used exposed groups diluted by unexposed individuals, as well as comparison groups that were inappropriate because they were likely to be exposed to the same health threats as the exposed groups. 39 In the Michigan case of PBB-contaminated cattle feed, the state health department carried out a study so poorly designed that 70 percent of the control animals showed the presence of PBB. But the state would not admit the flaws in its work. 40

Lay individuals and groups trying to uncover and remedy environmental hazards have far fewer scientific and financial resources than do government units and are therefore at a scientific disadvantage. 41 Most communities that consider themselves at risk or as victims of environmental disasters have no stable source of scientific data. If they are lucky, they can mobilize local scientific support. But university-based scientists frequently consider applied community research to be outside the regular academic structure of challenge and reward. Often they see the work of uncovering environmental problems as fairly routine compared with work on frontiers of science such as molecular biology. 42 Furthermore universities have become increasingly dependent on corporations and government for support, and scientists have lost both autonomy and the urge to challenge established authority. 43

Scientists who ally themselves with citizen efforts are sometimes punished. When Beverly Paigen aided Love Canal residents in their health studies, she was harassed by her superiors. After Paigen spoke out publicly, the New York Department of Health withdrew a grant application she had written without telling her. They refused to process papers on another grant already funded, thus denying her the funds. She was told that because of the "sensitive nature" of her work, all grants and research ideas had to go through a special review process. Her professional mail was opened and taped shut, and her office files were searched after working hours. Paigen's state tax return was audited, and she saw in her file a clipping about her Love Canal work. Later, the state tax commissioner wrote her and apologized. Paigen was not the only scientist harassed for siding with Love Canal residents. William Friedman, regional director of the Department of Environmental Conservation, and Donald McKenna, senior sanitary engineer in the regional office, were demoted and transferred, respectively, for raising questions about the state's investigation of Love Canal. 44

Similar cases of retaliation have been documented elsewhere. Melvin Reuben, director of the Experimental Pathology Laboratory at the Frederick Cancer Research Facility, Frederick, Maryland, was forced to resign for warning that malathion was carcinogenic; Irwin Billick, head of the Division of Environmental Research at the U.S. Department of Housing and Urban Development, was fired for "unnecessary" work on lead poisoning. 45 In 1981, Peter Infante, a staff scientist for the Occupational Safety and Health Administration (OSHA), was threatened with dismissal for too energetically reporting the carcinogenicity of formaldehyde. 46

A cardinal assumption of scientific research is that the truth and validity of science are affirmed through open access to data, yet lay inquiry into environmental health risks is often obstructed by secret scientific data and analysis. Officials sometimes withhold information on the basis that it will alarm the public, that the public does not understand risks, or that it will harm the business climate. 47 The above-mentioned University of Massachusetts evaluation of DPH studies of excess cancer rates grew out of public pressure on the legislature. Citizens were angry that the health department did not seek the input of citizens, communicate data to affected towns, or share information when asked. Local health officials reported that they typically heard of elevated cancer rates through the media rather than from the DPH. Perhaps a good deal of the governmental and professional resistance to popular epidemiology derives from the fact that lay efforts very often point to the flaws and biases in official data that we have noted.

Lay investigations of environmental contaminants require full information. Although the right to know is usually associated with workers' right to know of toxic hazards in the workplace, the notion of a community right to know has developed recently. Community groups want all existing data to be available to them whether or not there are identifiable health effects. Although the agencies safeguarding the data defend secrecy on the ground that people will become alarmed, the people who request data from state health departments and cancer registries are clearly already alarmed. Another official excuse is that the media may make a story out of nothing. "Media hype" occurs throughout society, however, and is simply the price of democratic access to information. 48

Federal secrecy in investigations of the Woburn cluster violated residents' right to know. As mentioned previously, the EPA conducted a secret investigation of Woburn and thereby denied the public access to important scientific data. Former EPA administrator William Ruckelshaus formed the study group in 1984, but its existence was not discovered until May 1988. 49 That episode is reminiscent of the New York State Department of Health's response to Love Canal. Following a long train of events, in which the state did not release data to independent scientists, Governor Carey appointed a special commission chaired by biologist Lewis Thomas. Activists who invoked the New York Freedom of Information Act learned that the commission violated the law by not announcing meetings and not holding them publicly. 50 Indeed, knowledge makes a difference. One large population survey determined that the level of concern for environmental toxics rose with the number of information sources. 51 Another study found a positive correlation between information about the Diablo Canyon nuclear plant and the opposition to licensing it. 52

In addition, the federal government has dramatically affected the contours of environmental issues by loosening acceptable levels of toxic hazards and weakening regulatory agency enforcement. 53 The Reagan administration reduced controls on air and water pollution, stalled the recognition and prevention of acid rain, and reduced the regulatory power of EPA and OSHA. In December 1987 the Supreme Court limited the scope of the Clean Water Act when it ruled that citizens and environmental groups cannot sue companies for past violations. 54 Since people have long been denied access to necessary information, this decision is particularly unfair.


Government Resistance

In looking at official resistance to acknowledging toxic waste contamination . . .

About the Book

Toxic waste, contaminated water, cancer clusters—these phrases suggest deception and irresponsibility. But more significantly, they are watchwords for a growing struggle between communities, corporations, and government. In No Safe Place, sociologists, public policy professionals, and activists will learn how residents of Woburn, Massachusetts, discovered a childhood leukemia cluster and eventually sued two corporate giants. Their story gives rise to questions important to any concerned citizen: What kind of government regulatory action can control pollution? Just how effective can the recent upsurge of popular participation in science and technology be? Phil Brown, a medical sociologist, and Edwin Mikkelsen, psychiatric consultant to the plaintiffs, look at the Woburn experience in light of similar cases, such as Love Canal, in order to show that toxic waste contamination reveals fundamental flaws in the corporate, governmental, and scientific spheres.

The authors strike a humane, constructive note amidst chilling odds, advocating extensive lay involvement based on the Woburn model of civic action. Finally, they propose a safe policy for toxic wastes and governmental/corporate responsibility. Woburn, the authors predict, will become a code word for environmental struggles.

About the Author

Phil Brown is Professor of Sociology at Brown University and Lecturer in Sociology, Harvard Medical School Department of Psychiatry. Edwin J. Mikkelsen is Director of the Division of Child Psychiatry at the Massachusetts Mental Health Center and Associate Professor, Harvard Medical School Department of Psychiatry.

Table of Contents

Foreword, by Jonathan Harr
Preface (1997)
Preface (1990)
Acknowledgments
Introduction
Town in Turmoil: History and Significance of the Woburn Cluster
The Formation of an Organized Community
The Sickness Caused by "Corporate America": Effects of the Woburn Cluster
Taking Control: Popular Epidemiology
Making It Safe: Securing Future Health
Bibliography

Reviews

"An excellent and readable account of the toxic waste crisis in Woburn, Massachusetts, and the courageous efforts by local citizens to protect their community. The Woburn story is an inspiring lesson for citizens across the country struggling to protect the environment from polluters and unresponsive government officials."—Senator Edward Kennedy