Medicalization: Sociological and Anthropological Perspectives

Peter Conrad, Meredith Bergey, in International Encyclopedia of the Social & Behavioral Sciences (Second Edition), 2015

Abstract

Medicalization is the process by which nonmedical problems become defined and treated as medical problems, often requiring medical treatment. The term medicalization first appeared in the sociology literature and focused on deviance, but it soon expanded to examine other human conditions. This article focuses on sociological and anthropological perspectives on the expansion of medicalization. This includes a review of the characteristics, origins, and consequences of medicalization, with sociology's emphasis on the emergence of medical categories and anthropology's emphasis on biomedicine's institutional power and cultural authority. Medicalization has numerous social consequences, including the pathologization of human differences and the individualization of human problems while minimizing social and political context.

URL: https://www.sciencedirect.com/science/article/pii/B9780080970868640205

Gender and Health Care

R.R. Anspach, in International Encyclopedia of the Social & Behavioral Sciences, 2001

4.2 Medicalization and Medical Technology

Medicalization is often accompanied by the growth of medical technologies. In fact, medicalization is encouraged by drug companies, which have a stake in promulgating the idea that pregnancy, PMS, and menopause are diseases requiring treatment.

Early studies of estrogen replacement therapy, DES, and the Dalkon Shield focused on their rapid diffusion into medical practice before their consequences were known and on the drug companies' delay in recalling these technologies even after their risks had become apparent (see Riessman 1983). This pattern, however, has occurred with other medical technologies unrelated to women's health.

More recent research has focused on the social consequences of new reproductive technologies, such as genetic screening, in vitro fertilization, and fetal surgery. Some new reproductive technologies have diminished women's control in decision making, while augmenting the power of professionals, social movements, or the state. For example, the use of implantable contraceptives by the poor or by women in developing nations raises the spectre of a powerful group or society controlling the fertility of the less powerful (see Ruzek et al. 1997). New diagnostic technologies diminish the importance of women's experience of their bodies and force them to depend on experts to interpret the new technologies. With the development of ultrasound, for example, women became less reliant on their own experience of fetal movements and more reliant upon experts to help them ‘see’ the fetus in an otherwise obscure sonogram (Duden 1993).

Some commentators fear the potential of some new medical technologies to devalue the mother while increasing the social value of the fetus. For example, the new specialty of fetal surgery results in the creation of an unborn patient—the fetus—while the mother becomes peripheral to the treatment process (Casper 1998). This trend has culminated in the fetal rights movement, in which African-American mothers accused of using drugs have been prosecuted for allegedly endangering fetal health or even forced to undergo Caesarian sections. In these cases, women lose control of reproduction, which has been taken over by the state.

Most critics of new medical technologies do not simply assume that all medical technologies are intrinsically harmful to women. Rather, most take the more nuanced view that the consequences of any technology depend largely on the social context in which it is deployed.

URL: https://www.sciencedirect.com/science/article/pii/B0080430767038596

Pharmaceutical Industry: Political Economies of Drugs and Knowledge

Sergio Sismondo, in International Encyclopedia of the Social & Behavioral Sciences (Second Edition), 2015

Illness: From Medicalization to Pharmaceuticalization

Medicalization became a major issue in the 1960s, as critics railed against the authority of physicians and the medical profession as a whole to take control over intimate events and processes, often simply through classification. Criticism of medicalization was especially prominent in challenges to psychiatry by scholars such as Michel Foucault and Thomas Szasz. Their work was later picked up and applied in different realms, especially by feminist scholars. As a result, there have been challenges to mental health diagnostic categories ranging from schizophrenia to anxiety and depression, and to the medicalization of ordinary life events and stages, such as childbirth and menopause.

The focus on professions has lost some of its traction, and physicians today appear to be only one set of actors in struggles for control over bodies, health, and illness. Much medicalization, then, looks as though it is merely grease for the wheels of ‘pharmaceuticalization’ (e.g., Williams et al., 2011). This claim is made even more plausible by the fact that some of the standard objects of critics' attention, such as anxiety, depression, and menopause, are closely associated with new classes of drugs. To increase their sales, pharmaceutical companies engage in ‘selling sickness’ or ‘disease-mongering’: expanding awareness of diseases for which their drugs can be prescribed, and increasing the likelihood that people will see themselves as having those diseases. Moreover, at least some pharmaceuticalization is independent of medicalization, when people use drugs as remedies for difficulties that they do not necessarily see in medical terms, such as excess weight, sleepiness, lack of focus, and shyness (Williams et al., 2011).

The category of depression is widely understood as having been affected by the availability of drugs. Before the 1980s, depression was a relatively uncommon diagnosis, and was generally associated with the elderly (Healy, 2004). Beginning with the arrival on the market of Eli Lilly's drug Prozac in 1987, however, ever-increasing numbers of people have been diagnosed with, and disabled by, depression (Whitaker, 2010). Diagnostic criteria for depression have continually broadened, and thus estimates of its prevalence have risen dramatically. It is now the ‘common cold’ of mental disorders.

Prozac was the first widely prescribed member of the Selective Serotonin Reuptake Inhibitor (SSRI) family of drugs, now often called simply ‘antidepressants.’ Companies selling SSRIs have marketed both the drug and the disease. They have invested heavily in research on depression and antidepressants. They have promoted a serotonin-deficiency theory of depression, for which there is little evidence (Whitaker, 2010). They have established close connections with psychiatrists and other physicians who write textbooks, articles, and clinical practice guidelines (Healy, 2004). Through such means and others, the companies appear to have successfully established the disease both medically and culturally, helping physicians see it often and helping patients interpret their feelings and experiences in terms of it – perhaps even shaping their identities around it. The World Health Organization predicts that within 20 years more people will be afflicted with depression than with any other health problem.

Depression may seem like a special case, because it is a mental illness and because the boundaries between the disorder and sadness are malleable. However, many examples of somatic or bodily illness have been shown to have been strongly affected by marketing efforts, including such common chronic diseases as hypertension, diabetes, high cholesterol, and osteoporosis (Greene, 2007; Moynihan and Cassels, 2005). To take one of these examples, when the pharmaceutical company then known as Merck Sharp & Dohme introduced its anti-hypertensive drug Diuril in 1957, high blood pressure was a sign associated with an underlying disease, but was not itself generally viewed as a problem to be controlled. Even the definition of high blood pressure was a difficult matter, since blood pressure had a Gaussian distribution, and any dividing line between high and normal seemed arbitrary. Diuril dramatically lowered blood pressure, apparently only in people with elevated levels, suggesting that it might address root causes of the problem. In conjunction with an unprecedentedly successful marketing campaign, its effects established hypertension as a disease in itself (Greene, 2007). The mildness of side effects of diuretic drugs made it straightforward for companies and their agents to argue for their use to treat ever-broader populations with ever-lower blood pressures. Recommendations followed suit, and blood pressures that were once judged as within a normal range became high. Thus, people came to understand themselves as hypertensive or as disposed to hypertension; in at least one population, hypertensive African-Americans, the condition has become linked to racial identity (Pollock, 2012). The pharmaceuticalization process has continued as new and different antihypertensive drugs have been found and marketed.
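
The arithmetic behind this threshold argument can be made concrete with a brief sketch. The figures below are purely illustrative assumptions (a hypothetical population with systolic blood pressure normally distributed around 120 mmHg with a standard deviation of 15 mmHg), not data from Greene (2007) or Pollock (2012); they simply show how lowering a diagnostic cutoff on a roughly Gaussian trait multiplies the share of a population labeled as diseased.

```python
from math import erf, sqrt

def fraction_above(cutoff, mean, sd):
    """Upper-tail probability of a normal distribution: the share of a
    population whose measurement exceeds a diagnostic cutoff."""
    return 0.5 * (1 - erf((cutoff - mean) / (sd * sqrt(2))))

# Hypothetical population: systolic blood pressure ~ N(120 mmHg, 15 mmHg).
mean, sd = 120, 15
for cutoff in (160, 140, 130):
    share = fraction_above(cutoff, mean, sd)
    print(f"cutoff {cutoff} mmHg -> {share:.1%} of the population labeled 'high'")
```

Under these assumed values, a 160 mmHg cutoff labels roughly 0.4% of the population as hypertensive, a 140 mmHg cutoff about 9%, and a 130 mmHg cutoff about 25%: the same bodies, redefined as diseased by the placement of an arbitrary line.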

Chronic conditions such as hypertension neatly fit a new model of health and illness. Whereas people once generally considered themselves healthy unless they felt ill or had visible symptoms, that picture has given way to a new model of health (Dumit, 2012). The past half-century has seen the rise of risk factors: familiar ones like diet, age, and sleep patterns, and unseen and less familiar ones like cholesterol levels and high blood pressure. As a result, we are all at risk, differing only in our degrees of risk. Partly following from the rise of risk factors, we can be normal and unhealthy at the same time, at least when there is some hope for treatment. For example, the results of aging used to be simply unfortunate, but now we look for medical means to stave them off or treat them. There is now no contradiction in the thought that people are unhealthy in one respect or another.

URL: https://www.sciencedirect.com/science/article/pii/B9780080970868850540

Anxiety, EEG patterns, and neurofeedback

Jane Price M.A., Thomas Budzynski Ph.D., in Introduction to Quantitative EEG and Neurofeedback (Second Edition), 2009

II The Anxiety State

The medicalization of anxiety states has undoubtedly contributed to the search for health care, given that the general attitude is that to be in an uncomfortable emotional state is to be mentally disordered. Webster’s College Dictionary, 4th edition (2000) defines anxiety as:

1. “A state of being uneasy, apprehensive or worried about what may happen; concern about a possible future event”; and

2. (Psychiatry) “An abnormal state like this, characterized by a feeling of being powerless and unable to cope with threatening events (typically imaginary) and by physical tension, as shown by sweating, trembling, etc.”

One question is whether psychiatry and psychology should discriminate between the expressions of fear and anxiety. For example, Cromer (2004) notes that fear is the state of alarm elicited by a perceived serious threat to one’s well-being, while anxiety more often refers to a vague sense of being in danger, with an inability to pinpoint the cause. The distinction between the layman’s view and psychiatry’s emphasis is that the anxiety disorders described in the DSM-IV possess an element of exaggerated fear. Common symptoms in DSM-IV also include chronic worry, restlessness, muscle tension, sleep disturbance, and subjective feelings of distress. Psychiatry would say that while most, if not all, individuals occasionally experience this distress, to be diagnosed with an anxiety disorder, symptoms must be abnormally severe, disabling, frequent, easily triggered, and/or long lasting. How easy it is to slip into a diagnosed status. The therapist or anti-anxiety drugs often serve as substitutes for our family network, our community catchment, and our churches.

URL: https://www.sciencedirect.com/science/article/pii/B9780123745347000174

Max Hopwood, Carla Treloar, in Interventions for Addiction, 2013

A Sociological Analysis of Medical Harm Reduction

The medicalization of harm reduction is a product of the times; medical harm reduction is a movement in step with globalization, a phenomenon that has led to significant developments in the functioning of capitalism. The emergence of the medical harm reduction movement is part of an overall shift away from addressing social problems through the coercive power of the state (i.e. via law enforcement and the military) toward individuals' governance of self and others. Neoliberal values of individualism and self-regulation are salient in the language and interventions of medical harm reduction. Today, harm reduction is a regime of (self) governance predicated upon a scientific calculation of risk based on epidemiological analysis. Epidemiological evidence informs statistical models of risk practices and risk communities, with the aim of developing harm reduction interventions that encourage and normalize self-regulation. An example is the “injecting drug use community,” a geographically and demographically unbounded collective that shares a statistically determined, population-level susceptibility to poor health outcomes such as HIV infection. The medical harm reduction movement has become critically important for public health through teaching “community” self-management risk-reduction interventions, like using sterile equipment for every injection.

Within a neoliberal framework, the onus is on illicit drug-using individuals to negotiate risk, much of which is generated by the state via prohibitive drug laws and the operations of police. People from lower socioeconomic circumstances are disproportionately affected by drug-related harm relative to those from the middle classes, partly because they have less access to healthier but more expensive options and show poorer uptake of, and compliance with, modes of self-regulation. Medical harm reduction ignores the constraints on “choosing” healthy practices and lifestyles that many people experience. Individualism and the ideology of personal responsibility can also undermine social efforts to improve health, for example community-based capacity-building programs that use arts, sporting, cultural, educational, and similar activities to build resilience and “protect” people and communities from the anomie, alienation, boredom, and dislocation that often lead to problematic drug use. An overemphasis on personal responsibility also increases the likelihood that health-related stigma will emerge. Individuals or groups whose lifestyle practices are deemed a personal or community health risk, like people with HIV or viral hepatitis, are held responsible for their own health problems and often viewed as a drain on resources and a threat to others.

Medical harm reduction is also criticized for lacking a coherent theoretical framework, which among other problems makes it difficult to accurately define. Commentators claim that to become a unified paradigm, harm reduction must: (1) maintain its successful programs in injecting-related HIV prevention and (2) foreground human rights to challenge entrenched power, double standards around drug use, and abuses against all people who use any illicit drug. This will be challenging as the value of harm reduction has long been measured by its capacity to prevent transmission of HIV infection, and social and political analyses highlight how the cost-benefit analysis of HIV prevention for governments and wider societies underpins harm reduction's growing international acceptance. By focusing attention on interventions that halt the spread of HIV, harm reduction programs are acceptable enough to politicians, religious leaders, and members of the wider community. On the other hand, advocating for illicit drug users' human rights and drug law reform leaves harm reduction politically vulnerable to opposition from powerful conservative groups and a public that has been socialized to see illicit drug users as criminal deviants.

A reluctance to engage with broader social issues has led some commentators to characterize medical harm reduction as a conservative movement that is dominated by health professionals, academics, and bureaucrats. For the most part, medical harm reductionists are comfortable attending to a set of health risks for people who inject that are largely created by the systems under which we all live and work. In this way, the medical model of harm reduction enables states to continue harming illicit drug users without taking responsibility for state inaction on prohibition, poverty, stigma, and social marginalization. The voice of illicit drug users is silenced by biomedical imperatives regarding population health. Indeed, the success of HIV prevention strategies has reduced the incentive to address more socially and politically difficult dynamics like economic marginalization and punitive illicit drug policies, which continue to damage large numbers of people who use illicit drugs.

The transition in harm reduction over the past 30 years from an activist movement to a medical model has prompted renewed interest in advocacy in the new millennium. Prior to the HIV epidemic, harm reduction was imbued with a strong communitarian ethos. It was a movement that represented the interests of people who used any illicit drugs, not just those who injected opioids at risk of HIV infection. Harm reduction had an expansive agenda to advocate for drug law reform, to protect the dignity and human rights of people who use illicit drugs, to tackle poverty, and to work toward finding long-term solutions to the world's illicit drug problems. Against this background, the medical model of harm reduction represents a compromise: a band-aid solution that very effectively addresses the immediate risks associated with injecting drug use but has nothing to offer in the way of effecting significant improvements beyond the prevention of physical disorders, as important as that is. People involved in harm reduction advocacy today demand wider social and political change and believe that medical harm reduction must become engaged with ethics in order to optimize health-related outcomes. A reinvigoration of political activism within harm reduction is drawing attention to the impacts of poverty and marginalization on health-related decision-making, and it is encouraging to note that protection of illicit drug users' human rights has begun to emerge in the discourse of medical harm reduction. Harm Reduction International recently highlighted issues such as the systemic torture of people who were forced to attend drug detention centers in countries like China and Cambodia. This is seen as a welcome development, but a small step on a long path ahead.

URL: https://www.sciencedirect.com/science/article/pii/B9780123983381000750

End of Life Choices

Frances Norwood, in International Encyclopedia of the Social & Behavioral Sciences (Second Edition), 2015

Future Directions and Alternative Choices at the End of Life

While medicalization of the end of life may dominate, alternative choices are working to transform how people around the world experience the end of life. Some would argue that less medical attention and more caring for the social person is what is called for at the end of life, and a number of nonmedical approaches to dying are gaining momentum. Palliative care and hospice are on the rise in the US, Europe, and other places around the world and are beginning to offer a real alternative to medicalized ways of dying. Palliative care, also called comfort care, varies extensively from place to place (Broom et al., 2012; Bruera et al., 2000; Hendry et al., 2013; Higginson et al., 2003; Kellehear, 2000; Lawton, 2000). In some places, palliative care may simply be a retooling of medical care to focus on relieving symptoms and managing pain in the final hours and days of life once all curative measures have ceased; in other places, however, it encompasses something broader – prolonged practices that strive to address the needs of the whole person, including spiritual, existential, and emotional as well as physical needs at and near the end of life. Palliative care is limited, however, in places like the US, where it is largely a retooling of medical care and hospice is equated with the final hours and days of dying. When this occurs, palliative care and hospice are accessed too late in the life course to offer a real alternative to medicalized dying. In places like the Netherlands, however, where whole-patient tenets of palliative care were never extricated from what primary care physicians do, broader forms of caring within the medical establishment exist, offering Dutch people of all ages access to more holistic care throughout the life course (Norwood, 2009: 93–94).

A resurgence of hospital chaplains and spirituality in health care in recent times offers another alternative to the medical pathway at the end of life (Aldridge, 2000; Cadge, 2013; Kellehear, 2000; Puchalski, 2006; Young and Koopsen, 2010). The modern-day chaplains are immersed in a world of hospital medicine, but at the margins they have been able to reestablish a place at the bedside of the sick and dying. In an ethnographic study of hospital chaplaincy in 1997, I found chaplains who were able to forge a strategic, if ambivalent, existence within hospital medicine by alternately embracing and distancing themselves from competing discourses of religion and medicine in order to offer patients what medicine cannot. They bear witness; they bring skin-to-skin touch and comfort through religious ritual at a time when medical treatments are no longer viable. Processing what it means to die, once largely the domain of priests, chaplains, and religion, now rests in many hospitals in the hands of all-faith chaplains who (when skilled) listen without proselytizing and witness without imposing their own agenda (Norwood, 2006a).

Finally, home- and community-based supports for persons who are aging and living with chronic illness and decline are on the rise in Europe and the US. In the United States, for example, new supports are emerging, boosted in part by the passage of the Patient Protection and Affordable Care Act of 2010. These supports are designed to keep more people at home or in community-based settings and are increasingly organized around holistic, person-directed models of long-term care (Gleckman, 2009; SAL, 2001; Thomas, 1996). In places around the US, for instance, a number of grassroots organizations are designed to help people remain in their own home or community as they age. The Village movement is a community-driven initiative in which neighborhoods of persons who are aging band together, paying annual membership fees to support an executive director and office space. Members then have only to request assistance – a ride to the grocery store, assistance picking up prescriptions, or help accessing home health services or supports – and organized volunteers from the community provide these services (Gleckman, 2009: 122–124). When persons can no longer live at home on their own, assisted living has emerged in places around Europe and the US to offer a very different model of residential care compared to the more traditional medical-model nursing home. Pioneered in the US by Paul and Terry Klaassen, founders of Sunrise Senior Living, and expanded by Bill Thomas through the Eden Alternative and the Green House Project, assisted living settings are not all created alike. They can vary from small, eight-person group homes to large facilities, but most are predicated on a founding philosophy, exemplified by the Eden Alternative, that emphasizes ‘person-centered’ care, choice, companionship, and dignity over the more traditional medical model of long-term care (Thomas, 1996: 9). Unfortunately, in the US not everyone has equal access to these home- and community-based alternatives. Home health services are underutilized, and Medicaid, the government-subsidized long-term care program for the indigent, cannot currently be used to pay for most assisted living, so for the time being it remains an option only for those who can afford to pay.

As people age, they need supports that will help them maintain not only their physical body, but their social being. In a recent study of innovations in long-term care, I found that the ability of persons to participate in reciprocating relationships through the exchange of ‘gifts’ is a critical part of maintaining social bonds and social life in long-term care. Using anthropologist Marcel Mauss' concept of gift exchange (1967), I found that long-term care environments that foster self-determination, social connection, and reciprocity in relationships help residents maintain a good quality of social life and, in turn, prevent premature social death – or death of the social being. Even giving the simplest of gifts can foster social connection – from participating in cutting vegetables for dinner, to helping a fellow resident cross a room, to listening to someone tell a story of their life. It is these types of ‘gifts’ that help people maintain social life and make long-term care settings more like home, particularly near the end of life (Norwood, forthcoming).

For those who cannot (or do not wish to) avoid medicalized ways of dying, choices are also being transformed within medicine. Around the world, health systems are feeling the pressure as more baby boomers age and live longer with chronic conditions. With the increasing cost of end-of-life care – some of the costliest care persons receive (Lynn, 2004: 1–10) – policy makers and other key stakeholders are being forced to make difficult choices in the provision of health and long-term care. In the US, for example, the Patient Protection and Affordable Care Act, or Obamacare, as it is popularly known, brings with it a culture of medical change as insurers, hospitals, clinics, and providers around the country are called on under a number of initiatives to improve health care – improving access, health outcomes, and quality of care while reducing overlap and the overall cost of care. While the full impact of this legislation is not clear, what is clear is that medical practices in the US are changing as a result. At the end of life, this is likely to mean a growing shift toward home- and community-based solutions and away from institutionalized care, such as the traditional nursing home model.

Natural process and cultural intervention pervade end-of-life choices; nature and culture really cannot exist without each other. End-of-life choices exist around the world, but largely within a culturally restricted set of options. This is what causes people like Mr Carter and the many others like him to die deaths that are not what they or their families envisioned and, worse, to die hooked up to machines, in hospitals, undergoing CPR and other aggressive forms of intervention in the final days of what they had hoped might be a more peaceful passing. That death – or at least ordinary, everyday dying – is denied to some extent in many places in the modern world is probably true. It is what left people like Mr Carter without a medically or culturally supported place to contemplate how he ideally wanted to live prior to his death. It left him with few people to talk to, with little time to process, and without a clear understanding of his medically constructed options. The result for many who die similar deaths around the world is a death without real choices.

Sociologist William Bogard may be right – that choice at the end of life is in part an illusion, and that how many people in highly industrialized societies die is largely a function of medical discourses exerting social control to modulate the course of modern life (Bogard, 2008). We die these highly medicalized deaths because that is the current dominant discourse on dying. In many Western industrialized nations our choices are medically mediated choices, offering dying persons and their families mostly institutionalized, aggressive, futile, and painful options for sustaining dying bodies, often without regard to the physical, emotional, spiritual, and social suffering that is caused. Death may be a natural event; the treatment of death and dying, however, and the choices that flow from it, are largely cultural. Still, at the borders of dominant discourses, human agency and competing discourses are forging alternative pathways. While choice may always be somewhat restricted by cultural mediation, new choices are emerging and competing with dominant discourses to alter the practices and meanings of how people around the world die.

URL: https://www.sciencedirect.com/science/article/pii/B9780080970868640941

Medicalization: Cultural Concerns

M. Lock, in International Encyclopedia of the Social & Behavioral Sciences, 2001

3 Medicalized Identities

Social science critiques of medicalization, whether associated more closely with labeling theory and the social control of deviance, or with Foucauldian theory and the relationship of power to knowledge, have documented the way in which identities and subjectivity are shaped through this process. When individuals are publicly labeled as schizophrenic, anorexic, menopausal, a heart transplant, a trauma victim, and so on, transformations in their subjectivity are readily apparent. At times medicalization may function to exculpate individuals from responsibility for being sick and thus unable to function effectively in society. Medicalization is not limited to sickness and ‘deviance,’ however. Wellness—the avoidance of disease and illness, and the ‘improvement’ of health—is today a widespread ‘virtue,’ notably among the middle classes.

As part of modernity, practices designed to support individual health have been actively promoted for over a century, and are now widely followed among the population at large. The sight of jogging wives of fishermen in Newfoundland's most isolated communities is testimony to this. The individual body, separated from mind and society, is ‘managed’ according to criteria elaborated in the biomedical sciences, and this activity becomes one form of self-expression. Body aesthetics are clearly the prime goal of some individuals, but a worry about the ‘risk’ of becoming sick is at work for the majority. By taking personal responsibility for health, individuals display a desire for autonomy, and in so doing they actively cooperate in the creation of ‘normal,’ healthy citizens, thus validating the dominant moral order (Crawford 1984). Health is commoditized.

As evidence is amassed to demonstrate conclusively how social inequity and discrimination of various kinds contribute massively to disease, ranging from infections to cancer, the idea of health as virtue appears increasingly out of place. Due to poverty, large segments of the population in most countries of the world have shorter life expectancies and a greater burden of ill-health than do their compatriots. The pervasive value of individual responsibility for health enables governments to narrow their interests to economic development, while ignoring redistribution of wealth and the associated cost for those individuals who, no matter how virtuous they may be, are unable to thrive.

URL: https://www.sciencedirect.com/science/article/pii/B0080430767046192

Medicalization or Medicine as Culture? The Case of Attention Deficit Hyperactivity Disorder

Helen Keane, in When Culture Impacts Health, 2013

Medicine as Culture

In contrast to understandings of medicalization, which see medicine as encroaching into the realm of everyday life, sociologist Nikolas Rose argued that since the end of the eighteenth century medicine has played a formative role in producing our everyday experiences of life. Under the “regime of the self” produced by modern medicine, we understand ourselves in terms of health or normality, a form of medical thinking that exists independent of any specific episode of illness or diagnosis of disease (Rose, 2006, 2007a). By insisting that medical expertise and knowledge were among the constitutive elements that formed the modern idea of “the social,” Rose (1994) undermines claims about the expansion of medical discourse into non-medical problems.

In his recent work on biopolitics in the twenty-first century, Rose (2007b) provides an updated account of the relationship between medicine and the social, highlighting the shift to a molecular style of thought and a new engagement with technologies of optimization. Rose argued that as a result of advances in fields such as genetics, genomics, neuroscience, and biotechnology, novel forms of citizenship have emerged in which individuals understand their selfhood, their aspirations, and their rights and responsibilities in biological terms. As Rose (2007b, p.25) stated “this is an ethic in which the maximization of lifestyle, potential, health and quality of life become almost obligatory, and where negative judgments are directed toward those who will not, for whatever reason, adopt an active, informed, positive, and prudent relationship to the future.” At the same time, a new global bio-economy has developed in which health, disease, and other vital processes become sites of entrepreneurship, capital investment, and sources of wealth. In the resulting medical marketplace, pharma and biotech companies offer new technologies of selfhood that promise to help individuals enhance their capacities and achieve their aspirations as well as treat their illnesses (Rose, 2007b).

The discourses and practices that constitute ADHD are a vivid example of the landscape Rose outlined, a landscape in which individuals’ hopes, anxieties, and discontents become expressed in medical and psychiatric terms. In the dominant medical theory, ADHD is the result of brain dysfunction, specifically an impairment in executive function (Barkley, 1997). The ascription of neurological difference as the core of the condition shapes the responses of doctors, educators, parents, and children. Psychopharmaceutical treatments are more readily accepted if they are seen as remedying specific deficits in brain functioning rather than simply modifying difficult behavior, but their wide adoption is not solely attributable to the authority of the disease model of ADHD. Stimulant medications, like many other pharmaceutical products, have become enrolled in the culturally normative project of active and responsible self-invention. For parents, this project includes taking charge of their child’s future as well as their present, by realizing his or her potential and optimizing his or her chances of success. Psychopharmaceutical drugs, including stimulant medications, have become meaningful and valuable because of their compatibility with these hopes for the future (Rose, 2006). As Miller and Leger (2003, p. 29) stated, Ritalin can be seen as “one more cosmopolitan investment in human capital, in a risk society that wagers its future on the very people about whom it most panics.”

Similarly, the constitution of ADHD as a brain-based learning disability has multiple meanings and effects. In the school setting, the diagnosis can allow teaching and assessment techniques to be modified according to the perceived needs of the individual child (Mayes et al., 2009). Thus while the medical model places the sufferer in a category that defines his or her identity, it can also have an individualizing effect within the institution of the school. Moreover, while the label of disability is traditionally seen as a stigma, the disability rights movement has produced a powerful counterdiscourse of disability as a positive identity rather than a tragic or shameful defect. In this discourse, the disadvantages faced by the disabled are produced by prejudice and social exclusion rather than inherent incapacity (Mayes et al., 2009). The disability rights discourse cannot be simply defined as medicalized. It provides an interpretation of ADHD that is simultaneously biological, social, and political. It enables those with ADHD to request accommodations in the language of rights, rather than as requests for help.

The neurological model of ADHD can also be deployed to challenge the idea that it is a disease or disorder. The notion that the differences found in the ADHD brain represent a form of neurodiversity rather than pathology has become popular, for instance, in the theory that proposes ADHD represents characteristics that were adaptive in nomadic hunter–gatherer societies (Hartmann, 1997). According to this theory, it is only in evolutionarily novel environments such as the modern school that “the ADHD brain” becomes a liability. Without necessarily adopting the evolutionary explanation, parents do employ the idea of ADHD as part of their child’s authentic self when making decisions about medication. For example, Singh (2005, p. 42) found that some mothers withheld medication on weekends because they wanted their sons to have the chance to be themselves. As one mother she interviewed stated, “why should we drug him on the weekend? That’s who Stuart is. If he wants to be off the walls, why not?”

The notion of medicalization tends to overlook the diversity of these forms of action and explanation because they use a common medical and neurological language. It is not simply that ordinary troubles are transformed into the disease category of ADHD through a process of medical expansion. Rather, the relationships between normality and pathology, and between treatment and optimization, have been altered by twenty-first-century biomedicine (Rose, 2006). Normality and pathology are coexistent and operate on a continuum. Medical understandings and the will to improvement are pervasive in our culture and indeed constitutive of many of our beliefs about how to live a good life and how to be a productive and responsible citizen. ADHD and its treatments are by no means unique in the way they involve medical practice and the use of pharmaceuticals beyond the clear-cut boundaries of disease (Fox and Ward, 2008).

However, a general account of medicine as a constitutive part of modern culture and ideals of citizenship runs the risk of reifying “western medicine” as a coherent and unified entity. Here the work of STS scholars such as Marc Berg and Annemarie Mol (1998) can assist, by focusing attention on the incoherencies and variations contained within medicine. They investigate the way that networks of human and nonhuman actors bring medical entities such as atherosclerosis or asthma into being in laboratories, clinics, hospitals, homes, or any other space where medicine in its broad sense is practiced. Mol (1998, p. 157) calls this “the local performance of objects” to emphasize the specificity, variability, and temporality of the entities that are brought into being by medical practice. As she states in her ethnographic study of atherosclerosis and its treatment (Mol, 1998, p. 144), the activities that fill hospitals do not just point to a disease, say what it is, where it is, or whether it is; they also act on it, transform it, and perform it: in fact, “they do atherosclerosis.”

Applying this perspective to ADHD would reveal the different ADHDs brought into being in different contexts: the suburban home in Australia, the UK inner-city community health center, the U.S. neuroscience lab, the Swedish psychiatrist’s clinic, the drug company office, the high school staff meeting, and the primary school playground. It would also suggest that the spaces and times where ADHD disappears as a meaningful, or at least dominant, category are equally worthy of study. Another set of differences worth exploring is that between the diagnosis of ADHD (based on fixed and generic criteria such as those found in the DSM-IV) and the treated condition (based on what can be done for the particular child’s most pressing problems, as interpreted by a particular physician). Thinking of ADHD as a phenomenon of local networks, produced by linkages between medical and educational practice, could also refigure the continuing debates about stimulant therapy. Rather than trying to determine whether stimulants are overprescribed or underprescribed in a general sense, this approach would promote assessment of the local value of the effects mobilized by ADHD in specific contexts.

This chapter has argued that medicine itself should be included in examinations of culture as a determinant of health and illness. It has highlighted the insights about ADHD that can be gained through different theoretical perspectives on the styles of thought and intervention that are characteristic of medical discourse. While medicalization offers a valuable account of ADHD as a transformation of children’s problem behavior into a treatable disease, it tends to give the entity of ADHD an unwarranted uniformity and stability. Understanding medicine as culture (rather than culture as medicalized) promotes a more heterogeneous vision of biopolitics and biomedicine, in which medical thinking is constitutive of normal life and selfhood rather than an incursion into it.

URL: https://www.sciencedirect.com/science/article/pii/B9780124159211000063

Socio-anthropological Contributions to the Senegalese Harm Reduction Program

Albert Gautier Ndione, in Psychotropic Drugs, Prevention and Harm Reduction, 2017

7.5.2 2012: doctoral research on the medicalization of a social deviance

The theoretical concept of medicalization refers to the redefinition of a problem in medical language [BER 09]. For Peter Conrad [CON 92], medicalization is the process through which non-medical problems become defined and treated as medical problems, generally in terms of illness or disorder. In Africa, where the whole population does not have access to treatment, the sociological approach to medicalization is not applied as critically as it is in the Western world. In an African context marked by a generalized search for treatment, a number of social or environmental problems that are medicalized in Northern countries are not medicalized, owing to a lack of resources. Public health interventions seem instead to aim at extending the medicalization of practices, such as drug use, which have long been reduced to a “social deviance”.

The idea of deviance associated with drug use is linked to the conservative policy of Senegal, which relies on repression and on normative points of view, grounded in religious interpretations, regarding drugs and their users. In 1963 and again in 1972, Senegal adopted repressive laws on cannabis (Act No 63-16 of 5 February 1963) and on narcotic drugs (Act No 72-24 of 19 April 1972). In 1997, Senegal developed a new instrument, the Drug Code (Act No 97-18 of 1 December 1997), which penalizes both simple use and trafficking. Senegal's repressive policy reinforced the negative perception of drugs and users, which had already been established by a religious tradition of Muslim and then Christian origin.

The overall objective of the doctoral research on the medicalization of drug use was to describe and analyze the social effects of the medical service being set up in Senegal. The thesis also discussed the social effects of this medicalization through the following specific questions:

How does medicalization modify the attribution of social deviance to drug users, and with what results, especially for their relationships with carers?

How are users perceived by the general population after the creation of a medical treatment center for drug use?

Does medicalization allow IDUs to limit their social vulnerability by reducing the disregard they suffer, or, on the contrary, does it trap them in a relationship of domination by doctors and lead them to consider themselves “sick” and “disabled”, possibly “dependent” on an unavailable medicine?

URL: https://www.sciencedirect.com/science/article/pii/B978178548272450007X

Death and Dying, Sociology of

Tony Walter, in International Encyclopedia of the Social & Behavioral Sciences (Second Edition), 2015

Abstract

Social processes that influence contemporary dying and grieving include secularization, medicalization, professionalization, and sequestration. Sociological research on awareness and trajectories of dying has influenced health care systems toward more open communication styles at the end of life, intended in part to delay the onset of social death to as near as possible to physical death. The recent shift from funerals displaying social status to displaying the deceased's individuality may be understood in terms of postmaterialism. Sociological studies of bereavement are relatively underdeveloped, though the concept of disenfranchised grief has been influential. Recent research links the politics of memorialization to sociological theorizing of collective memory, and examines how the Internet socially relocates the dying, the grieving, and the dead.

URL: https://www.sciencedirect.com/science/article/pii/B9780080970868320372