Being Brains: Making the Cerebral Subject

Overview

This “interesting, informative, and provocative book” explores the pervasive influence of neuroscience and “the view that we are essentially our brains” (History and Philosophy of the Life Sciences).
 
Being Brains offers a critical exploration of neurocentrism, the belief that “we are our brains,” which came to prominence in the 1990s. Encouraged by advances in neuroimaging, the humanities and social sciences have gravitated toward the brain as well, developing neuro-subspecialties in fields such as anthropology, aesthetics, education, history, law, sociology, and theology. Even in the business world, dubious enterprises such as “neuromarketing” and “neurobics” have emerged to take advantage of the heightened sensitivity to all things neuro.
 
While neither hegemonic nor monolithic, the neurocentric view embodies a powerful ideology that is at the heart of some of today’s most important philosophical, ethical, scientific, and political debates. Being Brains examines the internal logic of this new ideology, as well as its genealogy and its main contemporary incarnations.
 
Being Brains was chosen as the 2018 Outstanding Book in the History of the Neurosciences by the International Society for the History of the Neurosciences.

Product Details

ISBN-13: 9780823276080
Publisher: Fordham University Press
Publication date: 08/08/2019
Sold by: Barnes & Noble
Format: eBook
Pages: 304
File size: 2 MB

About the Author

Fernando Vidal is Research Professor of ICREA (Catalan Institution for Research and Advanced Studies) at the Medical Anthropology Research Center, Rovira i Virgili University, Tarragona, Spain.

Francisco Ortega is Professor at the Institute for Social Medicine and Research Coordinator of the Rio Center for Global Health at the State University of Rio de Janeiro, Brazil. He is also Visiting Professor at the Department of Global Health and Social Medicine, King's College London.

Read an Excerpt

CHAPTER 1

Genealogy of the Cerebral Subject

What "Is" the Cerebral Subject?

It may well be that nobody believes they literally are their brain. But when influential people proclaim it, we must take them at their word. Together with the brain in a vat, brain transplantation is one of the favorite thought experiments of philosophers of personal identity (Ferret 1993). It is usual to observe that if the brain of A were transplanted into the body of B, then A would gain a new body, rather than B a new brain. Commenting on that commonplace, Michael Gazzaniga (2005, 31), a leading neuroscientist, serenely asserted: "This simple fact makes it clear that you are your brain." Yet what we have here is neither a fact nor anything simple; it is a profession of faith. The neurophilosopher Paul Churchland "carries in his wallet a colour picture of his wife. Nothing surprising in it," remarks the sociologist Bruno Latour, "except it is the colour scan of his wife's brain! Not only that," he continues, "but Paul insists adamantly that in a few years we will all be recognizing the inner shapes of the brain structure with a more loving gaze than noses, skins and eyes!" (Latour 2004, 224). Gazzaniga, Churchland, and many others who make similar claims express a widespread belief. So widespread indeed, that saying, as the New York Times cultural commentator David Brooks did in June 2013, that "the brain is not the mind" immediately generates a flutter of suspicion about a religious and antiquated — even reactionary — dualistic antineuroscience backlash as well as self-confident reassertions of the assumption that "the mind is what the brain does" (Brooks 2013, Marcus 2013, Waldman 2013). The examples could be multiplied.

What is at stake here? Neither science nor ascertainable facts but an idea of the human being, the anthropological figure of the cerebral subject — an "ideology" in the plain sense of a set of notions, beliefs, values, interests, and ideals. Like any ideology, this one offers varieties and internal debates and inspires practices that are not necessarily compatible. Yet there is unity in diversity, so that the cerebral subject allows for a fairly unequivocal characterization, and even for a sort of formula: "Person P is identical with person P* if and only if P and P* have one and the same functional brain" (Ferret 1993, 79). To have the same brain is to be the same person, and the brain is the only part of the body we need in order to be ourselves. As the philosopher Roland Puccetti (1969, 70) memorably put it: "Where goes a brain, there goes a person." Puccetti was not saying that a person is his or her brain but that insofar as the brain is the physical basis of personhood, one cannot be separated from the other. The brain is the somatic limit of the self, so that, as regards the body they need to be persons, humans are specified by the property of "brainhood" (Vidal 2009a), that is, the property or quality of being, rather than simply having, a brain.

Now we must go beyond definitions and ask, first, if there are any real, concrete cerebral subjects and, second, which magnitude (from hegemonic to inconsequential) the brainhood ideology may actually be said to have. In a first approximation, there is one answer to both questions, and it is: It depends. Yes, real people can see themselves as cerebral subjects and behave accordingly — but not necessarily all the time. The weight of the ideology depends on contexts and criteria.

The reason for thinking in terms of a "subject" is that views about what humans essentially are go hand in hand with concrete decisions about how to study them and how to treat them, and these decisions implicate processes of "subjectivation" (Foucault 1983; "subjectification" is sometimes also used). These are processes involved in the production of ways of being, in forms of reflexivity and "technologies of the self" (Foucault 1988); they make individuals what they are and contribute to shaping their behavior and experience. In our case, then, they are processes whereby people think of themselves and others as primarily determined by their brains — and act, feel, and believe accordingly. Individuation and subjectivation are rooted in sociohistorical contexts and, as we shall see, do not exclude the coexistence of different anthropological figures: cerebral selves, psychological selves, chemical selves, and others.

At the individual level, cerebral subject is not a label that can be permanently affixed to anyone but is rather a way of denoting notions and practices that may be operative in people's lives some of the time. In practice, no one conception of the human is monolithic or hegemonic in a given culture, and persons are not one kind of subject alone. For example, the developmental biologist Scott F. Gilbert (1995) contrasted four biological views of the body/self — the neural, immunological, genetic, and phenotypic — and put them in correspondence with different models of the body politic and different views of science. He thus highlighted how political debates mirror disputes over which body, and consequently which self, are the true body and self. "Immune selfhood" has a very rich history of its own (Tauber 2012), but writing in the mid-1990s, Gilbert noted that the genetic self had been recently winning over the other selves. These may be theoretical constructs, but they have real consequences. Thus, as Gilbert points out, in controversies over abortion, the self may be defined genetically (by the fusion of nuclei at conception), neurally (by the onset of the electroencephalographic pattern or some other neurodevelopmental criterion), or immunologically (by the separation of mother and child at birth). In each case, when affected by concrete medical decisions, individuals accomplish the "self" whose definitional criteria were used to reach the decisions.

Thus, it makes sense to refer to a "genetic self" when people's life and self-concept are largely defined by genetic conditions or by genetic testing, screening, and treatment (e.g., Peters, Djurdjinovic, and Baker 1999). Individuals are unlikely to reduce themselves and others to their genetic makeup. However, scientific authorities may suggest such a reduction in statements epitomizing beliefs that permeate a research field, inspire its quest, legitimize its promises, nourish expectations, and orient policy. This was the case when James D. Watson, the codiscoverer of the structure of DNA, uttered for Time an assertion that has been quoted hundreds of times: "We used to think our fate is in our stars. Today we know, in large measure, our fate is in our genes" (Jaroff 1989). The oracular claim was supposed to be universally valid, independently of particular individuals' sense of self. By the time the Human Genome Project was completed in 2004, the gene had long been a cultural icon; the HGP itself participated in the hype that the sociologists of science Dorothy Nelkin and M. Susan Lindee (1995) called the "DNA Mystique"— one that involved a basic posture of genetic essentialism and offered an overly optimistic picture of the future clinical applications of genetic research (Hubbard and Wald 1993).

In spite of the increasing convergence of neuroscience and genomics, by the late 1990s the brain had largely supplanted the genome as the source of foundational explanations for human features and behaviors as well as the source of scientific hype. Such a shift may appear justified. Since the brain and the nervous system seem more directly relevant than genetics to many of the philosophical and ethical questions raised by the Western philosophical tradition, including issues of personal identity, they are more likely to be felt as constitutive of one's self. Some occasions may prompt or sustain such a special relation. Thus, while people with genetic afflictions have been observed to "hiss and boo at pictures of genes or enzymes that cause these afflictions," sufferers of mental illnesses react to brain images of patients diagnosed with depression or schizophrenia with "care and concern," as if the image represented both the affliction and "the suffering of the afflicted" (Dumit 2003, 44–45).

As we shall see, such differences in attitude, as well as the precedence of brain over genes as far as human individuality is concerned, have deep roots in the history of notions of personal identity. Yet, again, this does not mean that brainhood is hegemonic. For example, on the basis of ethnographic research in a neuro-oncology clinic, the sociologist of science Sky Gross (2011) shows that while most brain tumor patients admit that the brain is the seat of "who they are," they tend to consider it as just another diseased organ. We must insist on this point, to which we return below, because there has been concern about the empirical accuracy and the interpretive traction of "totalising accounts of the neurological as determining subjectivity, as if the brain is the epicentre of personhood" (Pickersgill, Cunningham-Burley, and Martin 2011, 362).

Notions such as cerebral subject, brainhood, or neurochemical self are not meant to suggest that a neurobiological perspective dictates views of subjectivity always and absolutely but that, in some times and contexts, it effectively does, occasionally at a very large scale. The sociologist Nikolas Rose's example for neurochemical selves is the well-documented fact that millions of people around the world have come to think about sadness "as a condition called 'depression' caused by a chemical imbalance in the brain and amenable to treatment by drugs that would 'rebalance' these chemicals" (Rose 2003, 46; see here Chapter 3). However, as with "genetic self," it should be obvious that, in real life, everyday ontologies (in the loose sense of mainly implicit "theories about being") coexist, both inside a society and within a single individual. We shift registers in our ways of acting, experiencing, and interacting as well as thinking and speaking about ourselves and others, and this is why psychotherapies and antidepressants can live happily together, if perhaps not "ever after."

The coexistence of such ontologies and their related practices corresponds to what happens in the diachronic and historical dimension. When a phenomenon or area of knowledge is neurologized, it does not ipso facto cease to be what it previously or otherwise was. For example, in the neurobics industry examined below, "brain jogging" simply translates into training the mind, and the exercises proposed are basically the same as those long peddled to improve mental capacities. Nevertheless, when these exercises are relabeled neurobics, they realize the ideology of the cerebral subject. It may be a superficial instantiation of that ideology, where the neuro is no more than a marketing gimmick. That, however, does not abolish the fact that what is sold and bought belongs to a neuro business based on people believing (or at least being told) that they are essentially their brains.

In a medical context, individuals may share a condition but not its interpretation. For example, in her study of bipolar disorder patients, the anthropologist Emily Martin (2009) describes the clash between a dominant reductionist model and the individuals who challenged the idea that neurobiology sufficed to explain their experience. Grassroots diversity thus coexists with a more homogeneous official discourse. As is well known, much of psychiatry, including scientists at the head of major national mental health agencies, asserts that there are no mental diseases, only brain diseases. Different consequences could follow — one being an emphasis on pharmacological medication and a restriction of access to psychotherapies, with a huge impact on people's lives. A development such as the neurodiversity movement (Chapter 3 here) can only happen in a world where "mental disorders" have been redefined as "brain disorders that primarily affect emotion, higher cognition and executive function" (Hyman 2007, 725). In such a context, psychiatric patients are approached mainly as cerebral subjects, and this may contribute to modulating their self-understanding and how they live their lives.

However, the neuroscientific consensus does not automatically translate into public consent, and research confirms commonsense intuitions about the variety and coexistence of views and practices of the self. Emily Martin (2010, 367) noted that the uptake of brain-based explanations outside the neurosciences and in the wider public is "uneven" and that there is no full takeover by "a newly dominant paradigm." Such heterogeneity exists side by side with the development of brain-centered interventions in medicine, in the workplace, and in schools — interventions that may take place independently of how particular individuals understand themselves.

The sociologist Martin Pickersgill and his colleagues (2011) investigated how people draw on neuroscience and neuro ideas to articulate self-understanding. Working with patients suffering from epilepsy, head injury, and dementia as well as with neuroscientists and other professional groups (teachers, counselors, clergy, and foster care workers), they showed that individuals turn their attention to (popular) neuroscience mainly after some kind of neurological event, for example, a brain hemorrhage. This contingent interest, however, does not imply attributing to neuroscience an absolute capacity to define or explain subjectivity. Overall, attitudes are governed by pragmatism and personal relevance; rather than altering notions and practices of the self, neuroscientific concepts "seemed to simply substantiate ideas already held by individuals." The brain thus emerges "as an object of mundane significance," which sometimes helps one understand oneself but is "often far from salient to subjective experience" (Pickersgill, Cunningham-Burley, and Martin 2011, 358, 361–362). Using online questionnaires with Dutch adults diagnosed with ADHD, the sociologists Christian Broer and Marjolijn Heerings (2013) also noticed that although those individuals were interested in neurobiological explanations, they did not reduce their condition to a brain phenomenon. In the framework of the Dutch tradition of public debate and dissent over mental health issues, neurobiology did not colonize subjectivity and was invoked in different ways: as explanation or excuse but also as opening the possibility of governing the self "in the name of the brain" (Rose and Abi-Rached 2013, 8). The same study documented parallel discourses of self-regulation that did not rely on "brain talk" (Broer and Heerings 2013, 61). In Canada, adults diagnosed with major depression or bipolar disorder were asked about their ideas on the potential role of neuroimages in stigma mitigation, moral explanations of mental illnesses, and the legitimation of psychiatric symptoms. The resulting interviews show the complex and ambivalent ways in which individuals integrate brain-based notions of mental disorders into their self-understanding; some assumed neurobiological explanations of their disorder yet struggled against pharmaceutical treatments (Buchman et al. 2013).

Studies with other populations produce similar results. Adolescents' explanations of their own behaviors and mental health issues emphasize personal, familial, and social contexts, rarely incorporating the brain or biology (Choudhury, McKinney, and Merten 2012). This may be partly attributable to a lack of information. When informed, however, teens do not refuse to include biological factors in their understanding of adolescent behavior. Rather, confronted with an overwhelmingly negative view of the "teenage brain" as defined by the incapacity to exert control over high-risk pleasure-seeking behaviors or by a deficit in the synchronization of cognition and affect (e.g., Steinberg 2008), they call for neuroscience to contribute to a positive view of their stage of life and, in any case, do not generally see behavior in purely biological terms. In turn, on the basis of conversations with three groups (undiagnosed, diagnosed with ADHD and medicated, or diagnosed but not medicated), Ilina Singh (2013) described how children, including those of the two latter groups, did not subordinate their I to brain-based explanations but tended to depict the role of the brain in their lives in ways that emphasized personal agency. She thus confirmed that encounters with neuroscientific discourses or technologies do not necessarily cerebralize subjectivity. Similarly, fieldwork in a laboratory conducting fMRI research with children diagnosed with ADHD, learning disabilities, autism, and Tourette syndrome documented how subjects "appropriate lab-based descriptions of neurological difference to their own purposes, claiming a positive identity for themselves," and how "the effects of laboratory research and the metaphors used to describe them may serve expansive purposes in the practices of those who see their subjectivity embedded in research findings" (Rapp 2011, 3, 22).

(Continues…)


Excerpted from "Being Brains: Making the Cerebral Subject"
by Fernando Vidal and Francisco Ortega.
Copyright © 2017 Fordham University Press.
Excerpted by permission of Fordham University Press.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

To Begin With

1. Genealogy of the Cerebral Subject

2. Disciplines of the Neuro

3. Cerebralizing Distress

4. Brains on Screen and Paper

"Up for Grabs"

Acknowledgments

Notes

Bibliography

Index