Over the past two decades, advances in brain-imaging technology have allowed neuroscientists to investigate the once inscrutable human brain in greater detail than ever before. Cognitive neuroscientists' growing understanding of the complicated mechanisms of the brain, aided by increasingly sophisticated brain-imaging tools, has called into question the extent to which we control our behavior, and the extent to which it is determined by the physical structure of our brains. While this question promises to inform a large range of disciplines in science and medicine, one area where it has already been brought to bear with profound implications is the emerging field of neurolaw. By introducing new perspectives on how responsible criminals really are for their actions, neurolaw has begun to change the criminal justice system within the United States. Neuroscience also holds the promise of changing the judicial system further through neuropredictive technologies, which could assess the likelihood that a criminal would commit a crime in the future, and fMRI-based lie detection, which could be used to determine whether a defendant or a witness is telling the truth. Finally, recent findings in neuroscience could challenge traditional legal assumptions of free will and the very foundations of the judicial system itself.
On March 30th, 1981, John W. Hinckley, Jr. shot President Ronald Reagan in an attempt to assassinate him. In the ensuing legal case, United States of America v. John W. Hinckley, Jr., the defense argued that Hinckley was not responsible for the attack on the grounds that he was schizophrenic (1). The defense introduced computerized axial tomography (CAT) scans of Hinckley's brain that, they argued, showed evidence of schizophrenia in the form of permanent brain shrinkage. Hinckley was found not guilty by reason of insanity (NGRI). The Hinckley case, which marked the first time brain-imaging technology appeared within a United States court, was the beginning of a new marriage between neuroscience and law.
Since then, both brain-imaging technologies and neuroscientists' knowledge of the brain have improved significantly. CAT scans of the brain have been superseded by more advanced technologies such as functional magnetic resonance imaging (fMRI). Neuroscientists can now link specific regions of the brain to the control of certain behaviors. These findings have radical implications for the criminal justice system in the United States because they raise questions of how responsible criminals really are for their actions.
For example, studies by Joshua Greene and other neuroscientists at Princeton University have shown that regions of the frontal and parietal cortices are active in particular types of moral decision-making (2-3). Suppose that fMRI scans reveal abnormalities in those regions of a murderer's brain. Should the murderer be found NGRI on the grounds that he cannot tell right from wrong? Or, consider a 2007 study suggesting that certain regions of the prefrontal cortex are involved in forming intentions to act (4). Should criminals with defects in this area be committed to mental institutions instead of being incarcerated? It is easy to see how findings such as these could change how the law views criminal responsibility. In fact, this is already beginning to happen. In 2009, rapist and serial killer Brian Dugan faced a death-penalty sentencing hearing after pleading guilty to the 1983 rape and murder of a ten-year-old girl (5). The defense introduced testimony from a University of New Mexico neuroscientist who said that he believed that fMRI scans of Dugan's brain showed that Dugan had suffered from psychopathy from birth and that Dugan's mental illness may have caused him to commit the crime. Dugan's case was the first time that a court had accepted fMRI scans as evidence.
However, some argue that, at present, testimony based on fMRI evidence should not be taken into consideration by courts (6-7). Sinnott-Armstrong et al. claim that fMRI scans showing abnormalities in the brain are not sufficient evidence for reduced sentencing or an NGRI verdict (6). Because individual brains differ so much from one another, they argue, most people's fMRI scans will deviate from the group averages used as baselines in studies. Additionally, Sinnott-Armstrong argues that even highly accurate fMRI tests are prone to large numbers of false positives:
“…consider a population of 10,000 with a 1% base rate of a functional abnormality that leads to murder. […] That means that 100 people in the population have the relevant functional abnormality and 9,900 do not. If an fMRI test for this functional abnormality has 95% specificity, then it will still test positive in 5% of the 9900 who lack that abnormality, which is 495 false alarms – 495 people who test positive but do not really have the relevant functional abnormality.”
Sinnott-Armstrong also remarks that only correlative, not causative, links have been demonstrated between abnormal fMRI scans and behavior. Finally, the paper notes that even if a brain impairment enables a criminal impulse, that does not mean a defendant cannot resist the impulse.
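The base-rate arithmetic in the quoted passage can be made concrete with a short Python sketch. The population size, base rate, and specificity come from the quote; the 95% sensitivity figure is my own illustrative assumption (the quoted passage does not state one). Even under that favorable assumption, only about one in six positive tests would flag a genuine abnormality:

```python
# Base-rate arithmetic from Sinnott-Armstrong et al.'s example.
population = 10_000
base_rate = 0.01     # 1% of the population has the abnormality
specificity = 0.95   # test correctly clears 95% of unaffected people

affected = round(population * base_rate)               # 100 people
unaffected = population - affected                     # 9,900 people
false_positives = round(unaffected * (1 - specificity))  # 495 false alarms

# Hypothetical sensitivity (not given in the quoted passage):
sensitivity = 0.95
true_positives = round(affected * sensitivity)         # 95 people

# Positive predictive value: chance a positive test is a real abnormality.
ppv = true_positives / (true_positives + false_positives)

print(false_positives)   # 495
print(round(ppv, 2))     # 0.16
```

The 495 false alarms match the figure in the quote; the low positive predictive value is the broader point, since false positives swamp true positives whenever the condition being tested for is rare.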
Scientists' knowledge of what causes brain defects and mental states that are linked to criminal activity is also improving. Recent studies have shown a correlation between adverse experiences in childhood and altered brain function as an adult (8-10). The 1995-96 Adverse Childhood Experiences (ACE) Study examined the childhoods of 9,508 adults and assigned them scores based on the extent to which they had undergone "adverse childhood experiences," including being subjected to physical, verbal, or sexual abuse; witnessing domestic violence; having parents who were divorced, alcoholic, addicted to drugs, incarcerated, or suicidal; and living in poverty (11). The ACE Study and subsequent studies found that adults with higher ACE Scores were, among other things, more likely to have attempted suicide, to engage in sexually risky behaviors, to be addicted to tobacco, alcohol, or other substances, and to suffer from depression, schizophrenia, or other mental illnesses (11). In 2006, researchers including Vincent J. Felitti and Robert F. Anda, two of the researchers who conducted the original ACE Study, found that adults with higher ACE Scores overwhelmingly demonstrated "impairment in multiple brain structures and functions" including the amygdala, the hippocampus, and the prefrontal cortex, suggesting that these abnormalities are behind the aberrant behavior of people with high ACE Scores (8). Such research on the causes of brain defects may soon play a role in how judges and juries determine the culpability of criminals. It may also play a part in the actual treatment of criminals, notes leading neuroscientist and co-director of the MacArthur Foundation Law and Neuroscience Project Michael Gazzaniga (12). "The goal," he says, "is to understand abnormal states [of people with brain defects] and attempt to design therapies and other interventions that might lead them to a so-called normal status" (12).
Neuroscience could also inform the criminal justice system by assessing the likelihood that criminals will recidivate by committing violent crimes in the future, or what Nadelhoffer et al. term "neuroprediction" (13). Nadelhoffer et al. argue that increasingly sophisticated methods of data collection, e.g. the "neural intermediate phenotype strategy" used to collect information about how genes affect brain function, and analysis, e.g. the multi-voxel pattern analysis technique used to interpret fMRI scans, combined with researchers' growing understanding of the brain, are making neuroprediction progressively more feasible. Nadelhoffer et al. also raise several interesting questions regarding the legality and morality of neuroprediction. Would nonconsensual neuroprediction violate the Fourth Amendment by constituting an unreasonable and overly intrusive search? Would it violate the Fifth Amendment by compelling a person to be a witness against himself or herself? What if neuroprediction were used not just on criminals but also on civilians? Could we force individuals determined to be at high risk of committing violent crime in the future to undergo therapy, or even institutionalize them before they broke the law? These are the types of questions that will likely continue to arise as neuroscience's influence upon our society grows.
A more direct way in which neuroscience could affect the legal system is through fMRI-based lie detection. In laboratory settings, scientists have been able to use fMRI to detect lies with upwards of ninety percent accuracy, causing many to consider whether fMRI might be employed as a form of lie detection in court (14). To date, no United States court has admitted fMRI scans as lie-detection evidence (15). Most recently, in August 2012, a Maryland judge ruled that fMRI-based lie detection would not be allowed as evidence in the case of Gary Smith, a former Army Ranger who allegedly killed his roommate (15). However, in 2001, an Iowa court allowed the use of electroencephalography (EEG) based lie detection (6). Currently, the general attitude of the neuroscience community toward fMRI-based lie detection seems to be that it should not be admitted in court because it has not proven accurate enough in real-world (as opposed to laboratory) settings (16-19). However, University of Virginia School of Law professor Fred Schauer argues that although the accuracy of fMRI-based lie detection is not sufficient for it to be used as evidence in the conviction of criminals, it is sufficient to raise the possibility that a defendant is innocent by establishing reasonable doubt (19).
Neuroscience’s recent engagement with law over the question of how responsible we are for our actions has spilled into the age-old debate over the existence of free will. Dartmouth College philosopher and neuroscientist Adina Roskies argues that while neuroscience can provide evidence that our brains are deterministic, it cannot provide definitive proof for the absence of free will (20). New findings in neuroscience, Roskies says, are likely to change only how people view the existence of free will. Greene and Cohen disagree and argue that cognitive neuroscience will increasingly show that we are purely nature and nurture (21). Greene and Cohen contend that as the public begins to realize that “the combined effects of genes and environment determine all of our actions,” the criminal justice system will change from a retributive one that seeks to punish lawbreakers to a restorative one that seeks to heal not only the damage caused by criminals, but also the criminals themselves.
Contact Christian Nakazawa at firstname.lastname@example.org or HIS Supervising Teacher, Lisa Veltri, at email@example.com
1. S. Taylor, Jr., "CAT scans said to show shrunken Hinckley Brain," The New York Times, 2 June, A1 (1981).
2. J. D. Greene, R. B. Sommerville, L. E. Nystrom, J. M. Darley, J. D. Cohen, “An fMRI Investigation of Emotional Engagement in Moral Judgment,” Science 293, 2105-2108 (2001).
3. J. D. Greene, L. E. Nystrom, A. D. Engell, J. M. Darley, J. D. Cohen, “The Neural Bases of Cognitive Conflict and Control in Moral Judgment,” Neuron 44, 389-400 (2004).
4. J. Haynes et al., "Reading Hidden Intentions in the Human Brain," Curr. Biol. 17, 323-328 (2007).
5. V. Hughes, “Science in court: Head case,” Nature 464, 340-342 (2010).
6. W. Sinnott-Armstrong, A. Roskies, T. Brown, E. Murphy, “Brain Images as Legal Evidence,” Episteme 5, 359-373 (2008).
7. S. Morse, "Brain Overclaim Syndrome and Criminal Responsibility: A Diagnostic Note," Ohio State Journal of Criminal Law 3, 397-412 (2006).
8. R. F. Anda et al., “The enduring effects of abuse and related adverse experiences in childhood,” Eur. Arch. Psy. Clin. N. 265, 174-186 (2006).
9. M. H. Teicher, C. M. Anderson, A. Polcan, “Childhood maltreatment is associated with reduced volume of the hippocampal subfields CA3, dentate gyrus, and subiculum,” Proc. Natl. Acad. Sci. U.S.A. 109, E563-E572 (2012).
10. M. M. Kishiyama, W. T. Boyce, A. M. Jimenez, L. M. Perry, R. T. Knight, “Socioeconomic Disparities Affect Prefrontal Function in Children,” J. Cogn. Neurosci. 21, 1106-1115 (2009).
11. V. J. Felitti et al., “Relationship of childhood abuse and household dysfunction to many of the leading causes of death in adults: The Adverse Childhood Experiences (ACE) Study,” Am. J. Prev. Med. 14, 245-258 (1998).
12. M. S. Gazzaniga, "The Law and Neuroscience," Neuron 60, 412-415 (2008).
13. T. Nadelhoffer et al., “Neuroprediction, Violence, and the Law: Setting the Stage,” Neuroethics 5, 67-99 (2012).
14. F. A. Kozel et al., “Detecting Deception Using Functional Magnetic Resonance Imaging,” Biol. Psychiat. 58, 605-613 (2005).
15. M. Laris, “Debate on brain scans as lie detectors highlighted in Maryland murder trial,” The Washington Post, 26 August, E2 (2012).
16. P. R. Wolpe, K. R. Foster, D. D. Langleben, "Emerging Neurotechnologies for Lie-Detection: Promises and Perils," Am. J. Bioethics 5, 39-49 (2005).
17. M. C. T. Lee et al., “Lie detection by functional magnetic resonance imaging,” Hum. Brain Mapp. 15, 157-164 (2002).
18. C. Davatzikos, K. Ruparel, Y. Fan, D. Shen, M. Acharyya, “Classifying spatial patterns of brain activity with machine learning methods: application to lie detection,” Neuroimage 28, 663-668 (2005).
19. F. Schauer, "Can Bad Science be Good Evidence: Neuroscience, Lie-Detection, and Beyond," Cornell Law Rev. 95, 1191-1219 (2010).
20. A. Roskies, “Neuroscientific challenges to free will and responsibility,” Trends Cogn. Sci. 10, 419-423 (2006).
21. J. D. Greene, J. D. Cohen, “For the law, neuroscience changes nothing and everything.” Philos. T. R. Soc. Lon. B 359, 1775-1785 (2004).