. . . [F]or all our ignorance, the physical basis of moral judgment is no longer a complete mystery. We’ve not only identified brain regions that are “involved” in moral judgment, but have begun to carve the moral brain at its functional joints.
- Joshua Greene1
For centuries, the exploration of morality consisted of philosophical inquiry, punctuated by occasional and largely unintegrated clinical observations such as John Harlow’s account of Phineas Gage or the ignominious outcomes of the Freeman lobotomies of the 1940s. From cases such as these, a sense could be garnered that certain injuries might affect moral and social behavior, but an understanding that integrated the philosophical questions with functional neuroanatomy was almost unimaginable.
Neuroscientific advances of the past half century have changed the lexicon entirely. The study of moral cognition now carries these age-old questions and observations into the era of neuroimaging and cognitive neuroscience, shifting toward the capacity to ask, “What are the structures necessary and sufficient for moral reasoning?” While these findings may not immediately change day-to-day clinical practice, they already inform the conceptualization of numerous neuropsychiatric and behavioral conditions and promise further evolution of concepts and practices in the future.
Perhaps more important, these questions and their investigation get at fundamental aspects of humanness and the understanding of ourselves and our place in the world.
Focus on a theory: dual processing
Philosophers and neuroscientists have struggled over the years to create overarching theories to explain the available evidence on moral intuitions. This is more of a challenge than it may seem because much of the available data present what appear to be contradictory findings about our moral preferences. For example, one might be willing to commit a certain crime under one set of circumstances but unwilling under another (think Jean Valjean).
A theory developed over the past decade by philosopher-neuroscientist Joshua Greene identifies 2 sets of functional brain regions that might explain these apparent contradictions. Known as dual process theory, it describes our moral machinery as two systems: a knee-jerk, emotionally driven, intuitive one and a rational, deliberative, step-back-and-consider one. According to the theory, the first is the default setting, and the second comes into play to modulate it as needed.
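To make this architecture concrete, the short Python sketch below renders the control flow the theory describes: a fast emotional default that deliberation can override. The function, scores, and threshold here are illustrative assumptions for exposition, not elements of Greene’s formal model.

```python
# A minimal sketch of the dual process control flow (all names and
# numbers are hypothetical, chosen only to illustrate the idea).

def moral_judgment(emotional_aversion: float, deliberation: float) -> str:
    """Toy judgment about a proposed harmful action.

    emotional_aversion: 0-1, strength of the intuitive "don't do it" signal.
    deliberation: 0-1, how strongly controlled reasoning is engaged.
    """
    # Default setting: the fast, emotionally driven response.
    default = "impermissible" if emotional_aversion > 0.5 else "permissible"
    # Modulation: sufficient deliberative engagement can override the
    # default with a cost-benefit (utilitarian) judgment.
    if deliberation > emotional_aversion:
        return "permissible (utilitarian override)"
    return default

print(moral_judgment(emotional_aversion=0.9, deliberation=0.3))  # intuition wins
print(moral_judgment(emotional_aversion=0.4, deliberation=0.8))  # override engages
```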
One of the substantial problems in creating a moral theory is explaining why a person will intuitively act very differently in two moral scenarios in which the outcome would be the same (eg, killing 1 person to save 5). Dual process theory initially came about to explain this tension between emotional and cognitive responses to moral problems, as exemplified by people’s responses to trolley problems: a set of thought experiments created by Philippa Foot3 and Judith Thomson4 and studied by philosophers since the 1970s.2
Variation 1. You are near a train track, and a train is about to hit 5 people. You can pull a lever to switch the train to a different line, but by doing so you would send the train into a lone person walking along that line. Would you do it?
Variation 2. You are on a bridge above a train track, and the train is about to hit 5 people. You are standing next to a large man with a heavy backpack. Would you push the man off the bridge onto the track, thus stopping the train and saving the 5 people?
The majority of people are willing to undertake the first killing but not the second. The net result is the same, so why the intuitive discrepancy? What Greene and subsequent investigators have worked out is that there are essentially 3 components of these problems that are important in determining our response: action versus omission, means versus side effect, and personal force versus no personal force. The balance of these qualities in a moral conundrum determines much of our sense of the permissibility of the action.
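Read this way, each dilemma can be scored on the three components. The toy model below assigns hypothetical weights (chosen purely for illustration, not empirically fitted) to each factor and reproduces the typical pattern: most people act in the switch case but refuse in the footbridge case.

```python
# Toy feature model of the two variations (weights are hypothetical).
# Factors that engage the emotional system add to an "aversion" score;
# a high score predicts the typical refusal to act.

WEIGHTS = {
    "active_harm": 0.2,     # action rather than omission
    "harm_as_means": 0.4,   # the victim's harm is the means, not a side effect
    "personal_force": 0.4,  # harm is applied with direct personal force
}

def aversion(features: dict) -> float:
    return sum(WEIGHTS[f] for f, present in features.items() if present)

switch = {"active_harm": True, "harm_as_means": False, "personal_force": False}
footbridge = {"active_harm": True, "harm_as_means": True, "personal_force": True}

for name, case in [("switch", switch), ("footbridge", footbridge)]:
    score = aversion(case)
    print(f"{name}: aversion={score:.1f} -> "
          f"{'most refuse' if score > 0.5 else 'most act'}")
```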
When our emotional heartstrings are pulled by things such as physical touch, viscerality, emotional closeness, or other sentimental components, we respond with a knee-jerk moral preference. In this instance, decisions are made with an emotional or rule-based justification, as in, “Well, you just shouldn’t steal,” or, “That’s horrible. It’s wrong to murder. Period.” Pushing the man off the bridge is visceral; you can picture the proximity, the fall, and the outcome, and it is chilling. With greater distance, both physical and emotional, it is easier to appreciate the sense of the greater good. The other crucial aspects, namely means versus side effect and action versus omission, have been teased out through a range of other trolley problems and thought experiments that have pinned down the particular conditions that predispose one to be more willing to kill one individual to save many.2
Whether harm is purposeful or an unintended consequence also matters to us. This begins to be parsed in the two trolley problem variations. In the first, the train is diverted to a second track, where there happens to be an innocent passerby. Juxtapose this with the second case, in which the large man is pushed and becomes the means of stopping the train. Often, our intuitions permit the former but not the latter. This contrast plays out in numerous clinical situations as well. For example, a physician who titrates morphine to euthanize a patient is often considered to act immorally, whereas “allowing the patient to pass” as a side effect of escalating the dosage for pain management is thought to be morally permissible. The interaction among these factors shapes our moral intuition and determines why one action is permissible and the other is not. When we are less viscerally bothered by the circumstances, we are far more able to weigh benefits against detriments because we are not as instinctually driven by our emotional moral default.
The prefrontal cortex and its many roles
Neuroscientific investigations suggest that the ventromedial prefrontal cortex (PFC) is at the core of this default emotional baseline, while the dorsolateral PFC forms the center of the cognitive system that allows us to step back and weigh the benefits and risks of an action in its wider context. Although it may seem strange that the emotional stance is the baseline while the rational is the afterthought, gut reactions are evolutionarily necessary: they allow us to respond to danger quickly and in the best interest of those near to us.2 The slower, cognitive oversight allows us to modify behavior on the basis of evidence.
Evidence for the primacy of these regions within the dual structure has been growing. The ventromedial PFC has been shown to engage during the solving of moral problems, particularly when responding to emotionally trying moral scenarios.5,6 Ventromedial PFC lesions are correlated with more utilitarian responses to moral problems, while diseases such as frontotemporal dementia result in less importance being placed on emotional factors.
In contrast, the dorsolateral PFC is involved in cognitive control, lying, and problem solving. One could think of it, in this context, as cognitively tamping down our innate, reactionary moral system. Utilitarian responses to moral problems activate the dorsolateral PFC, and global deficits on intelligence measures have been seen when it is damaged.7,8 Increasing cognitive burden, which taxes the dorsolateral PFC, interferes with utilitarian responses; occupying visual processing, thought to decrease the emotional viscerality of experience by interfering with visual imagination, yields more utilitarian responses.9,10
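These load findings fit the picture of utilitarian override as drawing on a limited deliberative resource. Extending the earlier sketch with a hypothetical cognitive_load parameter shows, under the same illustrative assumptions, how taxing that resource lets the emotional default stand.

```python
# Hypothetical extension of the earlier sketch: concurrent cognitive
# load consumes the deliberative capacity that would otherwise fuel
# the utilitarian override (all values illustrative).

def judgment_under_load(emotional_aversion: float,
                        deliberation: float,
                        cognitive_load: float) -> str:
    effective = max(0.0, deliberation - cognitive_load)
    if effective > emotional_aversion:
        return "permissible (utilitarian override)"
    return "impermissible" if emotional_aversion > 0.5 else "permissible"

# The same dilemma, with and without a concurrent memory task:
print(judgment_under_load(0.6, 0.8, cognitive_load=0.0))  # override succeeds
print(judgment_under_load(0.6, 0.8, cognitive_load=0.4))  # load blocks the override
```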
Both the ventromedial and dorsolateral PFC have important connections to other brain regions. Mediation between these two structures appears to occur via the anterior cingulate cortex, lesions of which can cause prefrontal syndrome–like disturbance.5
The ventromedial PFC is interconnected with limbic structures, including the amygdala, the hippocampus, and the cingulate cortex, which are associated with emotion and learning. The amygdala acts as an emotional warning signal that is subsequently interpreted and integrated by the ventromedial PFC; it is engaged during normal emotional and moral emotional tasks. Abnormalities of the hippocampus appear to be involved in psychopathy and violence.11,12
Other structures have been consistently implicated in moral cognition, exemplifying the complexity of moral reasoning and its integration of myriad domains (eg, cognition, learning and memory, social awareness, fear/reward). Structures such as the parietal and temporal cortices, basal ganglia, thalamus, and insula also serve many nonmoral roles, underscoring the integrated nature of moral cognitive function.
The temporal lobe is important for specific aspects of moral cognition. Findings indicate that psychopaths, violent criminals, individuals with aggression related to temporal lobe epilepsy, and patients with antisocial personality or conduct disorders have functional or anatomical temporal lobe abnormalities.13-15
We are hard-pressed to find a specific locus of moral reasoning. Rather, many regions are necessary but not sufficient in and of themselves for functional moral cognition. Some structures, it seems, have unique properties such that damage to them impacts moral thinking specifically, with minimal or no effect on other aspects of cognition or emotion.
Dual process theory beautifully explains particular aspects of our moral software, and the research delving further into the complexities of morality has identified loci of the requisite hardware. From this quickly expanding literature, we begin to understand why we may be inclined to make certain sacrifices under one set of ethical conditions but not another, a key question in understanding our moral workings. Far from being an isolated philosophical thought experiment, following these lines of inquiry down the rabbit hole has opened up new directions of exploration by bridging philosophy and neuroscience toward the goal of further whittling away at the functional joints of the human mind and brain.
Dr Hauptman is a Psychiatry Resident at UT Southwestern Medical Center at the Seton Family of Hospitals in Austin, Tex. He reports no conflicts of interest concerning the subject matter of this article.
References
1. Greene JD. The cognitive neuroscience of moral judgment. http://www.wjh.harvard.edu/~jgreene/GreeneWJH/Greene-CogNeuroIV-09.pdf. Accessed March 20, 2014.
2. Greene J. Moral Tribes: Emotion, Reason, and the Gap Between Us and Them. New York: Penguin Press; 2013.
3. Foot P. The problem of abortion and the doctrine of double effect. In: Virtues and Vices. Oxford, UK: Blackwell; 1978.
4. Thomson JJ. The trolley problem. Yale Law J. 1985;94:1395-1415.
5. Greene JD, Sommerville RB, Nystrom LE, et al. An fMRI investigation of emotional engagement in moral judgment. Science. 2001;293:2105-2108.
6. Young L, Saxe R. An FMRI investigation of spontaneous mental state inference for moral judgment. J Cogn Neurosci. 2009;21:1396-1405.
7. Young L, Koenigs M. Investigating emotion in moral cognition: a review of evidence from functional neuroimaging and neuropsychology. Br Med Bull. 2007;84:69-79.
8. Barbey AK, Colom R, Grafman J. Dorsolateral prefrontal contributions to human intelligence. Neuropsychologia. 2013;51:1361-1369.
9. Greene JD. Why are VMPFC patients more utilitarian? A dual-process theory of moral judgment explains. Trends Cogn Sci. 2007;11:322-323.
10. Amit E, Greene JD. You see, the ends don’t justify the means: visual imagery and moral judgment. Psychol Sci. 2012;23:861-868.
11. Laakso MP, Vaurio O, Koivisto E, et al. Psychopathy and the posterior hippocampus. Behav Brain Res. 2001;118:187-193.
12. Boccardi M, Ganzola R, Rossi R, et al. Abnormal hippocampal shape in offenders with psychopathy. Hum Brain Mapp. 2010;31:438-447.
13. Kruesi MJ, Casanova MF, Mannheim G, Johnson-Bilder A. Reduced temporal lobe volume in early onset conduct disorder. Psychiatry Res. 2004;132:1-11.
14. Raine A, Park S, Lencz T, et al. Reduced right hemisphere activation in severely abused violent offenders during a working memory task: an fMRI study. Aggress Behav. 2001;27:111-129.
15. David AS, Fleminger S, Kopelman MD, et al. Lishman’s Organic Psychiatry: A Textbook of Neuropsychiatry. 4th ed. Oxford, UK: Wiley-Blackwell; 2009.