Why be moral? This question is fundamental for ethics, because even if people can figure out what
the right things to do are, we can still ask why they would in fact do them. The problem of
moral motivation—what makes people do what is right—has two classes of answers, rationalist and
sentimentalist. The traditional philosophical responses to the problem have been rationalist: we
should be moral because it would be irrational to do otherwise. The rationality of morality might
derive from a priori truths about what is right, or from arguments that it is rational for people to agree
with others to be moral. The philosopher Sean Nichols argues that a major problem for rationalism is
that psychopaths, with no impediments in abstract reasoning, nevertheless see nothing wrong in
harming other people.
Nichols argues convincingly that what is wrong with psychopaths is not their reasoning but their
emotions. I mentioned earlier in this chapter the theory that psychopathy, whose symptoms include
antisocial behavior, lack of guilt, and poverty of emotions, is the result of impairments to emotional
learning that derive from disrupted functioning of the amygdala.
According to Nichols, an adequate account of ethical thinking must explain how emotion plays a
role in linking moral judgment to motivation, while also allowing a place for reason in moral
judgment. His explanation of ethical norms is cultural and historical: “Norms are more likely to be
preserved in the culture if the norms resonate with our affective systems by prohibiting actions that
are likely to elicit negative affect.” Norms that prohibit harm to others are virtually ubiquitous across
cultures because of this “affective resonance.” The adoption of norms enables us to reason about what
is right and wrong, but these norms have an emotional underpinning that intrinsically provides a
connection between morality and action: people are moral because of their emotional commitment to
normative rules.
What is missing from Nichols's otherwise plausible account is an explanation of why people have
such a basic emotional reaction to harm to others. There is no mystery concerning why you do not
want harm to yourself, because experiences such as pain and fear are intrinsically negative.
Appreciating harm to others might be achieved by abstract analogical reasoning, but there is no
guarantee that such reasoning will be motivating: I may understand that you experience pain and fear,
but why should I care? What makes emotional moral learning work?
As my discussion of empathy indicated, mirror neurons provide the plausible missing link between
personal experience and the experience of others. People not only observe the pain and disgust of
others; they experience their own versions of that pain and disgust, as shown by the mirroring activity
in cortical regions such as the insula and the anterior cingulate. Normal children do not need to be
taught moral rules as abstract theological principles (“Thou shalt not kill!”) or rational ones (“Act
only in ways that could become universal”). Normal children do not need to reason about why harm
is bad for other people; they can actually feel that harm is bad. Thus mirror neurons provide
motivation not to harm others by virtue of direct understanding of what it is for another to be harmed.
It would be elegant if there were evidence that psychopaths have deficiencies in the functioning of
their mirror neurons, but the relevant experiments have not yet been done. It is possible that the
deficits in emotional learning that psychopaths show, which involve disrupted functioning of the amygdala, are
partly due to mirror neuron malfunctioning. Children who are incapable, for genetic or environmental
reasons, of feeling the pain of others will not be able to become motivated to follow rules that direct
them not to harm other people. Blair and his colleagues discuss moral socialization in terms of
aversive conditioning, as when caregivers punish children for their wrongdoings. They claim that the
sadness, fearfulness, and distress of a victim act as a stimulus to instrumental learning not to produce
harm. The involvement of mirror neurons shows why instrumental learning can be especially effective
when people can fully appreciate what is negative about their behavior.
I have argued that mirror neural mechanisms contribute to the solution of the philosophical problem of
moral motivation by showing how biologically normal people naturally have at least some
understanding of and concern about harm to other people. Feeling the pain of others is not the whole
story of moral motivation, for there are many cognitive and social additions in the form of rules and
expectations that can be built on top of neural mirroring. The motivating reason to be moral is not just
that morality is rational, but rather that feeling the pain of others is biologically part of being human.
For ethics, the capacity to care about others is at least as important as the ability to reason about them.
Caring has a neural basis, in that mirror neurons enable brains to get a kind of direct
comprehension of the pain and emotions of others. Mirror neurons are neither necessary nor sufficient
for ethical evaluations, but they help enormously to enable children and even adults to appreciate the
experiences of others. Hence they provide a causal basis for empathy and moral motivation,
encouraging us to feel and care about the pain of others and to act so as to alleviate it. The capacity
for such caring is built into our neural circuitry, but needs to be fostered by moral education that can
lead us to care more about people beyond our immediate circles of acquaintance. Mirror neurons and
emotional contagion get us started on moral appreciation of the interests of others, but much
socialization is required to improve it. We need moral education to reinforce resistance to the
psychopathic suggestion that self-interest is the highest good.
Hence empathy enhanced by mirror neurons is an important part of moral thinking, but far from the
whole story. When you try to judge whether torturing terrorists can ever be ethical, you can be
influenced by the empathy that you may feel for the victims of both torture and terror, but you need
much more guidance to resolve moral dilemmas involving the pain and suffering of more than one
person. I don't think that evidence about the brain is by itself sufficient to direct us to any one ethical
theory that we ought to adopt, but I will try to show that such evidence puts some constraints on the
evaluation of ethical theories.