This chapter has not attempted to give a full theory of moral psychology, but it has pointed to some of the key factors, such as mirror neurons and emotional consciousness, that are relevant to understanding the nature of ethical judgments in the brain.
The revolution that recognizes minds as brains requires us to abandon familiar and valued concepts such as immortality and free will. But ethical ideas about right, wrong, and moral responsibility can survive in altered forms. We can even maintain the old idea of conscience, as long as it is understood as a brain process rather than as a communication from God to soul.
Judgments about right and wrong are instances of emotional consciousness, produced by interactions among multiple brain areas that combine cognitive appraisal with bodily perception. Such moral intuitions might appear to us as direct perceptions of right and wrong, but they are actually very complex brain processes arising from past experiences, both personal and educational.
Moral intuitions by themselves are not evidence that something is right or wrong; they must be evaluated to determine whether they reflect objective moral concerns or merely biased past experience or coercive, arbitrary inculcation by bogus moral authorities. The idea of sin as a free act against a divine being must be abandoned as based on false assumptions about souls and gods. But social emotions such as guilt and shame, and the consonant idea of moral responsibility, can still be appropriate if they contribute to the vital needs of all those concerned. Consideration of vital psychological needs such as competence, relatedness, and autonomy provides an explanation and justification for the proposition that the meaning of life is love, work, and play.
According to neural naturalism, moral objectivity does not rest on theological prescriptions, a priori truths, moral universal grammar, or reflective equilibrium. The basis for morality is that people have objective vital needs without which they would be harmed in their ability to function as human beings. Actions have consequences that affect the needs of people; an action is right to the extent that it furthers those needs, and wrong to the extent that it damages them. Moral judgments are inherently emotional in that we feel approval toward what we take to be right and disapproval toward what we take to be wrong. Like emotional experience in general, moral judgments have an element of cognitive appraisal that should include assessment of the consequences of an action for the needs of the people involved. The assessment is not just a cold calculation of costs and benefits, but should include an element of caring about those who are affected. Such caring enlists the physiological aspects of emotions and the functioning of mirror neurons.
Neuroscience is just beginning to use brain scans and other technologies to acquire evidence concerning how brains make ethical judgments: the relevant research dates back only to 2000. A fuller account of ethical brains will have to take into account such fascinating findings as these:
1. Patients with damage to the prefrontal cortex can become flagrantly immoral.
2. Brain scans of people solving moral dilemmas reveal different kinds of neural activity, corresponding to different moral intuitions, depending on whether the judgments are personal or impersonal.
3. People can be induced to trust others by nasal sprays of the hormone oxytocin, which acts on the brain to increase feelings of affiliation.