By analyzing brain activity, they have uncovered the process the brain uses to distinguish between moral choices.
By: Jeff Milgram
Imagine this dilemma:
A train is hurtling down the track and will kill five people unless you switch it onto a spur where it will kill only one person. What do you do?
Here’s another dilemma: A very heavy man is standing near the track. If you push him onto the track, his body will stop the train. That one man will die, but the five people will be saved.
Most people make a distinction between the two scenarios and believe it’s morally appropriate to flip a switch to save lives, but not to push someone in front of a moving train to save lives.
In a groundbreaking study, Princeton researchers have begun to explore the nexus between philosophy and neuroscience.
Using a functional magnetic resonance imaging (fMRI) machine, they have been able to analyze brain activity in people asked to make uncomfortable moral choices.
In doing so, they have begun to uncover how the brain distinguishes between different kinds of moral choices.
The study, published in the journal Science in September, was the subject of a program, "Moral Psychology and Bioethics," held Wednesday at the Woodrow Wilson School of Public and International Affairs. The program was moderated by Princeton bioethicist Peter Singer, who said, "Essentially new ground is being broken at Princeton."
The study showed how neuroscience can reveal the underpinnings of basic human behavior, said Joshua Greene, a philosophy department graduate student who conducted the study with colleagues from the university’s Center for the Study of Brain, Mind and Behavior.
What the Princeton scientists found was that emotional responses shape people’s moral judgments when they face certain kinds of moral dilemmas.
"The role of emotion can be surprising," said Mr. Greene.
And, he said, people may take sides on bioethical controversies such as abortion, stem-cell research and end-of-life issues, not because of reasoned moral principles, but simply because they’re "yucky."
"Yuck plays a big role in bioethics," Mr. Greene said.
Jonathan Cohen, director of the Center for the Study of Brain, Mind and Behavior, said the data from the study could be used to create a "national wellness index" and explain why different cultures have different senses of morality.
"If we can quantify emotional response … that would be of great value," Dr. Cohen said.
The research team asked two groups of nine people a battery of 60 questions. The questions were posed while the subjects were undergoing fMRI scanning.
Questions were divided into personal and nonpersonal categories. Personal moral dilemmas included pushing the man in front of the train; nonpersonal moral questions included pulling the switch on the rail tracks and keeping money found in a lost wallet. The team also asked questions with no moral component at all.
The scans showed greater activity in brain areas associated with emotion when subjects considered personal moral dilemmas, and less activity in areas associated with working memory.
Response times were also longer for questions involving personal moral dilemmas.