What role does science play in what we believe and in how we understand belief? I teach a college course called “Science & the Nature of Evidence,” in which that question is considered at length. Few of my students (few of any of us, perhaps) have thought about it, and they love the chance to take it on.
It matters what we believe and why we believe it. Not just in terms of religious identification, not just for deciding what and how we do things in our day-to-day lives, but also in relation to politics. Take such monumental policy decisions as those regarding health care, or the military. Do we go with what we feel is right and wrong, or—as is the more commonly understood basis for such policy decisions—do we do what will be best for the most people? How about suicide bombers? Why are they willing to lose their lives for their beliefs? Now, we can begin to better understand the mechanisms of such decisions by combining expertise in religion, philosophy, neuroscience, psychology, economics, and even genetics.
A Brave New Interdisciplinary World
When I see a title sitting directly at the nexus of all these disciplines, a title like “The Price of Your Soul: Neural Evidence for the Non-Utilitarian Representation of Sacred Values,” I stop what I’m doing and read it. Such studies are what make the 21st century so exciting, and so scary. The old boundaries between disciplines are blurring and breaking down.
This particular paper was recently published by my Emory colleague Greg Berns, a neuroeconomist, together with scholars in business, psychology, and anthropology. They wanted to know how people think about their ‘sacred values’ and where in their brains this thinking happens.
Neuroscientists already know which parts of the brain are active when we are considering right-wrong decisions, and that these are different regions from the ones active when we are considering cost-benefit, utilitarian decisions. By definition, sacred values are ones we don’t want to give up or trade away (another intriguing and related question, not addressed in this study, is how a value becomes sacred to an individual). The problem for the researchers, then, was how to create a situation in which study participants would have their sacred values challenged, in as realistic a way as possible.
Berns et al., after identifying participants’ sacred values through more traditional psychological survey methods, offered participants increasing amounts of real money to go against those values, or rather to sign a document saying they would go against them. The scientists couldn’t ethically challenge the actual sacred value, but they could challenge the participants’ integrity in relation to that value. Then Berns used functional magnetic resonance imaging (fMRI) to see what parts of the brain were active during these transactions.
The Sacred Is Not About Utility
For the research subjects’ sacred values, the ones they wouldn’t give up for any amount of money (they could ‘auction off’ a value for up to $100), what lit up in the brain were areas known to be involved in right-wrong decisions, not the cost-benefit, utilitarian regions. That is, we naturally go to right-wrong thinking when making sacred-value decisions.
Berns et al. found this held true for sacred values dictated by law as well as those that are not, like belief in God. In addition to right-wrong thinking, these same parts of the brain have been implicated in the retrieval of language rules and other rules. The researchers also found that sacred-values thinking activated the amygdala, the emotional center of the brain. Perhaps no great surprise there.
Why does any of this matter? It goes back to why we believe what we do: think of the military strategists, the politicians. People make decisions, and act, based on their beliefs. The more we understand about the mechanisms of why people believe what they do and how they act on it, the more we understand about people, period. In fact, Berns et al. showed that people with stronger activation of the right-wrong parts of the brain during a sacred-value challenge are significantly more likely to belong to organizations: religious, humanitarian, political, or charitable. Perhaps such people are more likely to act on their beliefs.
There are also implications here, I would argue, for education, and more work should be done explicitly in this area of neuroeducation. The findings in Berns’ paper are consistent with a central principle I’ve learned in teaching the Science & the Nature of Evidence class, in which we discuss students’ beliefs intensively, and more generally over two decades of teaching: teachers don’t change students’ beliefs.
What good teaching does is effectively integrate what’s being taught into students’ preexisting belief structures. Good teaching both appeals to and challenges students’ knowledge base and values—sacred or not.