Moral selectivity means ill-gotten gains are worth less in the brain
by Thom Bush
on 21 June 2017
Dishonestly, honestly

When it comes to developing good moral values, if you've ever been cheated, maligned or abused, you will no doubt value honesty and integrity a tad more than if you hadn't.

But there is a better way to acquire such values than through bitter experience. Moral fortitude is at the heart of therapeutic practice. Aligning the values within us with those we live by outwardly, and vice versa, becomes the essence of life. It creates mental congruency, as opposed to the cognitive dissonance that arises from living a falsehood. It seems that the good that resides within us all needs the parallel of an aligned life. That said, circumstances sometimes make us do things we otherwise wouldn't. Getting away with it once, however, can turn it into a habit, and this is the source of corruption.

Hypnosis is, without doubt, one of the most empowering ways we can break unwanted habits. But it is also a rich resource for developing good habits, values and beliefs. With a brain that is morally overhauled, life can only get better!

The research:
The brain responds less to money gained from immoral actions than to money earned decently, reveals a new UCL-led study. The research, published in Nature Neuroscience and funded by Wellcome, helps explain why most people are reluctant to seek illicit gains, by identifying a neural process that dampens the appeal of profiting at other people's expense. "When we make decisions, a network of brain regions calculates how valuable our options are," explained lead author Dr Molly Crockett of the University of Oxford, who carried out the research while based at the UCL Wellcome Centre for Neuroimaging. "Ill-gotten gains evoke weaker responses in this network, which may explain why most people would rather not profit from harming others. Our results suggest the money just isn't as appealing."

The research team scanned volunteers' brains as they decided whether to anonymously inflict pain on themselves or strangers in exchange for money. The study builds on previous research by the same team showing that people dislike harming others more than harming themselves. This behaviour was seen again in this study, with most people more willing to harm themselves than others for profit.

The study involved 28 pairs of participants, anonymously matched and randomly assigned to be either the 'decider' or the 'receiver'. Deciders picked between different amounts of money for different numbers of electric shocks. Half the decisions related to shocks for themselves and half to shocks for the receiver, but in all cases the deciders would get the money. The shocks were matched to each recipient's pain threshold to be mildly painful but tolerable. The deciders made their choices while in an fMRI brain scanner.

As they made their decisions, a brain network including the striatum was activated, as it has been shown in previous studies to be key to value computation. As they decided between more profitable options or those with fewer shocks, this brain network signalled how beneficial each option was. The network responded less to money gained from shocking others, compared with money gained from shocking oneself -- but only in those people who behaved morally. Meanwhile, the lateral prefrontal cortex (LPFC), a brain region involved in making moral judgments, was most active in trials where inflicting pain yielded minimal profit. In a follow-up study, participants made moral judgements about decisions to harm others for profit, and considered those same trials to be the most blameworthy. Taken together, the findings suggest the LPFC was assessing blame. When people refused to profit from harming others, this region was communicating with the striatum, suggesting that neural representations of moral rules might disrupt the value of ill-gotten gains encoded in the striatum.
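The idea of a value signal that is dampened for ill-gotten gains can be caricatured in a few lines of code. This is purely an illustrative sketch, not the authors' actual computational model: the `subjective_value` function, the harm-aversion parameters `KAPPA_SELF` and `KAPPA_OTHER`, and their values are all invented for the example. The only point it captures is that the same sum of money is worth less, subjectively, when earning it means shocking someone else.

```python
# Toy sketch of a dampened value computation (illustrative only; the
# parameter values and functions below are invented, not from the study).

KAPPA_SELF = 0.3    # hypothetical cost per shock when the shock hits oneself
KAPPA_OTHER = 0.6   # hypothetical (larger) cost per shock when it hits a stranger


def subjective_value(money, shocks, target):
    """Value of an offer = money on the table minus a harm-aversion cost.

    A larger cost parameter for 'other' mimics the finding that gains from
    harming others evoke weaker value signals than gains from harming oneself.
    """
    kappa = KAPPA_OTHER if target == "other" else KAPPA_SELF
    return money - kappa * shocks


def choose(offer_a, offer_b, target):
    """Pick the (money, shocks) offer with the higher subjective value."""
    return max(offer_a, offer_b, key=lambda o: subjective_value(o[0], o[1], target))


# The same 10-for-10-shocks deal is worth less when the shocks hit a stranger:
print(subjective_value(10, 10, "self"))   # 7.0
print(subjective_value(10, 10, "other"))  # 4.0
```

In this toy setup, a decider offered 10 units of money for 10 shocks versus 5 units for no shocks would take the money when shocking themselves but decline it when shocking a stranger, which is the shape of the behaviour the study reports.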

"Our findings suggest the brain internalizes the moral judgments of others, simulating how much others might blame us for potential wrongdoing, even when we know our actions are anonymous," Dr Crockett said. Senior author Professor Ray Dolan (UCL Max Planck Centre for Computational Psychiatry and Ageing Research) said: "What we have shown here is how values that guide our decisions respond flexibly to moral consequences. An important goal for future research is understanding when and how this circuitry is disturbed in contexts such as antisocial behaviour."

Story Source:  Materials provided by University College London. Note: Content may be edited for style and length.

Journal Reference:  1. Molly J Crockett, Jenifer Z Siegel, Zeb Kurth-Nelson, Peter Dayan, Raymond J Dolan. Moral transgressions corrupt neural representations of value. Nature Neuroscience, 2017; DOI: 10.1038/nn.4557