Wednesday, September 27, 2017

Understanding the Influential Mind



It is tempting to say that we live in the “age of influence,” though of course every age is an “age of influence”—ours has just been super-charged by social media. Tali Sharot, the founder and director of the Affective Brain Lab at University College London, set out to understand the neuroscience behind influence. Why do some things move our opinions, while others leave us cold? Her book, “The Influential Mind,” is an exploration of these and other puzzles. She answered questions from Mind Matters editor Gareth Cook.

Why is it that providing evidence does not always prove persuasive to people?
Evidence tends to be very persuasive when it already fits your world view, but less so when it does not agree with your preconceived notions. This is because data is assessed in light of what we already believe (what cognitive scientists call "priors"), and in fact that is a reasonable approach. For example, if I were to tell you I had observed a pink elephant flying in the sky, you would assume I was lying or delusional – as you should. On average, when you encounter a piece of data that contradicts something you believe with confidence about the world, that piece of data is in fact wrong. So the further the new evidence is from your belief, the less likely it is to alter that belief. This approach to changing our beliefs makes sense. However, a side effect of this process is that strong opinions are very difficult to change, even if they are wrong.
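This weighing of new evidence against a prior can be sketched as a toy Bayesian update. The sketch below is purely illustrative – the update() helper and the pink-elephant numbers are assumptions invented for this example, not anything from the book or the interview – and it simply shows why a single surprising report barely moves a confident prior.

# Minimal illustration (assumed numbers, not from the book): Bayes' rule on
# the hypothesis "flying pink elephants exist" after one reported sighting.

def update(prior, p_report_if_true, p_report_if_false):
    """Posterior probability of a hypothesis after one observed report."""
    numerator = prior * p_report_if_true
    return numerator / (numerator + (1 - prior) * p_report_if_false)

# The prior on flying pink elephants is, rightly, tiny.
prior = 1e-6
# A sighting report is far more likely if people sometimes lie or hallucinate
# than if the claim were true, but grant the report some evidential weight.
posterior = update(prior, p_report_if_true=0.9, p_report_if_false=0.01)
print(f"{posterior:.6f}")  # ~0.000090: the confident belief barely budges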

There is, however, one situation in which people embrace new information even if it contradicts what they already believe – when the new information is exactly what they want to hear.

For example, back in August 2016 Ryan McKay and Ben Tappin asked 900 US citizens to indicate who they thought would win the presidential election. The researchers also asked people who they wanted to win – half wanted Trump to win. The majority of both Trump supporters and Clinton supporters believed Clinton would win. Then new polls were presented predicting a Trump victory, and everyone was asked again to indicate who they believed would win. Did the new polls alter expectations of what was to unfold? They did, but mostly among the Trump supporters, who were elated by the new information. The Clinton supporters altered their prediction only a little bit, many choosing to ignore the data altogether.

Our immediate reaction when we receive information we do not want to hear – whether it is related to politics, a doctor warning us of the dangers of drinking, or negative feedback about ourselves – is to try to rationalize it away, discount it, or ignore it altogether.

How can we break through this resistance?
When it comes to altering how you respond to information, awareness can help. When you find yourself dismissing negative feedback or convincing yourself that your critics do not know what they are talking about, take a pause and reevaluate. Could there be merit in the negative information, and can you use it to improve?

When it comes to getting your message across to others, consider whether you can reframe the information you provide to highlight the possibility of progress rather than decline. To be clear, this does not mean sugar-coating what you have to say. If, for example, you need to critique someone’s work, do not soften the critique – convey the problem clearly. However, the problem can be communicated either in terms of what needs to be corrected in order to produce good work or in terms of incompetence: the first approach will induce less resistance and thus be more effective.

It must be interesting to be involved in this kind of work at a time when there is so much concern about the state of our public debate. I wonder what perspective you have on it?
I am concerned about the negative effects of social media. All we know about human biases – conformity, overconfidence and so on – suggests that the abundance of information and opinions on the web will result in misinformation, false beliefs and polarization. And we already see this happening. We can now find information online to support any view or opinion we wish, and that makes us more confident in our opinions and more resistant to change.

In one study Andreas Kappes and I, together with others, asked volunteers to come into the lab in pairs and simultaneously scanned their brains in two MRI scanners while they were making decisions together. We found that when a duo agreed, each person’s brain activity reflected precise encoding of the other’s opinions. As a consequence, when two people agreed, their confidence in their decision grew significantly. However, when they disagreed, their brains became less sensitive to the information presented by the other person. In fact, it looked as if the brain, metaphorically speaking, was shutting down. This is what is happening online – people respond to others who agree with them and dismiss those who do not (sometimes viciously), and the result is escalation.

Do you have any suggestions for what we can do to improve things?
In general, to protect us online from things like fake news, trolling, and offensive messages, much more regulation is needed. Laws and regulations need to catch up with our times, and penalties need to be put in place to deter people from such behavior. Today it is legally permitted, for example, to use someone else’s photo on social media as your profile photo and then post racist comments, creating the impression that the person photographed is behind those comments. It is legally permitted to leave offensive sexual comments on any discussion forum or social media site. Websites need to take responsibility and make changes.

As a side note – science suggests that the threat of a penalty is especially effective in deterring people from acting (e.g., deterring someone from posting an offensive comment), while rewards are better for motivating action (e.g., motivating people to post a comment). I talk at length about the science behind this distinction in chapter 3 of my book.

In terms of controlling your own reactions, it is a good practice to slow down when using platforms like Twitter and to consciously reflect on your reactions. Science has shown that waiting just a couple of minutes before making judgments reduces the likelihood that they will be based solely on instinct.




via Scientific American: Mind & Brain http://ift.tt/n8vNiX

