This morning I listened (by chance rather than intention) to a BBC Radio 4 programme on the post-truth world and thought it insightful and marvellous. “Post-truth” is the word of the year in the UK and Germany (and probably many other countries), but is it really new? Politicians have always lied, and we all know that the world is not rational and voting is based more on guts than brain–as, indeed, is much of our “thinking.” The core argument of the programme was that there’s nothing very new about post-truth, but that the past year has made us think about it and begin to understand it better.
You can access the podcast at http://www.bbc.co.uk/programmes/p04m7zrs (or at least you can if you’re in the UK), and I strongly recommend listening to it. But in case you can’t, or don’t have the time, I’ve tried (mainly for my own benefit) to capture what I can remember from the programme. (I wish there were a written version.)
- Narrative trumps facts, but only temporarily. Once facts cause the narrative–for example, that presented by the Brexiters, Donald Trump, Theresa May, or Jeremy Corbyn–to collapse, things can become very dark.
- We believe whom we trust regardless of the accuracy of their facts.
- Even the most numerate of us interpret data in the light of our values. (I see that with the battles over statins and the weekend effect in medicine.)
- We trust ourselves the most–we are the leading experts on our own lives. But we should trust ourselves less–for at least 10 minutes a day.
- We all live in “bubbles,” and social media (where we select our friends and whom we follow) strengthen those bubbles. Psychogeography shows that people with similar beliefs are concentrated–either because they move to be with like-minded people or because the people around us influence what we believe (it’s probably both). This bubble effect is probably stronger now than in the past.
- We should try to listen every day to people with different values from our own.
- Trump spoke to a “deeper truth” in people.
- Corrections of wrong facts that we believe cause us to believe those wrong facts even more strongly (“the backfire effect”).
- A deeply conservative Southern politician lost his seat because he came to believe in climate change. (Interestingly, it was his son, who was equally conservative on most issues, who caused him to change.) It was “rational” for people not to vote for him because rejecting climate change is a badge that shows you are “one of us.” By changing his beliefs he had betrayed his people.
- We easily rationalise away facts that don’t fit with our values.
- Facts and truth are not the same thing. (We’ve long known that.)
- People vote on values and trust not facts. Facts are irrelevant. (We’ve long known that too.)
- We are likely to carry on believing a story (even when told it’s not true) until we are presented with a story that fits our values and that we like better.
- To be understood by an audience that knows only Spanish you have to speak in Spanish. It’s the same when speaking to a conservative (or liberal) audience.