8 Oct 2020

Some warning signs of wishful thinking

Sorting out whether we believe something for a good reason, or because we want it to be true (or, oddly, because we fear it to be true), is hard. Even those I consider very rational (including myself) are not always good at this, and some of the most partisan exploit this trait ruthlessly. However, there are some (fallible) signals that can help alert one to such wishful thinking, such as the following.

  1. It sanctions me doing something I want to do (drive in the car when I could have cycled, take an international holiday, eat a huge slice of over-rich chocolate cake etc.)
  2. It helps me criticise/attack/despise something I already dislike/think is bad (a government, a politician, a law, a restriction, the green movement, Capitalism, foreign aid, Brexit supporters etc.)
  3. The formulation of the belief shifts over time (from "smoking is not harmful" to "there is no evidence it is" to "it is harmful but only one of a complex of factors", from denying the earth's temperature is rising to denying it is due to humans to denying it is worth stopping etc.)
  4. The belief is constructed so that it is hard to disprove (conspiracy theories are often like this, but so too are many political claims, e.g. some of the claims of the Brexit or Remain camps, the benefit of homoeopathic cures, crystals)
  5. It sanctions me not doing something I do not want to do (find a job if I am lazy, get exercise if I am unhealthy, change my mind if I do not want to, admit to being wrong if this is embarrassing, avoid doing my expenses, wear a mask, wear a cycle helmet etc.)
  6. The use of obviously weak arguments to support the belief (it does not cost much anyway, I was going that way anyway, it is my right to do it, it won't harm anyone, Dominic Cummings did it so why not me, anything X says is rubbish, etc.)
  7. The invention of new supporting arguments only formulated when old ones are revealed to be weak or wrong (in economics - ok we know people are not rational but collectively they behave as if they are, the climate is warming due to the sunspot cycle etc.)
  8. The belief signals membership of a group I wish to belong to (holocaust denial, global warming will result in the extinction of all humans, greed is good etc.)
  9. Support for the belief rests on the (claimed) persecution of its proponents or on the weakness of the arguments of those opposing it (Big Pharma would want this, they say there is a magic money tree, various Nationalist claims, the Government does not want you to know this etc.)
  10. It is involved in a highly political or personalised argument (Brexit, HCQ, Republican/Democrat, Immigration, lockdown, taxes etc.)
  11. There is no positive evidence for the belief - rather a (perceived) lack of negative evidence (herbal remedies, superstitions, free will/denial of free will, chemicals in the drinking water are affecting me, self-supporting arguments such as everyone is lying etc.)
  12. Hype, ridicule or insults are used to defend the belief (if you follow that line you would not be able to do anything, "socialist" in the US, only a stupid person would believe that, that is what they want you to believe, Trump is the best/worst president ever etc.)
  13. All my friends/group/kind believe it - though this is often not an explicit/conscious reason (shaving, our group is superior to others/outsiders, our technique is better, British humour is unique and similar national myths, any diet that involves harm to animals is wrong etc.)
  14. It is too interesting - too surprising, funny, odd etc. False beliefs are not constrained by boring facts and thus can be far more engaging (internet memes, the earth is hollow, you too can be thin with this simple trick etc.)
  15. It is comforting or otherwise gives me status (Earth is the centre of the universe, Humans are the pinnacle of creation/evolution, simple theories are more likely to be true, your country is special/unique etc.)
  16. New phrases/words are used that are invented by believers - because this indicates this is more a group membership thing than a matter of truth (sheeple, Remoaners, follow the crumbs etc.)
  17. It can be expressed in very few words and has lots of CAPITALS and exclamation marks!! (political slogans, ads, tweets, etc.)
  18. Support is mostly via a list of personal endorsements (these are easy to collect at a trivial level and very hard to check)
  19. When critiqued, the response is not to engage with the argument but to reply with something else (the warning sign here is a lack of interest in the basis for the belief - the belief comes first)
  20. It is contrary to general opinion since this makes the believers special and different (and hence gives status) - the narrative of the prophet in the wilderness or one man against the system (and, yes, in common narratives - e.g. films - this person is almost invariably male).


I am NOT saying that these are infallible signs of a wrong belief (some of them hold for some truths), and I am NOT saying one should not do these things (e.g. critiquing something you feel is wrong might well be a good thing to do). It is just that each of these should give one pause to question the corresponding belief a bit more than one might otherwise do. If one's reflection on the grounds for the belief indicates they are not so solid, then think of independent ways/evidence to check that belief, or just shift to a less certain position, noting the doubt.

Clearly, I should populate this list with references to evidence (but who would read that anyway ;-), which I may get around to. However, I will update this list when I find more suggestions to add.

Also, one should distinguish these warning signs from indicators that are merely not entirely reliable guides to truth, including the following (nothing outside formal systems can be 100% proved).

  • That a clever person says it (like me 8-)
  • That there is a debate or many views about this (that there are climate deniers or anti-vaxxers in TV debates does not mean they have strong arguments or reliable evidence)
  • That it is repeated by many users or on many websites (that Biden used an earpiece in the debate, that the Oklahoma bomber was an Islamic refugee etc.)
  • There are non-peer-reviewed/non-rigorous papers that claim this (look at the pre-print literature on COVID-19)
  • I find it a useful/insightful way of thinking about things (if something can help one get to true insights it might equally help one to misleading ones)
  • It is plausible given what I know (plausibility makes it a hypothesis not a fact)
  • It fits with all my other knowledge/beliefs (but those might also be wrong)
  • It was taught to me at school (I was taught continental drift was false)
  • I read it in a book/newspaper (some authors, editors, journalists and publishers take care to restrict publication to statements that are well supported, but many do not)
  • An example of this is documented (examples are good starting points, but not enough to support generalisation)
  • It is complicated or technical (being impressive is not enough to make it true, though it does indicate more effort has been put into it)
  • That it is in the interest of some institution or bloc for you to disbelieve it (one cannot ignore power relations when assessing truth, but the effect of power is complex)