Some things, of course, are obviously either true or not true. “The atomic weight of carbon is 12” is not the sort of fact which admits of very much argument. Neither is it a fact which is very interesting to most people. Most people are much more interested in questions which admit of a great deal of argument. Donald Trump is going to be a disaster for the United States. Global warming is an existential threat. The European Union does more harm than good. Hmm. Maybe, maybe not.
I have always thought that a version of Occam’s razor is quite a good place to start. It is the general notion that, where there are competing explanations of something, the simplest is the more probable. Or as Bertrand Russell put it:
Whenever possible, substitute constructions out of known entities for inferences to unknown entities
One application of all of this is the cui bono rule, which is often attributed to Cicero, although Cicero himself ascribed it to Lucius Cassius. A court should ask itself who is likely to benefit from a particular crime; that person may well be the criminal.
It is not an infallible rule, of course. Nor is my variant, which goes something like this:
Where there are competing versions of the truth, the one most infected by cognitive bias is the one less likely to be true.
Putting flesh on the bones of Occam’s razor, the simplest explanation as to why someone thinks something is true is often that he or she wants to believe that it is true.
Or re-jigging Cicero, the entity that benefits from a particular explanation might well be the left side of the brain, not the right.
The trouble with this is that the science of cognitive bias has become really rather complex. Dozens of cognitive biases have been identified. There is rather a good list of them on the Better Humans website, the Cognitive bias cheat sheet.
There are people who have done a considerable amount of research into whether religious beliefs and so forth are to be explained by one or more of these cognitive biases. Unless you happen to be religious, you are likely to find it quite easy to accept that religious belief is the result of cognitive bias. Lots of people very much want to believe in the existence of a deity, and the smarter they are, the better they will be at finding ways of rationalising that belief.
My strong suspicion is that belief in the risk of catastrophic global warming, fear of Brexit, loathing of Donald Trump and all sorts of other things are most simply explained by these cognitive biases. But the task of explaining precisely why those biases lead so many people today into these unjustified fears is far from straightforward.
So I will just say that I think 2017 is not going to be a bad year. And if enough people repeat this, we might just get a bit of confirmation bias rolling!
  Cassius ille quem populus Romanus verissimum et sapientissimum iudicem putabat identidem in causis quaerere solebat “cui bono” fuisset.
The famous Lucius Cassius, whom the Roman people used to regard as a very honest and wise judge, was in the habit of asking, time and again, “To whose benefit?”
 Availability heuristic, Attentional bias, Illusory truth effect, Mere exposure effect, Context effect, Cue-dependent forgetting, Mood-congruent memory bias, Frequency illusion, Baader-Meinhof Phenomenon, Empathy gap, Omission bias, Base rate fallacy
Anchoring, Contrast effect, Focusing effect, Money illusion, Framing effect, Weber–Fechner law, Conservatism, Distinction bias
Confirmation bias, Congruence bias, Post-purchase rationalization, Choice-supportive bias, Selective perception, Observer-expectancy effect, Experimenter’s bias, Observer effect, Expectation bias, Ostrich effect, Subjective validation, Continued influence effect, Semmelweis reflex
Confabulation, Clustering illusion, Insensitivity to sample size, Neglect of probability, Anecdotal fallacy, Illusion of validity, Masked man fallacy, Recency illusion, Gambler’s fallacy, Hot-hand fallacy, Illusory correlation, Pareidolia, Anthropomorphism
Group attribution error, Ultimate attribution error, Stereotyping, Essentialism, Functional fixedness, Moral credential effect, Just-world hypothesis, Argument from fallacy, Authority bias, Automation bias, Bandwagon effect, Placebo effect
Hindsight bias, Outcome bias, Moral luck, Declinism, Telescoping effect, Rosy retrospection, Impact bias, Pessimism bias, Planning fallacy, Time-saving bias, Pro-innovation bias, Projection bias, Restraint bias, Self-consistency bias
Overconfidence effect, Egocentric bias, Optimism bias, Social desirability bias, Third-person effect, Forer effect, Barnum effect, Illusion of control, False consensus effect, Dunning-Kruger effect, Hard-easy effect, Illusory superiority, Lake Wobegon effect, Self-serving bias, Actor-observer bias, Fundamental attribution error, Defensive attribution hypothesis, Trait ascription bias, Effort justification, Risk compensation, Peltzman effect
Sunk cost fallacy, Irrational escalation, Escalation of commitment, Loss aversion, IKEA effect, Processing difficulty effect, Generation effect, Zero-risk bias, Disposition effect, Unit bias, Pseudocertainty effect, Endowment effect, Backfire effect
Peak–end rule, Leveling and sharpening, Misinformation effect, Duration neglect, Serial recall effect, List-length effect, Modality effect, Memory inhibition, Part-list cueing effect, Primacy effect, Recency effect, Serial position effect, Suffix effect