Archive for category Critical Thinking
It’s great to see the enthusiasm with which cognitive biases are discussed on social media. Occasionally, though, enthusiasm gets in the way of accuracy, and an explanation takes hold even though it isn’t quite right.
The latest example comes via the (normally excellent) I Fucking Hate Pseudoscience site and Facebook community. The post “Understanding Bias- What colour is this truck?” starts off well, pointing out that our judgement of the likelihood of getting attacked by a shark is biased by a number of factors: sensationalism in the media, the fact that the media are global rather than local, and the individual’s unconscious assumption that global information reflects local risks.
The sentence “The mental shortcut we use by making this assumption is an example of a heuristic.” is ambiguous, because the previous paragraph mentioned a bunch of processes, not all of which count as heuristics, but I’m happy to give the benefit of the doubt so far.
In the next paragraph, I start to question whether the article is actually about heuristics and biases at all. This is where the picture of a truck comes in.
It’s not exactly known as an educational site, but Cracked.com often comes up with engaging and well-written articles on critical thinking and psychology, with pointers to the underlying scientific research. I was pleased to see this latest article on “5 logical fallacies that make you wrong more often than you think”.
The five “fallacies” they explain (really they mean biases rather than logical fallacies) are:
- Confirmation bias
- Fundamental attribution error
- Neglect of probability
- The trust gap
- Argumentative theory of reasoning
The more I learn about critical thinking, the more I realise “logical fallacy” is a useless concept, and that “bias” is the concept that does the real work, but more on that another day.
In this short video, the initiators of the 10:23 campaign against homeopathy create some 30C vodka: in other words, vodka that has been diluted in water so many times that, statistically, not a single molecule of the original spirit remains. Should it be available on the National Health Service?
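The arithmetic behind that claim is simple: each “C” step is a 1:100 dilution, so 30C means an overall dilution factor of 10^60. A quick back-of-the-envelope sketch (the starting quantity is illustrative, not from the video):

```python
# 30C homeopathic dilution: dilute 1:100, thirty times over.
AVOGADRO = 6.022e23          # molecules per mole

# Start with one mole of ethanol (about 58 mL of pure alcohol).
molecules = 1.0 * AVOGADRO

for step in range(30):       # each "C" step is a 1:100 dilution
    molecules /= 100

# Expected number of original molecules remaining: effectively zero.
print(f"{molecules:.3g}")    # 6.02e-37
```

Put another way: you would need on the order of 10^36 such bottles before you could expect to find even one molecule of the original spirit.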
Though lots has been written about Bayes, I wanted to convey to a lay audience what he achieved and why it’s so important now. Here is an attempt at a set of “footnotes” for anyone who wants to follow up.
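The post doesn’t include a worked example, but a minimal sketch of the kind of calculation Bayes’ theorem enables (the numbers here are invented purely for illustration): how likely is a rare condition given a positive test result?

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
# Illustrative figures, not from the post: a condition affecting 1% of
# people, and a test with 99% sensitivity and a 5% false-positive rate.
prior = 0.01                 # P(condition)
sensitivity = 0.99           # P(positive | condition)
false_positive = 0.05        # P(positive | no condition)

# Total probability of a positive result, P(E)
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Posterior probability of the condition, given a positive test
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))   # 0.167 -- far lower than intuition suggests
```

Despite the “99% accurate” test, a positive result means only about a one-in-six chance of actually having the condition, because true positives are swamped by false positives from the healthy majority.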
Continuing my coverage of the blogs that discuss confirmation bias, I’m pleased to see Scott H Young’s post “Why I’ve Decided to Be Wrong More Often”, in which he discusses what you can do to be less biased in life. He recommends deliberately seeking out contrary opinions.
I also leave myself open to being wrong, and seek out ideas that disagree with me. I try to read books from authors with whom I disagree. I pay most attention to commenters who argue against an article I’ve written.
More central to his strategy is to accept that mistakes are inevitable, to prepare for them and to learn from them. This involves being open to the possibility of being wrong on quite fundamental things, including political or cultural beliefs that are tied up with personal identity. It also involves self-forgiveness: accept that you were wrong, learn and move on.
My goal is to be wrong about one big idea in my life, business or philosophy every month. I know if I’m not having big moments of wrongness at this frequency, it’s almost certainly because I’m ignoring other perspectives, not because I’m infallible.
There’s an analogy with venture capitalists: if all of the projects they back turn out to be viable, it suggests they haven’t taken enough risk. Venture capitalists aim to have a certain proportion of failures among the start-ups they back, though obviously at the time they don’t know which will succeed and which will fail.
A year ago, the Wikipedia article on Confirmation bias was in a poor state. Whoever had written it was well-intentioned but they’d been working from a small number of sources and perhaps hadn’t seen the big picture. I started a substantial rewrite. The community gave me a lot of help to make the text accessible, and a couple of weeks ago it reached the highest quality standard on Wikipedia: Featured Article. (“Confirmation bias” as it was on 10th August 2009 vs “Confirmation bias” now).
This week I learned it has been chosen as “Today’s Featured Article” for tomorrow (Friday 23rd July). A one-paragraph summary will appear on the front page, where it can be seen by around four million users. Around sixty thousand will click through to the article itself. It will also be seen via the dozen or so sites that mirror Wikipedia. With this new prominence, it is more likely the article will be translated into other languages (extracts have already been translated into Spanish and Catalan). There are also other delivery platforms: I’m already planning a spoken version of the article, but won’t have time to do it before tomorrow. As the natural first Google hit for “confirmation bias”, the article is already prominent (getting nearly a thousand hits per day), and it is regularly recommended and discussed on blogs and in online communities such as Reddit.
So, it’s fascinating to watch the ripple effect of this article to which I’ve contributed. Confirmation bias is something you’d definitely hear about if you do certain courses within a psychology degree, but it’s not exactly the sort of topic that you would expect to see stories about in the newspaper or the evening news. Hence it’s significant that perhaps millions of people will hear about it through this article. To be honest, this provokes mixed feelings.
Bad Science author Ben Goldacre has long wondered aloud why science stories in the media have to be purged of the crucial terminology such as “double-blind” and “randomisation” when the sports pages and financial pages are full of terminology that’s particular to those audiences. Blogger Mike Knell takes the idea a bit further in a well-observed satire.
On 14th February this year, British newspaper the Daily Mail inadvertently published a critical thinking test in its paper and web site. Some of the people who failed the test have been loudly proclaiming it on blogs, opinion columns and comments in forums such as The Guardian‘s Comment is Free. Apparently they don’t know they failed.
The article reported an interview with climate scientist Prof Phil Jones. The headline tells us that Prof. Jones has admitted that “there has been no global warming since 1995.” This article has been used as a “citation” by global warming deniers to show that there’s no scientific consensus on the reality of global warming. To a critical thinker, this should be very suspect. Let’s go through the mistakes one by one.
Confirmation bias is the tendency to seek out, interpret and remember information in ways that confirm our existing beliefs rather than genuinely testing them. In general, it’s an irrational preference for information that matches our expectations. This is one of the first biases I learned about, but recently I’ve been reading up on it in a more systematic way. I’m putting my notes directly into Wikipedia rather than improving my own site.
In what I’ve learned, there’s a massive irony that I’m surprised isn’t commented on. The term “confirmation bias” comes from the original pair of experiments from the 1960s by Peter C. Wason. Since then, and because of them, it has become widely accepted that subjects seek to confirm their working hypotheses rather than subject them to falsification.
However, those experiments didn’t prove the existence of a confirmation bias. There were logical errors in the interpretation of the results, pointed out especially in a 1987 Psychological Review paper by Joshua Klayman and Young-Won Ha, which is one of my all-time favourite academic papers (see the Wikipedia article for refs). Subsequent research has found genuine confirmation biases, but they’ve turned out to be specific to particular situations, rather than ubiquitous. When testing a hypothesis, people often seem to prefer a genuinely diagnostic strategy.
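Klayman and Ha’s point can be illustrated with Wason’s classic 2-4-6 task (a sketch of the logic, not their code or data): the experimenter’s hidden rule is “any ascending triple”, but subjects typically form a narrower hypothesis such as “increases by two” and then test only triples that fit it. Those positive tests can never expose the mismatch:

```python
# Wason's 2-4-6 task: the experimenter's hidden rule is broader than the
# subject's working hypothesis.

def hidden_rule(triple):
    """The experimenter's actual rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

# "Positive tests": triples chosen to FIT the hypothesis "increases by two".
positive_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]

# Diagnostic tests: triples that VIOLATE the hypothesis but might still
# satisfy the hidden rule -- only these can reveal the mismatch.
diagnostic_tests = [(1, 2, 3), (3, 6, 12)]

# Every positive test gets a "yes", so the too-narrow hypothesis survives...
print(all(hidden_rule(t) for t in positive_tests))    # True
# ...yet these hypothesis-violating triples also get a "yes", exposing it.
print(all(hidden_rule(t) for t in diagnostic_tests))  # True
```

This is why a positive-test strategy looks like “confirmation bias” but needn’t be irrational: it only fails when, as here, the true rule happens to be broader than the hypothesis being tested.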
Despite this critique, there is still a lot of psychological writing that takes the Wason experiments as proving the reality of confirmation bias. Even Sutherland does so in “Irrationality”, his outstanding paperback round-up of bias research.
So why were these experiments accepted so easily as proof when, for a long time, the evidence was inadequate? Because the conclusion fitted expectations built up from informal observation: in other words, a clear case of confirmation bias.
Joyful news today on the BBC about a successful campaign for the charity Sense About Science. They asked the World Health Organisation to comment on the use of homeopathic treatment for diseases like HIV, malaria, TB and infant diarrhoea, and various WHO authorities have responded, stating in very clear terms that these conditions need to be treated with actual medicine and actual evidence. SAS have passed on these WHO statements in an open letter to the world’s health ministers (PDF link). Using “remedies” without any active ingredient to “treat” these horrible diseases, when effective alternatives are available, is an obscenity.
It’ll be interesting to see how much popular attention this gets (it was second-most-read on the BBC News site this morning): is the tide finally turning against alternative medicine?