Archive for category Critical Thinking
It’s great to see the enthusiasm with which cognitive biases are discussed on social media. Occasionally, though, enthusiasm gets in the way of accuracy, and an explanation takes hold even though it isn’t quite right.
The latest example comes via the (normally excellent) I Fucking Hate Pseudoscience site and Facebook community. The post “Understanding Bias- What colour is this truck?” starts off well, pointing out that our judgement of the likelihood of getting attacked by a shark is biased by a number of factors: sensationalism in the media, the fact that the media are global rather than local, and the individual’s unconscious assumption that global information reflects local risks.
The sentence “The mental shortcut we use by making this assumption is an example of a heuristic” is ambiguous, because the previous paragraph mentioned several processes, not all of which count as heuristics, but I’m happy to give the benefit of the doubt so far.
In the next paragraph, I start to question whether the article is actually about heuristics and biases at all. This is where the picture of a truck comes in: Read the rest of this entry »
It’s not exactly known as an educational site, but Cracked.com often comes up with engaging and well-written articles on critical thinking and psychology, with pointers to the underlying scientific research. I was pleased to see this latest article on “5 logical fallacies that make you wrong more often than you think”.
The five “fallacies” they explain (really they mean biases rather than logical fallacies) are:
- Confirmation bias
- Fundamental attribution error
- Neglect of probability
- The trust gap
- Argumentative theory of reasoning
The more I learn about critical thinking, the more I realise “logical fallacy” is a useless concept, and that “bias” is the concept that does the real work, but more about that another day.
In this short video, the initiators of the 10:23 campaign against homeopathy create some 30C vodka: in other words, vodka that has been diluted in water until not a single molecule of the original remains. Should it be available on the National Health Service?
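The arithmetic behind “not a single molecule remains” is worth seeing. A 30C preparation is diluted 1:100 thirty times, a factor of 10^60, which dwarfs Avogadro’s number. A minimal sketch, assuming (purely for illustration) that we start with a full mole of the original substance:

```python
# Rough estimate of how many molecules survive a 30C dilution.
# Illustrative assumption: we start with one mole of the original substance.
AVOGADRO = 6.022e23          # molecules per mole
DILUTION_FACTOR = 100 ** 30  # 30C = diluted 1:100, thirty times over = 1e60

starting_molecules = 1 * AVOGADRO
expected_remaining = starting_molecules / DILUTION_FACTOR

print(f"{expected_remaining:.1e}")  # ~6.0e-37: effectively zero molecules left
```

Even with that generous starting quantity, the expected number of surviving molecules is about 10^-36, so the odds of even one molecule remaining in the bottle are vanishingly small.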
Though lots has been written about Bayes, I wanted to convey to a lay audience what he achieved and why it’s so important now. Here is an attempt at a set of “footnotes” for anyone who wants to follow up: Read the rest of this entry »
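For readers who want to see Bayes’ theorem in action, here is a minimal sketch using the classic medical-screening example (the prevalence and test-accuracy figures are illustrative assumptions, not from any particular study):

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
# Illustrative numbers: 1% prevalence, 99% sensitivity, 5% false-positive rate.
prior = 0.01           # P(disease)
sensitivity = 0.99     # P(positive test | disease)
false_positive = 0.05  # P(positive test | no disease)

# Total probability of a positive test, summed over both hypotheses
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Posterior probability of disease given a positive test
posterior = sensitivity * prior / p_positive

print(f"{posterior:.0%}")  # ~17%, far lower than most people's intuition
```

The striking part is that a “99% accurate” test still leaves the patient more likely healthy than not, because the false positives from the large healthy population swamp the true positives from the small diseased one.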
Continuing my coverage of the blogs that discuss confirmation bias, I’m pleased to see Scott H Young’s post “Why I’ve Decided to Be Wrong More Often”, in which he discusses what you can do to be less biased in life. He recommends deliberately seeking out contrary opinions.
I also leave myself open to being wrong, and seek out ideas that disagree with me. I try to read books from authors with whom I disagree. I pay most attention to commenters who argue against an article I’ve written.
More central to his strategy is to accept that mistakes are inevitable, to prepare for them and to learn from them. This involves being open to the possibility of being wrong on quite fundamental things, including political or cultural beliefs that are tied up with personal identity. It also involves self-forgiveness: accept that you were wrong, learn and move on.
My goal is to be wrong about one big idea in my life, business or philosophy every month. I know if I’m not having big moments of wrongness at this frequency, it’s almost certainly because I’m ignoring other perspectives, not because I’m infallible.
There’s an analogy with being a venture capitalist: if all of the projects they back turn out to be viable, it suggests they haven’t taken enough risks. Venture capitalists will aim to have a certain proportion of failures among the start-ups they back, though obviously at the time they don’t know which will succeed and which will fail.
A year ago, the Wikipedia article on Confirmation bias was in a poor state. Whoever had written it was well-intentioned but they’d been working from a small number of sources and perhaps hadn’t seen the big picture. I started a substantial rewrite. The community gave me a lot of help to make the text accessible, and a couple of weeks ago it reached the highest quality standard on Wikipedia: Featured Article. (“Confirmation bias” as it was on 10th August 2009 vs “Confirmation bias” now).
This week I learned it has been chosen as “Today’s Featured Article” for tomorrow (Friday 23rd July). A one-paragraph summary will appear on the front page, where it can be seen by around four million users, and around sixty thousand of them will click through to the article itself. It will also be seen via the dozen or so sites that mirror Wikipedia. With this new prominence, the article is more likely to be translated into other languages (extracts have already been translated into Spanish and Catalan). There are also other delivery platforms: I’m already planning a spoken version of the article, but won’t have time to record it before tomorrow. As the first Google hit for “confirmation bias”, the article already gets nearly a thousand hits per day, and it is regularly recommended and discussed on blogs and in online communities such as Reddit.
So, it’s fascinating to watch the ripple effect of this article to which I’ve contributed. Confirmation bias is something you’d definitely hear about if you take certain courses within a psychology degree, but it’s not exactly the sort of topic that you would expect to see stories about in the newspaper or on the evening news. Hence it’s significant that perhaps millions of people will hear about it through this article. To be honest, this provokes mixed feelings. Read the rest of this entry »
Bad Science author Ben Goldacre has long wondered aloud why science stories in the media have to be purged of crucial terminology such as “double-blind” and “randomisation”, when the sports pages and financial pages are full of terminology that’s particular to those audiences. Blogger Mike Knell takes the idea a bit further in a well-observed satire.