Archive for category Philosophy
It’s great to see the enthusiasm with which cognitive biases are discussed on social media. Occasionally, though, enthusiasm gets in the way of accuracy, and an explanation takes hold even though it isn’t quite right.
The latest example comes via the (normally excellent) I Fucking Hate Pseudoscience site and Facebook community. The post “Understanding Bias- What colour is this truck?” starts off well, pointing out that our judgement of the likelihood of getting attacked by a shark is biased by a number of factors: sensationalism in the media, the fact that the media are global rather than local, and the individual’s unconscious assumption that global information reflects local risks.
The sentence “The mental shortcut we use by making this assumption is an example of a heuristic” is ambiguous, because the previous paragraph mentioned a bunch of processes, not all of which count as heuristics, but I’m happy to give the benefit of the doubt so far.
In the next paragraph, I start to question whether the article is actually about heuristics and biases. This is where the picture of a truck comes in: Read the rest of this entry »
Though lots has been written about Bayes, I wanted to convey to a lay audience what he achieved and why it’s so important now. Here is an attempt at a set of “footnotes” for anyone who wants to follow up: Read the rest of this entry »
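For readers who want a concrete taste of what the footnotes cover, here is a minimal sketch of Bayes’ theorem in action. The scenario and numbers below are purely illustrative (not taken from the post): a test for a rare condition, where the low base rate dominates the result.

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' theorem.

    prior: P(condition) before seeing the test result
    sensitivity: P(positive | condition)
    false_positive_rate: P(positive | no condition)
    """
    # P(positive) by the law of total probability
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Illustrative numbers: a 1% base rate, a 90%-sensitive test,
# and a 5% false-positive rate.
posterior = bayes_posterior(prior=0.01, sensitivity=0.9, false_positive_rate=0.05)
print(round(posterior, 3))  # roughly 0.154 — far lower than intuition suggests
```

Despite the test being accurate in the everyday sense, a positive result leaves only about a 15% chance of actually having the condition, because true positives from the rare condition are swamped by false positives from the common non-condition. This is exactly the kind of base-rate reasoning that makes Bayes so important now.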
Here is some draft text for a statement of my intellectual interests. They are bold, sometimes speculative, conclusions intended to draw people into the subject. Points 1 and 3 are argued in my thesis (which I’m putting online right now). Point 2, about egotism, is something that I’ve always been interested in but haven’t written much about yet. Carol Tavris and Elliot Aronson wrote perhaps the definitive book on the topic, “Mistakes Were Made (but not by me)”. Point 4 is the most tentative. It touches on political philosophy, where I’m not an expert, and I don’t plan to assemble a detailed argument. I’ve decided to float the idea, though, and see how it shapes my or others’ thinking.
1. A lot of what we think of as human irrationality is actually rational behaviour in pursuit of biased goals. For example, in forming opinions we don’t always seek true opinions; we sometimes prefer opinions that justify our behaviour or give us comfort. “Everybody knows” that most human beings are stupid or ignorant; what psychology has shown is that all human beings have many directional, systematic biases.
2. The only real cognitive failing – the one with disastrous effects – is egotism. There is nothing wrong with knowing nothing about a subject, so long as you don’t offer opinions or make important decisions about it. There is nothing wrong with having a flawed memory or flawed judgement, so long as you don’t deny those flaws and insist you’re right. Egotism and overconfidence turn an error into a disaster.
3. A scientist’s conclusions reflect his or her values, but this isn’t in itself a bad thing, because there are distinctive scientific values, for example the value of truth, predictive power or informativeness. Not all values are biases. Science is the best system we have for identifying and overcoming bias.
4. The scientific process requires honesty, freedom of speech, that any idea can be challenged, and that ideas prove themselves against evidence, not ideology. The political process of an open, free society requires that any idea can be challenged, that decision-makers can be held responsible, and that they justify their actions rather than sticking to an ideology. Development of (one aspect of) personal virtue requires that we challenge our own ideas, admit our mistakes and learn from them, and identify our own delusions and try to remove them. The three ideas of scientific inquiry, political liberalism and epistemic virtue have the same root, both historically and logically.