Given a choice between a risky decision and a safe decision, people choose differently depending on whether the payoffs are described as gains or as losses. This is known as “loss aversion”. Laurie Santos and her colleagues worked out how to give monkeys a choice that could be presented either as a gain or as a loss. The monkeys' choice patterns matched the behaviour of humans, as she reveals in this TED talk, which really gets going after about eight minutes. It turns out that “a monkey financial advisor is just as dumb as your human financial advisor.”
Continuing my coverage of the blogs that discuss confirmation bias, I’m pleased to see Scott H Young’s post “Why I’ve Decided to Be Wrong More Often”, in which he discusses what you can do to be less biased in life. He recommends deliberately seeking out contrary opinions.
I also leave myself open to being wrong, and seek out ideas that disagree with me. I try to read books from authors with whom I disagree. I pay most attention to commenters who argue against an article I’ve written.
More central to his strategy is to accept that mistakes are inevitable, to prepare for them and to learn from them. This involves being open to the possibility of being wrong on quite fundamental things, including political or cultural beliefs that are tied up with personal identity. It also involves self-forgiveness: accept that you were wrong, learn and move on.
My goal is to be wrong about one big idea in my life, business or philosophy every month. I know if I’m not having big moments of wrongness at this frequency, it’s almost certainly because I’m ignoring other perspectives, not because I’m infallible.
There’s an analogy with being a venture capitalist: if all of the projects they back turn out to be viable, it suggests they haven’t taken enough risk. Venture capitalists will aim to have a certain proportion of failures among the start-ups they back, though obviously at the time they don’t know which will succeed and which will fail.
I’ve been following with interest the discussion of confirmation bias on Twitter, blogs and Wikipedia discussion pages. My favourite of the blog posts is one by Rogue Medic which expands on and talks through the Francis Bacon quote that appears in the Wikipedia article. It highlights another area where confirmation bias can lead to disaster.
A year ago, the Wikipedia article on Confirmation bias was in a poor state. Whoever had written it was well-intentioned but they’d been working from a small number of sources and perhaps hadn’t seen the big picture. I started a substantial rewrite. The community gave me a lot of help to make the text accessible, and a couple of weeks ago it reached the highest quality standard on Wikipedia: Featured Article. (“Confirmation bias” as it was on 10th August 2009 vs “Confirmation bias” now).
This week I learned it has been chosen as “Today’s Featured Article” for tomorrow (Friday 23rd July). A one-paragraph summary will appear on the front page, where it can be seen by around four million users. Around sixty thousand will click through to the article itself. It will also be seen through the dozen or so sites that mirror Wikipedia. With this new prominence, it is more likely the article will be translated into other languages (extracts have already been translated into Spanish and Catalan). There are also other delivery platforms: I’m already planning a spoken version of the article, but won’t have time to do it before tomorrow. As the article is naturally the first Google hit for “confirmation bias”, it already has high prominence (getting nearly a thousand hits per day), and it is regularly recommended and discussed on blogs and in online communities such as Reddit.
So, it’s fascinating to watch the ripple effect of this article to which I’ve contributed. Confirmation bias is something you’d definitely hear about if you took certain courses within a psychology degree, but it’s not exactly the sort of topic you would expect to see stories about in the newspaper or on the evening news. Hence it’s significant that perhaps millions of people will hear about it through this article. To be honest, this provokes mixed feelings.
Bad Science author Ben Goldacre has long wondered aloud why science stories in the media have to be purged of crucial terminology such as “double-blind” and “randomisation” when the sports pages and financial pages are full of terminology that’s particular to those audiences. Blogger Mike Knell takes the idea a bit further in a well-observed satire.
Dan Pink’s talk (previously featured) and some wonderful, witty animation combine to make a short film about the psychology of motivation. This illustrates why the economic concept of incentive is problematic: it’s just not the case that more monetary incentive means that work will be done more enthusiastically. Pink comments on the success of projects such as Wikipedia which are dependent on free labour.
The above is an extract from a forty-minute talk that you can see in full without the illustration.
This morning I have mixed feelings from seeing something I’ve worked on being heavily praised, but for the wrong reasons.
Erik Fernandez, a blogger, has created a slide show about cognitive biases. I haven’t examined it carefully, but it seems like all the text is taken, or at least lightly adapted, from two Wikipedia articles: Cognitive bias and List of cognitive biases. I know this because I recognise my own text in the slide show. Under the terms of the Creative Commons licence, Erik is entitled to copy this material and make derivative works, but not to pass it off as his own work.
These articles are a long way from finished, and in their partial state they can be actively misleading. As one of the authors, this is partially my fault. They’re better than nothing, but they’re not ready for wide publicity.
That’s why I’m concerned that, over the last couple of days, the slide show has been getting a huge amount of attention by being featured on the high-traffic blogs BoingBoing and LifeHacker. These blog posts treat the slide show as an original work and make no mention (because Fernandez doesn’t) of where the text comes from.
Derren Brown gave us some interesting skeptical TV viewing this week, investigating the Liverpool-based “psychic medium” Joe Power in a Channel 4 documentary. Brown has previously covered the topic of contacting the dead in a special called “Messiah”, which I highly recommend.
It seemed pretty clear to me that Power was using a combination of “cold reading” (drawing out information from the sitter and feeding it back) and “hot reading” (using information obtained in advance): techniques that Brown and Richard Wiseman explained during the programme. In the heated final exchange, Brown tried to get Power to admit to being a “fake” and asked him “How do you sleep at night?” Power responded with righteous indignation.
If a so-called medium is using non-paranormal means to create the effects of paranormal powers, does that mean they are consciously faking? Maybe not. I’m going to argue that a lot of psychics fall into the space between “genuine” and “knowing fake”.
Would you rather have £100 in 12 months’ time, or £110 in 13 months’ time? Most of us would take the latter option and wait just a month longer for 10% more money.
But now, move everything forward by 12 months: would you rather have £100 right now, or wait a month for £110? Suddenly an extra month seems a long time to wait: many of us would rather take the hundred.
This is an example of what’s called a preference reversal. The choice is essentially the same in both cases, but merely by shifting it in time, we can affect which option people prefer. Even if £110 seems more attractive to you in both cases, a similar reversal will still appear when we fine-tune the times and amounts of money.
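The reversal above can be reproduced with a simple hyperbolic discounting model (a sketch of one standard account from behavioural economics, not something from this post: the subjective value of a reward A delayed by t months is taken to be A / (1 + kt), and the discount rate k = 0.2 per month is an illustrative assumption):

```python
def subjective_value(amount, delay_months, k=0.2):
    """Hyperbolic discounting: a delayed reward's value falls steeply
    over short delays but only slowly over long ones."""
    return amount / (1 + k * delay_months)

# Distant choice: £100 in 12 months vs £110 in 13 months
far_100 = subjective_value(100, 12)   # ≈ 29.4
far_110 = subjective_value(110, 13)   # ≈ 30.6 → waiting the extra month wins

# Immediate choice: £100 now vs £110 in 1 month
now_100 = subjective_value(100, 0)    # 100.0 → taking £100 now wins
near_110 = subjective_value(110, 1)   # ≈ 91.7
```

Because the discount curve is hyperbolic rather than exponential, the one-month gap looms large when it is imminent but barely matters a year out, which is exactly the preference reversal described above.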