In managing the R&D pipeline we are constantly called upon to make decisions and estimates on the basis of thin or inadequate information. We have to estimate how long a particular task will take, or how many people will be needed, and decisions have to be made about whether to start a project or cancel it. But there are always many unknowns, particularly in the early stages of a project, so everyone knows that, however hard you try, decisions inevitably involve a good element of intuition.
Intuition can be very powerful: a doctor can often tell almost at once what the problem is, and an art historian may be able to spot a fake with barely a moment’s study. Their intuitions are very reliable because they are based on a great deal of relevant experience and practice. But when faced with novel situations (as is so often the case when we make judgements about R&D projects) our intuition is surprisingly easily led astray. There is a whole body of work on this in the field of Behavioural Economics.
One startling example is Anchoring: our tendency, when called upon to make a judgement on something unfamiliar, to be biased by a recent input. Try asking two groups of people separately to estimate, say, the height of the highest point in Australia, or the age of the oldest tree in the world. But start by suggesting a figure, high for one group and low for the other, and ask them to say whether the true height or age is more or less than that. Then ask for their actual estimate. The estimates they give are surprisingly skewed by the suggestion: according to Kahneman the effect on the estimates is typically one third to one half of the difference between the “anchors”, so if one group is anchored at 1,000 metres and the other at 3,000 metres, their average estimates will typically end up some 700 to 1,000 metres apart. His hugely popular book, ‘Thinking, Fast and Slow’*, gives an easily accessible overview of the issues. Some other sources of bias that he discusses are:
- Framing. The way a proposition is presented can have a significant effect on one’s reaction to it. For example, people place more value on avoiding a loss than on achieving an equivalent gain (and indeed are more willing to take risks to avoid a loss), so describing the same choice in terms of losses rather than gains can change the decision.
- The ‘halo effect’. One well-rated aspect of a proposition improves the rating given to other aspects, even if they are unrelated. For example, appearance or physique affects how we judge a person’s opinions: the views of a handsome or impressive person tend to command respect. More powerfully, the opinion of high-status individuals such as senior managers tends to carry more weight than that of others, even in subjects where they have no special expertise. We have all witnessed, and perhaps been guilty of, laughing loudest at the boss’s jokes, even though bosses are seldom appointed for their skill as comedians…
- ‘The law of small numbers’. We tend to assume that a small number of samples (for example, our own experiences) gives a much more accurate picture of an uncertain reality than it really does. Interestingly, this applies just as strongly to professional statisticians when they are not acting professionally. One effect is to make us treat our own experience, however limited, as a far better guide than it actually is.
- Groupthink. Janis** showed that the desire for harmony or conformity in a group can result in poor decisions, held with high confidence. It seems that groups of humans, especially males, often come to a shared decision in which each person’s confidence reinforces that of the others, leading to an exaggerated sense of certainty. Anyone who has seen the Hollywood classic “Twelve Angry Men” will recognise how much strength and resilience it takes to hold a different point of view. Groupthink has arguably contributed to several high-profile poor decisions, such as those behind the Challenger Space Shuttle tragedy of 1986 and the disastrous “Bay of Pigs” invasion of Cuba by the USA. Kennedy learned from that experience and took steps to prevent an easy consensus from developing when he later faced the Cuban Missile Crisis.
- Availability. When asked to judge how likely or common something is, we are strongly influenced by how readily instances of it can be called to mind. Recent experiences, or things often read about or discussed, therefore tend to bias decision-making.
What can we do about this?
The first thing to recognise is that everyone has a different background of knowledge and experience, and we are all biased in different ways. So it is always advisable to pool the experience of a small team of people rather than to rely on a single expert, particularly as the person with the most understanding of the project is likely to be biased in its favour. Then you can help the team to be as objective as possible by:
- Collecting as much relevant information as possible and presenting it to the team members in an objective way (see the paper on GlaxoSmithKline’s portfolio approach).
- Getting the participants to study the information and form their own opinions before they meet to thrash out a decision. This not only helps to avoid Anchoring, the Halo effect and Groupthink but also gives people time to check things they don’t know rather than being forced to react “off the cuff” in the meeting. Failing that, at least give people time within the meeting to think individually and post their ideas on a board.
- Either collecting all the views before the meeting, or having participants reveal them simultaneously during it (“Planning Poker”).
- Reaching a decision by debate rather than, for example, taking an average of the opinions. After all, someone with one of the more extreme views may have knowledge that nobody else does and may be able to persuade the others. But it may be best if the highest-status person speaks last.
- If it is a major or contentious decision, suggesting “let’s sleep on this and see if we all feel the same in the morning”.
Read more:
*Thinking, Fast and Slow, Daniel Kahneman, Allen Lane, 2011.
**Groupthink, 2nd edition, Irving L. Janis, Houghton Mifflin, 1982.
Recommended and posted by Rick Mitchell.