Perhaps the most important bias to be aware of is Anchoring Bias. This is the baseline we construct to make our decisions. In most cases it is focussed on, or overly reliant on, a single piece of information. For example, if colour was the trait you most relied on when choosing your first car, it may continue to be so when you buy all subsequent cars, even though colour may not really be that significant. Once the anchor is set, other judgements are made by interpreting information around that anchor. For example, in a sale a lower price for a phone will seem reasonable because you are measuring it against your initial anchor, even if the price is still considerably higher than what the phone is worth.
Availability Heuristic Bias
As we have said, behind all the biases is a mental shortcut. The availability heuristic is our reliance on information that comes to mind quickly. If a number of related events or situations spring into your thoughts when you are making a decision, you give greater credence to that information. If something is in the news or has been repeated many times on social media, it will probably spring to mind. It is why, for example, most people would think being a police officer is a far more dangerous job than being a logger, whilst the reverse is in fact true. Equally surprising, due to our availability bias, is finding out that more people are killed by cows each year than by terrorists.
Bandwagon Bias
The bandwagon bias is a kind of groupthink, based on the observation that behaviours and beliefs spread among people just as trends do. As more people come to believe something, others jump on the bandwagon, despite the lack of underlying evidence or sometimes even in the face of a strong counter-argument. We don’t analyse the information. This can be because of a desire to conform or simply because we are so used to getting our information this way. We just step on board. One of the strongest examples of the bandwagon effect is in politics, where a campaign that is on the ‘up’ can gather speed and supporters incredibly fast.
Outcome Bias
Outcome bias is the tendency to place too much importance on the outcome of a decision rather than the process by which the decision was taken. When we make a bad decision that results in a poor outcome, we are much more likely to be self-critical than when we make a bad decision that results in a neutral or positive outcome. The example often given is that of the poor manager who makes a decision on ‘gut instinct’ rather than on the advice of a team who are strongly in favour of the opposite decision. If the outcome is positive, the manager will consider his process a good one, even though his team’s rationale may have been far sounder. If the outcome is poor, the manager will be much more likely to listen to his team in the future. Another important example: if we make a dangerous decision (say, drink driving) that does not result in a poor outcome, we are more likely to make the same decision again.
Placebo Bias
The placebo bias is probably the best known. We have all heard of medical experiments where a certain proportion of the test cases were given a placebo (usually a sugar pill) instead of the drug being tested. For many, the placebo is just as much of a miracle treatment as the ‘real’ painkiller or treatment. It is not a deception or a self-delusion; those who are ‘cured’ are not cheating, lying or insane. They aren’t hypochondriacs who were not ill in the first place. The placebo cure rate varies from 15–72%. Generally, the longer the period of treatment and the higher the number of physician visits, the greater the effect. Nor is it restricted to relief of pain or mood: the physical changes are real. For example, studies of asthma patients for whom placebo drugs have worked show less restriction of the bronchial tubes. It is a direct result of brain action, a product of expectation and of chemical changes. The human brain anticipates outcomes, and anticipation produces those outcomes, as if the brain were producing its own desired results. And the greater the pain, the greater the relief; the more relief we desire, the more we attain.
There are a few fallacies, common beliefs that undermine resilience, that are worth detailing briefly. The first, linked to personalising, is the fallacy of always being right. This can be a persistent belief for those who have developed the habit, and it makes contradictory evidence very hard to accept. Then there is the fallacy of ‘heaven’s reward.’ A great many of us expect sacrifice and self-denial to pay off, as if someone were keeping track of our various deeds. When this expectation isn’t met, we become resentful and cynical.
Two others are the fallacy of change and the fallacy of fairness. The fallacy of change is the expectation that we will be able to change people. The fallacy of fairness is our belief that we understand what is fair and other people don’t. Both lead to resentment and make relationships very hard to sustain.
Looking For More Book Resources?
Extra content and additional useful exercises are available for Unlock You readers.