Fight the top 20 biases and make better decisions
According to numerous studies, we are poor decision makers and our brains perform only moderately well. The process of decision making has long been a black box, but step by step we are gaining more insight. Now that data-driven work is becoming the norm, improving decision making is a must. That is why we crawl into the brains of decision makers to see what decision making is, how the brain works and what you can do to combat biases. Why do educated and generally wise managers and dedicated, motivated employees make bad decisions? How do you prevent bias and noise during the crucial process of decision making in organizations? This blog is about decision making and the tough fight against biases. Consult our 20 tips and take your first step towards better decisions today.
In this article
- What is a bias? (meaning)
- What is noise?
- Two examples of noise during decision making
- The 20 best tips for pure decision making
- Simon’s Decision Making Process
- Don’t let decision making get out of hand
- Decision making: not choosing is also choosing
- Avoid the most important pitfalls during decision making
- Two classic videos about decision making
What is a bias? (meaning)
The word bias has many meanings across different fields, such as statistics and psychology, but one of the most powerful definitions, and the one we use here, is:
A bias is a systematic deviation in human judgments and investigations.
Within statistics, bias means a systematic error: a so-called non-sampling error that produces a deviation from the correct result with a systematic cause. In other words, it is an impurity, a deviation from the expected and/or true value.
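This statistical sense of bias is easy to see in a toy simulation (all numbers invented; Python standard library only): random errors average out as the sample grows, while a systematic error does not.

```python
import random
import statistics

random.seed(42)
true_value = 10.0

# Random (sampling) error only: individual measurements are off,
# but the errors cancel out on average.
unbiased = [true_value + random.gauss(0, 1) for _ in range(100_000)]

# Systematic (non-sampling) error: a constant +0.5 offset that no
# amount of extra data removes.
biased = [true_value + 0.5 + random.gauss(0, 1) for _ in range(100_000)]

print(round(statistics.mean(unbiased), 2))  # close to the true value, 10.0
print(round(statistics.mean(biased), 2))    # systematically off, near 10.5
```

Collecting more data shrinks the random scatter, but the 0.5 offset, the bias, stays.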
In psychology, bias has an entirely different meaning. Psychologists speak of cognitive bias when there is an irrational, erroneous train of thought.
Managers and decision makers, but in fact everyone, should be much more aware of the subjective nature of the human brain and should study the psychology of choice. Biases undermine pure decision making and lead to suboptimal decisions.
What is noise?
Kahneman defines noise as the unwanted variability in assessments of identical problems. In his book “Noise,” he provides numerous examples.
Two examples of noise during decision making
Bias and noise, systematic deviation and random scatter respectively, are different components of error in human judgment that can lead to wrong decisions. Two examples for clarification.
Example 1: legal inequality due to noise
In a criminal case, two or more different judges should in principle reach the same verdict based on the same facts and circumstances in an identical case. In reality, noise throws a spanner in the works and the judgments are miles apart. There are mild and severe judges, for example, and mood and environmental factors, such as the weather, play a role in sentencing. Sentencing varies tremendously, creating legal inequality.
Example 2: mismatch due to noise
Reviewers in a job application process should be able to unanimously identify the best candidate based on objective data, such as education, work experience, skills and so on. In reality, recruiters and members of an application committee also unconsciously let their likes, dislikes and preferences weigh in, and group dynamics distort an objective assessment. The wrong candidate is hired. The cost of such mismatches is enormous.
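The judge example above can be turned into a minimal sketch that separates the two error components; all numbers below are invented purely for illustration:

```python
import random
import statistics

random.seed(7)
fair_sentence = 24  # months: the "correct" verdict for this identical case

# Each judge's sentence = fair value + shared systematic skew + personal scatter.
shared_bias = 6  # every judge in this court skews severe (bias)
sentences = [fair_sentence + shared_bias + random.gauss(0, 8)
             for _ in range(10_000)]

bias = statistics.mean(sentences) - fair_sentence  # systematic deviation
noise = statistics.stdev(sentences)                # spread between judges

print(round(bias, 1))   # roughly 6: the whole court is too severe
print(round(noise, 1))  # roughly 8: identical cases get very different verdicts
```

Bias shifts every verdict in the same direction; noise makes the outcome depend on which judge you happen to draw.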
The 20 best tips for pure decision making
Healthy, sensible choices are those that have not been distorted by a variety of biases, fallacies or opinions. Below we give twenty examples of cognitive biases that affect our decision making and Business Intelligence results. At the same time, we give you tips on how to combat these biases.
Just as you fight sleep during an overnight car ride, you should also fight biases (every day). That may sound exaggerated, but making the right decisions is vital for organizations. Every manager, decision maker, data analyst or BI specialist should therefore be able to recite the following biases by heart:
- BANDWAGON EFFECT. People trust, often wrongly, the opinion of the majority. Psychologists speak of the bandwagon effect. Even if people have seen with their own eyes that a view is wrong, one third of them will unquestioningly endorse the majority's opinion, right or wrong. Therefore, always rely on your own perceptions and data.
Figure 1: The mechanism of cognitive dissonance visualized
- COGNITIVE DISSONANCE. Accept that you sometimes make wrong choices. Managers and employees don't like making wrong choices; it frustrates them. They therefore ignore or reject new information because it does not fit the "consistent" picture they have built, and they don't feel like changing their behavior or opinion. This is a form of cognitive dissonance (Festinger, 1957).
- CONCRETE VERSUS ABSTRACT PARADOX. Base your decisions on a consistent and complete data set, not on incidental experiences. A bias occurs when concrete information overshadows abstract information such as summaries or reports (the concrete versus abstract paradox). A customer service manager who gets a call from two angry customers about delayed deliveries has less faith in statistics telling him the total number of delayed deliveries is at an all-time low. People have a great penchant for concrete stories (storytelling).
Figure 2: The confirmation bias only creates fixed ideas
- CONFIRMATION BIAS. In confirmation bias, a person only looks for examples that confirm the hypothesis (Vandenbosch, 1997). If someone thinks that a specific product is performing poorly and expects that this product is responsible for the red numbers, he will only analyze the data of the product in question. If the product is indeed performing poorly, the initial hypothesis is considered correct, while other products may have contributed equally or even more strongly to the red numbers. Therefore, don't get stuck in your own entrenched ideas.
- CONFOUNDING. Monitor the composition of the study and control groups. Confounding is a bias that occurs when a third variable, related to both the independent and the dependent variable, distorts the apparent causal relationship between the two. For example, if you want to measure the effect of a medical treatment, the study and control groups must have the same average age and overall baseline condition. Otherwise, you quickly draw wrong conclusions.
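A toy simulation (invented numbers, no real medical data) shows how a confounder such as age can fabricate a treatment effect out of thin air:

```python
import random
import statistics

random.seed(1)

def outcome(age):
    # Ground truth in this toy model: the treatment has NO effect;
    # the health score worsens with age only.
    return 100 - 0.8 * age + random.gauss(0, 5)

patients = []
for _ in range(20_000):
    age = random.uniform(20, 80)
    treated = random.random() < age / 100  # older patients are treated more often
    patients.append((age, treated, outcome(age)))

treated_scores = [o for a, t, o in patients if t]
control_scores = [o for a, t, o in patients if not t]

# Naive comparison: the treated group scores worse, purely because it is older.
print(round(statistics.mean(treated_scores) - statistics.mean(control_scores), 1))

# Control for the confounder: compare only patients aged 45-55.
strat_t = [o for a, t, o in patients if t and 45 <= a <= 55]
strat_c = [o for a, t, o in patients if not t and 45 <= a <= 55]
print(round(statistics.mean(strat_t) - statistics.mean(strat_c), 1))
```

The naive difference is large and negative; within a narrow age band it nearly vanishes, revealing that age, not the treatment, drove the gap.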
- CORRELATION VERSUS CAUSATION. Don't confuse correlation with causation. Relying only on positive "hits" is dangerous: a person infers a causal relationship between two events based solely on the observations in which both events occurred, disregarding the cases where one or neither occurred.
- FRAMING. Framing is the deliberate selection and presentation of words or images with the purpose of evoking positive or negative associations in the recipient. The ultimate goal is to influence opinion on a particular topic or product. Framing as bias is often used in politics, but also in business, to “push” people in a certain direction of thinking. A word like “climate terrorist” is an example of a negative frame. By stating that the glass is half full, rather than half empty, you are using a positive frame. By “framing” issues, you create a one-sided, subjective image. So beware of framing.
Figure 3: According to Gestalt laws, our brain makes a logical reconstruction of the whole
- GESTALT LAWS. Understand the Gestalt laws. We see too much structure and take pattern-seeking too far. According to the Gestalt laws, people naturally look for patterns, symmetry, unity and consistency. Where things appear to form one unit, they often turn out to be interrupted; where things appear regular, they turn out to be random. This can easily create a false overall picture. People often perceive a complete circle even when it is not quite closed: we make a logical reconstruction of the whole while missing a piece of information. Incidentally, we also often use this to our advantage.
- GROUP POLARIZATION. After a group discussion, people come out with an even more extreme point of view, regardless of what was discussed during that discussion. Jurors who already do not trust the defendant at the outset are only reinforced in their opinions after the group discussion. They complement each other’s arguments and are then confirmed in them. This is called group polarization. Related to this is the group thinking mechanism, in which group members omit their own opinions in order to reach a consensus. Therefore, be alert to group polarization.
- HALO EFFECT. Make a new, objective assessment each time. The halo effect is the tendency of evaluators to judge a person, organization, or product positively based on one perceived positive character trait, performance, or product feature. People assume, based on that one positive assessment, that the other traits or qualities will also be present.
- MEMORY BIAS. Realize that people have a bad memory. We speak of a memory bias, for example, when, for the purpose of a study, patients remember certain things differently from the way they really happened, such as whether or not they took medication.
- INFORMATION BIAS. Define the parameters unambiguously. Information bias is caused by measurement error, or the difference between what you measure and what you want to measure. The error can arise from the lack of an unambiguous definition of the parameters. Also, the bias can be due to inadequate information that a patient gives to a doctor, for example, such as not reporting a particular complaint.
- PRIMING. Always be alert for manipulation. Priming, according to Wikipedia, is the cognitive phenomenon that a stimulus evokes a faster or stronger response in the brain if that stimulus has been observed before. For example, in an experiment conducted at Yale University, subjects had to decide after a brief job interview whether to hire or reject a project manager. On the elevator ride up, half of the subjects had an ice-cold cup of cola pushed into their hands; the other half got a mug of hot coffee. When evaluating the interview, the "cold" group had strong doubts about the suitability of the candidate, while the "warm" group was very enthusiastic about the candidate.
- OVERCONFIDENCE BIAS. Know your own limitations. Managers, as well as professionals, structurally overestimate their abilities and judgmental qualities. Psychologists speak of an overconfidence bias. People have an excessive self-confidence that is not based on facts. For example, it’s always others who can’t drive.
- CHOICE BLINDNESS. In choice blindness, we defend our choices after the fact, even when they are not our own. In an experiment, a researcher asks a subject to choose which of two portrait photographs shows the person they would rather have a snack with. After the choice is made, the researcher uses a sleight of hand to hand over not the chosen photo but the other one, and then asks why the subject chose that photo. In over 70% of cases, the subject explains why they chose that particular photo, even though it is really the other one. Sometimes the chosen photo even showed a person wearing glasses, while the swapped photo showed someone without. Therefore, always stay focused.
- SELECTION BIAS. Make sure your sample is representative. Selection bias occurs when the selected sample does not accurately reflect the population you are investigating.
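A short sketch (invented satisfaction scores) makes the danger concrete: if dissatisfied customers are more likely to end up in your sample, the sample mean says little about the population.

```python
import random
import statistics

random.seed(3)

# Population: satisfaction scores 1-10, centred around 6.
population = [min(10, max(1, round(random.gauss(6, 2)))) for _ in range(100_000)]

# Selection bias: only customers who volunteer a review are sampled,
# and dissatisfied customers respond far more often (to complain).
def responds(score):
    return random.random() < (0.8 if score <= 4 else 0.2)

sample = [s for s in population if responds(s)]

print(round(statistics.mean(population), 2))  # the true average satisfaction
print(round(statistics.mean(sample), 2))      # noticeably lower: the sample misleads
```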
- SELECTIVE PERCEPTION. Expand your field of vision. We see only what we want to see or expect to see. This phenomenon of a narrowed field of view is called selective perception (Dearborn and Simon, 1958). Human beings describe and visualize the world according to their own abilities, experiences and tools. If all you have is a hammer, you perceive many "problems" as nails. When a Dutchman is abroad, it is the Dutch license plates that stand out.
- CHANGE BLINDNESS. Don’t get distracted. People often completely miss major changes. This is called change blindness. An applicant reports to the reception desk of a company. A blond gentleman with a pink shirt welcomes the applicant. The blond then disappears behind the counter and his colleague with a blue shirt and brown hair comes up in place of the blond and hands the applicant a questionnaire. Moments later, the interviewer asks the applicant, “Did you notice anything unusual at the front desk?” More than 50% of the applicants say they did not notice anything unusual.
Figure 4: Loss aversion visualized in a graph
- LOSS AVERSION. Take your losses in a timely manner. In humans, the pain of losing an amount weighs about twice as heavily as the pleasure of gaining the same amount. Humans naturally have an exceptional aversion to loss; Kahneman and Tversky call this loss aversion. Loss aversion impedes pure thinking and can lead to irrational and even risky decisions, even when money is not at stake. Patients who are told by their doctor that a certain treatment offers a 90% chance of survival choose that treatment more often than patients who are told there is a 10% chance of death.
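Kahneman and Tversky captured this asymmetry in the value function of prospect theory. A minimal sketch using the parameter estimates commonly cited from their work (α ≈ 0.88, λ ≈ 2.25):

```python
ALPHA = 0.88   # diminishing sensitivity: each extra euro matters a bit less
LAMBDA = 2.25  # loss aversion: losses weigh roughly twice as heavily as gains

def value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0) versus a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# Losing 100 euros hurts far more than winning 100 euros pleases:
gain, loss = value(100), value(-100)
print(round(-loss / gain, 2))  # pain-to-pleasure ratio, 2.25 by construction
```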
- WISHFUL THINKING. Avoid wishful thinking. Jumping to conclusions and wishful thinking are well-known phenomena. People draw conclusions too quickly and thus arrive at an incomplete judgment of a situation, rather than ruling things out step by step and meticulously. In addition, people believe something because it is advantageous to them, while the facts point to the opposite. You can hope, believe or wish that a virus or recession will soon disappear, but for pure decision making it is better to rely on (scientific) facts.
The above list is certainly not exhaustive. On Wikipedia you can find many more biases. To understand how you can avoid biases, it is important to also consider the process of decision making.
Simon’s Decision Making Process
Figure 5: Simon and his four stages of the decision making process
Early on, Herbert Simon made great efforts to map out the human decision making process. His notion of bounded rationality refers to the limited availability of information, our cognitive limitations and the limited time available to reach a clean decision. Our brain falls short, and we often think the computer offers a solution. Thanks to increased computing capacity, algorithms can nowadays take thousands of parameters into account during the decision making process. Yet human interpretation sometimes remains necessary.
One of the most important lessons to be learned from Simon is that you have to learn to understand data.
Based on Simon's work, vendors such as IBM developed the first automated decision support systems. Always look critically at the scientific basis of a described bias, particularly its validity: are you measuring what you want to measure?
Don’t let decision making get out of hand
In his famous essay The Abilene Paradox: The Management of Agreement (1974), J.B. Harvey puts his finger on another sensitive spot: a group can take a decision that none of the individual group members would ever have chosen on their own. No one wants or dares to break loyalty to the group, and deviating feels like betrayal from the start. When too little attention is paid to alternative perspectives, the shared information bias arises. Many strategy meetings, for example, lead to disappointing, wrong choices because the groupthink mechanism encourages groups to decide based on average positions. This leads to weak decisions that are then often a reason to call yet another team session.
Decision making: not choosing is also choosing
Not choosing is of course also an option. But the point is: you don’t want to be forced to make a choice later on in the decision making process. If you don’t make a choice now, the market will probably make it for you later. Unfortunately, in such situations the choice is rarely in favor of your company.
Avoid the most important pitfalls during decision making
What can you do in general to make better choices? How do you avoid some of the above pitfalls?
It’s nice to agree with each other, but unanimity doesn’t usually lead to the best decisions. You can avoid this pitfall. Before you actually make a decision, pause and force all group members to express all their reservations about the choice you are about to make. The quality of your decisions will increase.
Put a youthful person on the board. It turns out that, as time goes on, problems are made more complicated than they are: over-rationalization occurs. Young children often come up with the simplest and most effective solution to a problem.
The 5 Why’s: start in-depth research and continue to ask questions
During decision making, use the "five whys" technique: ask the why question five or more times. In this way you get closer to the essence of why you really want something or what causes the problem. You want to grow? Why? Because it is necessary. Why? To build up reserves. Why do you need reserves? Because a recession is looming. Why are we not recession-proof? Because we only supply consumers. Why was that chosen? Etcetera.
Five considerations in decision making
- Choosing does not limit, it gives you maximum freedom within a new framework.
- Always check that the choices you make in your plans are logical and valid and that fear or other emotions do not dominate.
- Also check that your choices and the preceding observations you have made have not fallen prey to the traps in your cognitive system.
- Apply the principle of Karl Popper, philosopher of science. Try to prove the opposite of something, instead of looking for those cases that are positive for a specific option.
- Avoid “yes-men,” “yes-but figures,” and “yes-no figures” (saying yes and doing no). They are a danger to good decision-making and therefore a danger to the continuity of the company.
Two classic videos about decision making
Finally, our decision making has dark sides, as the BBC revealed in the documentary How to make better decisions. For example, people are more likely to believe anecdotal evidence than the scientific evidence of experts. And people have a strong preference for attributing more truth to the concrete than to the abstract. And so there are numerous snags that impede making good choices. In the documentary mentioned above, oversimplification, laziness, and bias are listed as the three main obstacles. This does not even include the power of our subconscious and the power of environmental factors.
The follow-up documentary How You Really Make Decisions will also be an eye-opener for many people. These documentaries show various experiments that demonstrate how simple it is to influence and manipulate the process of deciding and decision making.
The quality of human decision making is pretty lousy; our choices are usually poor or irrational. It is as if an autopilot takes over and paralyzes our thoughts. From now on, treat decision making as a standardized process: make the considerations you have weighed visible and be extra alert to biases and noise that cloud the decision. Fight biases like a lion.