8.2 Problem-Solving: Heuristics and Algorithms

Learning Objectives

  1. Describe the differences between heuristics and algorithms in information processing.

When faced with a problem to solve, should you go with intuition or with more measured, logical reasoning? Obviously, we use both of these approaches. Some of the decisions we make are rapid, emotional, and automatic. Daniel Kahneman (2011) calls this “fast” thinking. By definition, fast thinking saves time. For example, you may quickly decide to buy something because it is on sale; your fast brain has perceived a bargain, and you go for it quickly. On the other hand, “slow” thinking requires more effort; applying it in the same scenario might lead us not to buy the item because we have reasoned that we don’t really need it, that it is still too expensive, and so on. Neither fast nor slow thinking guarantees good decision-making; each can fail when employed at the wrong time, and it is not always clear which is called for, because many decisions have a level of uncertainty built into them. In this section, we will explore some of the applications of these tendencies to think fast or slow.

Heuristics

We will now look more closely at our thought processes, specifically at some of the problem-solving strategies that we use. Heuristics are information-processing strategies that are useful in many cases but may lead to errors when misapplied. A heuristic is a principle with broad application, essentially an educated guess about something. We use heuristics all the time, for example, when deciding what groceries to buy from the supermarket, when looking for a library book, when choosing the best route to drive through town to avoid traffic congestion, and so on. Heuristics can be thought of as aids to decision making; they allow us to reach a solution without a lot of cognitive effort or time.

The very ease with which heuristics help us reach decisions is also their potential downfall: the solution they provide is not necessarily the best one. Let’s consider some of the most frequently applied, and misapplied, heuristics in the table below.

 

Table 8.1. Heuristics that pose threats to accuracy
| Heuristic | Description | Examples of Threats to Accuracy |
| --- | --- | --- |
| Representativeness | A judgment that something that is more representative of its category is more likely to occur | We may overestimate the likelihood that a person belongs to a particular category because they resemble our prototype of that category. |
| Availability | A judgment that what comes easily to mind is common | We may overestimate the crime statistics in our own area because these crimes are so easy to recall. |
| Anchoring and adjustment | A tendency to use a given starting point as the basis for a subsequent judgment | We may be swayed towards or away from decisions based on the starting point, which may be inaccurate. |

In many cases, we base our judgments on information that seems to represent, or match, what we expect will happen, while ignoring other potentially more relevant statistical information. When we do so, we are using the representativeness heuristic. Consider, for instance, the data presented in the table below. Let’s say that you went to a hospital, and you checked the records of the babies that were born on that given day. Which pattern of births do you think you are most likely to find?

 

Table 8.2. The representativeness heuristic
| List A | List B |
| --- | --- |
| 6:31 a.m. Girl | 6:31 a.m. Boy |
| 8:15 a.m. Girl | 8:15 a.m. Girl |
| 9:42 a.m. Girl | 9:42 a.m. Boy |
| 1:13 p.m. Girl | 1:13 p.m. Girl |
| 3:39 p.m. Boy | 3:39 p.m. Girl |
| 5:12 p.m. Boy | 5:12 p.m. Boy |
| 7:42 p.m. Boy | 7:42 p.m. Girl |
| 11:44 p.m. Boy | 11:44 p.m. Boy |
Note: Using the representativeness heuristic may lead us to incorrectly believe that some patterns of observed events are more likely to have occurred than others. In this case, list B seems more random, and thus is judged as more likely to have occurred, but statistically both lists are equally likely.

Most people think that list B is more likely, probably because list B looks more random, and matches — or is “representative of” — our ideas about randomness, but statisticians know that any specific sequence of four girls and four boys is mathematically equally likely. Whether a boy or girl is born first has no bearing on what sex will be born second; these are independent events, each with a 50:50 chance of being a boy or a girl. The problem is that we have a schema of what randomness should be like, which does not always match what is mathematically the case. Similarly, people who see a flipped coin come up “heads” five times in a row will frequently predict, and perhaps even wager money, that “tails” will be next. This behaviour is known as the gambler’s fallacy. Mathematically, the gambler’s fallacy is an error: the likelihood of any single coin flip being “tails” is always 50%, regardless of how many times it has come up “heads” in the past.
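To see why, note that each birth is an independent event with a 50:50 chance of either sex, so any particular sequence of eight births has probability (1/2)^8 = 1/256. The short simulation below, a Python sketch written for illustration rather than taken from the text, generates random birth sequences and counts how often each exact list appears; over many trials the two counts converge on the same value.

```python
import random

# Each birth is independent, with a 50:50 chance of boy (B) or girl (G),
# so any exact sequence of eight births has probability (1/2)**8 = 1/256.
LIST_A = "GGGGBBBB"  # List A: four girls, then four boys
LIST_B = "BGBGGBGB"  # List B: the same mix in a "random-looking" order

TRIALS = 1_000_000
counts = {"A": 0, "B": 0}
for _ in range(TRIALS):
    sequence = "".join(random.choice("BG") for _ in range(8))
    if sequence == LIST_A:
        counts["A"] += 1
    if sequence == LIST_B:
        counts["B"] += 1

# Both counts hover around TRIALS / 256, roughly 3,906 each: the "orderly"
# list is exactly as likely as the "random-looking" one.
print(counts)
```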

The representativeness heuristic may explain why we judge people on the basis of appearance. Suppose you meet your new next-door neighbour, who drives a loud motorcycle, has many tattoos, wears leather, and has long hair. Later, you try to guess their occupation. What comes to mind most readily? Are they a teacher? Insurance salesman? IT specialist? Librarian? Drug dealer? The representativeness heuristic will lead you to compare your neighbour to the prototypes you have for these occupations and choose the one that they seem to represent the best. Thus, your judgment is affected by how much your neighbour seems to resemble each of these groups. Sometimes these judgments are accurate, but they often fail because they do not account for base rates, the actual frequencies with which these groups exist. In this case, the group with the lowest base rate is probably drug dealer.

Our judgments can also be influenced by how easy it is to retrieve a memory. The tendency to judge the frequency or likelihood of an event by the ease with which instances can be retrieved from memory is known as the availability heuristic (MacLeod & Campbell, 1992; Tversky & Kahneman, 1973). Imagine, for instance, that I asked you to indicate whether there are more words in the English language that begin with the letter “R” or that have the letter “R” as the third letter. You would probably answer by trying to generate examples of each, thinking of all the words you know that begin with “R” and all that have “R” in the third position. Because it is much easier to retrieve words by their first letter than by their third, we may incorrectly guess that there are more words that begin with “R,” even though there are in fact more words that have “R” as the third letter.
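Claims like this one can be checked by counting rather than by retrieval from memory. Here is a minimal sketch, assuming a Unix-style word list at /usr/share/dict/words; the path, the word list, and therefore the exact counts vary by system. (Tversky and Kahneman’s claim concerns word frequency in English text generally, so a dictionary count is only an approximation.)

```python
# A minimal sketch for checking the "R" claim against a word list, rather
# than relying on whichever words come to mind most easily.
# Assumes a Unix-style dictionary file; the path varies by system.
with open("/usr/share/dict/words") as f:
    words = [line.strip().lower() for line in f]

first_r = sum(1 for w in words if w.startswith("r"))
third_r = sum(1 for w in words if len(w) >= 3 and w[2] == "r")

print(f"begin with r: {first_r}")
print(f"r in third position: {third_r}")
```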

The availability heuristic may explain why we tend to overestimate the likelihood of crimes or disasters: those that are reported widely in the news are more readily imaginable, and things that are easy to imagine or to remember from watching the news are judged to occur frequently. Availability bias does not just affect our thinking; it can change behaviour. For example, homicides are usually widely reported in the news, leading people to make inaccurate assumptions about the frequency of murder. In Canada, the murder rate has dropped steadily since the 1970s (Statistics Canada, 2018), but this decline tends not to be reported, leading people to overestimate the probability of being affected by violent crime. In another example, doctors who had recently treated patients suffering from a particular condition were more likely to diagnose the condition in subsequent patients because they overestimated its prevalence (Poses & Anthony, 1991).

The anchoring and adjustment heuristic is another example of how fast thinking can lead to a decision that might not be optimal. Anchoring and adjustment is easily seen when we are faced with buying something that does not have a fixed price. For example, if you are interested in a used car, and the asking price is $10,000, what price do you think you might offer? Using $10,000 as an anchor, you are likely to adjust your offer from there, and perhaps offer $9,000 or $9,500. Never mind that $10,000 may not be a reasonable anchoring price. Anchoring and adjustment does not just happen when we’re buying something. It also operates in any situation that calls for judgment under uncertainty, such as sentencing decisions in criminal cases (Bennett, 2014), and it applies to groups as well as individuals (Rutledge, 1993).

Algorithms

In contrast to heuristics, which can be thought of as problem-solving strategies based on educated guesses, algorithms are problem-solving strategies that use rules. Algorithms are generally a logical set of steps that, if applied correctly, should be accurate. For example, you could make a cake using heuristics — relying on your previous baking experience and guessing at the number and amount of ingredients, baking time, and so on — or using an algorithm. The latter requires a recipe that provides step-by-step instructions: the recipe is the algorithm. Unless you are an extremely accomplished baker, the algorithm should provide you with a better cake than heuristics would. While heuristics offer a solution that might be correct, a correctly applied algorithm is guaranteed to provide a correct solution. Of course, not all problems can be solved by algorithms.
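To make the contrast concrete, here is a small illustrative sketch in Python, with made-up distances rather than an example from the text. The heuristic (always drive to the nearest unvisited stop) answers quickly but is only an educated guess; the algorithm (check every possible ordering) follows rules that guarantee the shortest route.

```python
from itertools import permutations

# Hypothetical driving distances between a start point A and three stops
# (made-up numbers, chosen so the heuristic's answer is good but not best).
dist = {
    ("A", "B"): 1, ("A", "C"): 2, ("A", "D"): 5,
    ("B", "C"): 3, ("B", "D"): 4,
    ("C", "D"): 7,
}

def d(x, y):
    """Distance lookup in either direction."""
    return dist[(x, y)] if (x, y) in dist else dist[(y, x)]

def length(route):
    """Total length of a route visiting the stops in order."""
    return sum(d(a, b) for a, b in zip(route, route[1:]))

stops = ["B", "C", "D"]

# Heuristic: always go to the nearest unvisited stop (an educated guess).
route, remaining = ["A"], set(stops)
while remaining:
    nearest = min(remaining, key=lambda s: d(route[-1], s))
    route.append(nearest)
    remaining.remove(nearest)
print("Heuristic:", route, "length =", length(route))  # A-B-C-D, length 11

# Algorithm: check every ordering; guaranteed to find the shortest route.
best = min((["A"] + list(p) for p in permutations(stops)), key=length)
print("Algorithm:", best, "length =", length(best))    # A-C-B-D, length 9
```

The algorithm’s guarantee comes at a cost: checking every ordering quickly becomes impractical as the number of stops grows, which is one reason we fall back on heuristics so often.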

As with heuristics, the use of algorithmic processing interacts with behaviour and emotion. Understanding what strategy might provide the best solution requires knowledge and experience. As we will see in the next section, we are prone to a number of cognitive biases that persist despite knowledge and experience.

 

 

Key Takeaways

  • We use a variety of shortcuts in our information processing, such as the representativeness, availability, and anchoring and adjustment heuristics. These help us to make fast judgments but may lead to errors.
  • Algorithms are problem-solving strategies based on logical rules rather than guesses; applied correctly, they are far less likely than heuristics to result in errors or incorrect solutions.

References

Bennett, M. W. (2014). Confronting cognitive ‘anchoring effect’ and ‘blind spot’ biases in federal sentencing: A modest solution for reforming a fundamental flaw. Journal of Criminal Law and Criminology, 104(3), 489–534.

Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.

MacLeod, C., & Campbell, L. (1992). Memory accessibility and probability judgments: An experimental evaluation of the availability heuristic. Journal of Personality and Social Psychology, 63(6), 890–902.

Poses, R. M., & Anthony, M. (1991). Availability, wishful thinking, and physicians’ diagnostic judgments for patients with suspected bacteremia. Medical Decision Making, 11, 159–168.

Rutledge, R. W. (1993). The effects of group decisions and group-shifts on use of the anchoring and adjustment heuristic. Social Behavior and Personality, 21(3), 215–226.

Statistics Canada. (2018). Homicide in Canada, 2017. Retrieved from https://www150.statcan.gc.ca/n1/en/daily-quotidien/181121/dq181121a-eng.pdf

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5, 207–232.

License


Psychology - 1st Canadian Edition Copyright © 2020 by Sally Walters is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
