
Dr. Mark Crowder
Manchester Metropolitan University
20 January 2022

Are your decisions really your decisions?

Even simple decisions can be a complex process, and inevitably some mistakes are made, but our propensity to apply heuristics means that the mistakes are more frequent and more far-reaching, argued Dr. Mark Crowder in his presentation at the January event hosted by the ABP.  When researching his thesis he investigated 530 decisions and found that not one was based on a full consideration of the facts: all were influenced by what Kahneman describes as cognitive heuristics.  Furthermore, we are susceptible to having our decisions influenced by others, especially professional marketers, social media and influencers.

How is it that decision making, even by highly intelligent people, can be so poor?

The presentation considered the following:


  • What is a decision?
  • Models of decision-making
  • How can decision-making be manipulated?

Perhaps the best definition of a decision is Drucker's (2001): the choice between alternatives.  But we need a way of evaluating the choices, and the obvious one is based on the Rational Economic Model (Buchanan and Huczynski, 2004).  The problems start to arise when we go through the process and it rapidly becomes clear that there is "no single right way" to make a decision.  In choosing, for example, whether to marry your girlfriend, a strictly rational approach would be to trial the arrangement with several girlfriends, but the consequences of this approach would be less than desirable.  Furthermore, bad outcomes do not necessarily mean bad decisions: adverse business consequences can, and more often than not do, result from poor communication.   What we are not good at, however, is identifying the real nature of the problems hindering our ability to make rational decisions, and inevitably cognitive heuristics take over.   How we approach this provides clues as to how we can optimise our decision-making in the prevailing circumstances.

It is clear that there are two approaches: a normative (rational) one, where the key feature is a structured process in which we consider all the options, and a behavioural one, where we use only some of the information, recognising that there is no time to consider all the consequences.

The Normative Approach

The Vroom-Yetton, Cynefin and Brunswik lens models enable us to address questions such as "Can I afford this choice?" or "Is it legal?", and we can take a series of steps to improve decision-making.  The Cynefin framework is a good example of these approaches, with its five domains of circumstances to guide the logic.   But normative models run into difficulties because, for sound reasons, the benchmarks can be, and sometimes need to be, altered.  This is especially true in recruitment, where instead of assessing candidates against a pre-determined standard, the benchmark can be derived from one candidate, who in turn becomes the "standard".  Similar issues can arise in public-sector construction contracts.  The point is that predetermined organisational criteria based on fact may need to be modified in the light of other criteria; the original criteria are then jettisoned and the original procedures may not be followed.

The Behavioural Model

The alternative is the Behavioural Model, in which only limited use is made of the available information and data.  This is usually much closer to how decisions are actually made.  Three examples were given: "Bounded Rationality" (March and Simon, 1958), Naturalistic Decision Making (Zsambok and Klein, 1997) and the use of cognitive heuristics.  As an example of Zsambok's work, Mark quoted an air-evacuation scenario designed to limit deaths in aeroplane disasters: once a reward was introduced, the dynamics changed completely and the situation became a rush for the exit, with passengers blocking it.  Clearly the way to limit deaths is to make the most of the available time and evacuate in an orderly fashion, not to apply a superficially rational approach to decision-making, which in the real world can lead to catastrophe.

The importance of Cognitive Heuristics in how the Brain uses Recognition and Experience

Mark challenged us to think about a photo of the Queen: how do we know it is her in the photograph?  We think we know, but we don't.  This is a good illustration of Kahneman and Tversky's work on "recognition heuristics" and related areas, which demonstrates the shortcuts and simplifying tactics we use in processing information — in this case stereotypes and generalisations, often drawn from a range of sources, from which we make best guesses or estimates.  An excellent example is the decision to offer wine freely at Disneyland Paris: this was done on the basis of "stereotyping", because wine is part of the local culture, whereas Disneyland Florida has no local history of wine drinking.

Kahneman and Tversky (1982) proposed the concept of "anchoring and adjustment", most readily observed in recruitment, where one candidate can set the benchmark as representative of a revised desired standard — a deviation from the original intention.  Heuristics come into question in situations like these, but decision-making in everyday situations such as crossing the road can safely be based on experience.  No one will ever count the cars or accurately measure their individual speeds before deciding whether to cross the road.

The question of cognitive heuristics continues to generate fierce debate between two schools:

  • The Heuristics and Biases school (Kahneman, Tversky, Sunstein, Ariely and Slovic), whose view is that:
      • heuristics use only part of the information and may miss something crucial
      • heuristics are prone to influence (bias) and lead to error
      • heuristics should be minimised in favour of rational approaches
  • The Fast and Frugal school (Gigerenzer, Hutchinson, Hoffrage and Todd), which suggests that:
      • heuristics make fast decisions
      • heuristics make good use of limited information (frugality)
      • heuristics are just as accurate as rational approaches
      • heuristics are efficient and should not be discouraged


Manipulation of Decision Making

It is remarkable how people's perception of conclusions can be manipulated by how the facts are presented.  Mark gave several examples:

  • An urban legend was discussed in which the same information was presented in two different ways (i.e. framing).   In the first version, the information was deliberately presented to make a person look bad.  In the second version, the same information made the person look good (or at least, it was open to interpretation).   Sure enough, in our discussions the reaction from delegates was that the person was bad in the first case but good in the second.  In other words, perceptions were manipulated by the wording of the case.
  • A man worked for a local authority agency which was trying to attract jobs and reduce unemployment.  The first time round, he reported more or less unchanged unemployment figures.  The next time the figures were reported, the data was presented in a different way, focusing instead on employment levels and highlighting that the number of people in work had grown significantly.  The man got a £5,000 pay rise for his efforts.
  • Misinformation is not restricted to text: it can appear in images, as in the example of a map of Australia split into two, which can reveal a terrier's head and a cat's head.
  • The divorce rate in Maine tracks very closely the per-capita consumption of margarine.  Why is this important?  Is it relevant?  It is perhaps an interesting fact, but clever marketers can talk it up into something that appears relevant, accurate and pertinent, and that can influence purchasing decisions.

What these examples highlight is "correlation", not "causality".  The data may have a statistical relationship, but the point is that the brain is manipulated by the way the data patterns are presented and hence perceived.
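The Maine example can be sketched numerically: two unrelated series that merely trend together over time will produce a near-perfect Pearson correlation coefficient, even though neither causes the other.  The figures below are illustrative approximations, not official statistics.

```python
# Two unrelated series that both drift downward over the same years.
# Figures are illustrative approximations, not official statistics.
divorce_rate = [5.0, 4.7, 4.6, 4.4, 4.3, 4.1, 4.2, 4.2, 4.2, 4.1]   # per 1,000 people
margarine_lbs = [8.2, 7.0, 6.5, 5.3, 5.2, 4.0, 4.6, 4.5, 4.2, 3.7]  # lbs per capita

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A correlation very close to 1, despite there being no causal link.
print(pearson(divorce_rate, margarine_lbs))
```

The coefficient says nothing about mechanism: any two quantities that happen to decline over the same period will correlate strongly, which is exactly how a "relevant, accurate and pertinent" story can be spun from noise.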

The results are even starker when we are asked to remember how many red lights we encountered on the way home from work, or when several families of words are read aloud and the listener is asked to remember whether specific words featured in those groups: some were read out and some were not.  It is particularly hard to retain information delivered quickly, because the brain focuses on the group itself rather than on the individual words.  The point of this exercise is that the brain defaults to the elements highlighted in a brief, and then fails to notice even obvious unusual details when asked to recall simple units of information.  These blind spots are referred to as "perceptual blindness".

Marketers are aware of these blind spots and of our inability to move beyond cognitive heuristics in the decision-making process, and they actively use them to influence decisions.  Consumers have an innate preference for familiar brands such as Cadbury's chocolate or Ariel washing powder; they are also influenced by what they perceive as a bargain (for example, "3 for 2") even when what is presented as a bargain is in fact not one.

Supermarkets specialise in "upselling": presenting products that may satisfy some kind of want, but are not essential, on the route to staple foods such as bread and milk.  Perhaps the most striking example is a range of watches, all displayed showing the time as "ten to two", because this presents the image of a "smiling", friendly watch.

In conclusion, heuristics can manifest themselves in several ways — "cognitive", "availability" and "representativeness" — whether applied intuitively or by deliberate choice.  Our susceptibility to heuristics can make us vulnerable to the manipulation of text, data, images or anything else that can portray a situation or object as more desirable than it actually is.  The reason it is so easy to trick the brain is that, even though Kahneman's work dates back to the early 1980s, this area of science is still in the early stages of development, and there has been little progress in understanding this dark side of the brain.  It is a fertile area for further research, but in the meantime marketers continue to exploit our biases and heuristics, and we continue to ask the question: "Are our decisions really our decisions?"



