by Ben Newell, The Conversation
This is the fourth article in a series, How we make decisions, which explores our decision-making processes. How well do we consider all factors involved in a decision, and what helps and what holds us back?
On June 24, 2013, Nik Wallenda took a very risky decision. He stepped out onto a 426-metre-long wire suspended 450 metres above the rocky floor of the Grand Canyon. He had no safety harness and no safety net. Twenty-two minutes and 54 seconds later, after being buffeted by gusty winds all the way across, he reached the end of the wire and the security of the other side.
Was Nik’s decision to step onto the wire a good one? And how did he make it?
To answer the first question, analytic approaches to decision-making would prescribe a process of assessing the likelihood and value of success (the “happiness” that achieving the walk would bring Nik) against the likelihood and (ultimate) cost of failure.
This cost was not abstract: Nik is the seventh generation of tightrope-walking Wallendas. Many of his relatives have died while performing.
Answering the second “how” question at one level is easy. He put one foot in front of the other. At another level it becomes much harder: what kind of information leads us to choose in particular ways?
Experience vs description
Many of the most successful theories of decision-making were developed for situations in which outcomes and their probabilities are known and well-specified. The decision scientist's most basic tool is the simple gamble.
[Image: People prefer a sure thing to a gamble, but most decisions aren't as easy to quantify. Jeff Kubin/Flickr, CC BY]
Consider the choice between $30 for sure or a gamble that gives you an 80% chance of $40 and 20% chance of nothing – what would you choose? Many people like the “sure thing”.
This is despite the fact that if you calculate the “expected value” of the two options – by multiplying the amount ($40) by the probability with which it will occur (80%) – the gamble is “worth” more ($32).
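The arithmetic behind that "expected value" comparison can be sketched in a few lines (the figures are the ones from the example above):

```python
# Option 1: a certain $30
sure_thing = 30.0

# Option 2: 80% chance of $40, 20% chance of nothing
# Expected value = sum of (probability x payoff) over all outcomes
gamble_ev = 0.8 * 40.0 + 0.2 * 0.0

print(sure_thing)  # 30.0
print(gamble_ev)   # 32.0 -- the gamble is "worth" $2 more on average
```

On average, then, the gamble is the better bet, yet many people still take the sure thing; that gap between expected value and actual choice is what theories of risky choice try to explain.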
But such simple gambles seem rather unlike the choice Nik faced. As with many of the more mundane decisions we commonly face, he did not have a convenient look-up table of relevant probabilities and unambiguous outcomes. We have to learn by experience about such outcomes and the probabilities with which they occur.
Imagine now that you are faced with a choice between two unlabelled buttons on a computer screen (a bit like a stripped-back, mini-poker machine). You learn after several clicks that clicking on one button always reveals a payment of $30 and clicking on the other reveals a payment of $40 eight times out of ten clicks and $0 two times out of ten.
If you were able to keep clicking for as long as you wanted, which button do you think you’d end up favouring?
You’ve probably noticed that both the “clicking task” and the described gamble are offering a choice between exactly the same options. Yet it turns out that when making “experience-based” choices – in which people learn about outcomes and their probabilities – preferences often reverse.
So, over time, people favour the gamble instead of the sure thing, perhaps reasoning that the two-in-ten chance of receiving nothing is overshadowed by the fairly regular receipt of $40 rather than $30.
That is, people appear to underweight the rarer outcome (20% chance of nothing) when learning from experience, but they overweight the same outcome when the decision is described.
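One reason rare outcomes get underweighted in experience is simply sampling: in a short run of clicks, a two-in-ten outcome can easily fail to show up at all. A minimal simulation of the clicking task (a hypothetical sketch, not from the original studies) illustrates the point:

```python
import random

def observed_mean(n_clicks, p_win=0.8, payoff=40.0, rng=None):
    """Simulate n_clicks on the risky button and return the average payout seen."""
    rng = rng or random.Random()
    outcomes = [payoff if rng.random() < p_win else 0.0 for _ in range(n_clicks)]
    return sum(outcomes) / n_clicks

rng = random.Random(42)

# A handful of short runs: the experienced average jumps around, and a run
# with no $0 outcomes makes the risky button look like a guaranteed $40.
for _ in range(5):
    print(observed_mean(10, rng=rng))

# A very long run converges on the true expected value of $32.
print(observed_mean(100_000, rng=rng))
```

The short runs are the experimental analogue of "going with your gut" after limited experience: the sample you happen to see, not the true probabilities, drives the choice.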
How reliable is gut feeling?
Does this explain why Nik stepped out on the wire? Obviously not entirely, but perhaps it sheds some light: the “gamble” Nik faced could not be described – no-one had ever crossed the Grand Canyon on a tightrope before so the probability of a successful outcome could not be calculated.
Nik thus had to rely on his experience in similar situations. Perhaps his many previous successes in other tightrope walks led him to downplay (underweight?) the risk of the (severely) negative outcome.
Nik’s decision to cross the canyon appears – in hindsight – to count as a “good one”: he lived to tell the tale. In this instance, prior experience was a good source of information, but is that always the case?
Often we are urged to “go with our gut” or rely on “intuition” when faced with tough decisions. Such advice is likely to be good if by intuition we mean the experience we’ve built up over hours, days, weeks or even years of exposure to similar situations – or even just several trials of a “clicking task” experiment.
But if we haven’t had the opportunities to learn from our experience – either through lack of exposure (the first time you’ve been faced with this particular choice) or because the environment is just too unpredictable (the stock market, for example) – then our decision-making can go awry.
It is in these situations that heuristic processes often come to the fore and can lead to characteristic biases in our judgements and decisions.
Simple experiments with gambles reinforce the notion that the way in which we acquire information can result in large differences in what we choose to do, and in the way we interpret the world around us.
So next time you are making a decision and trying to work out if it is a good one, take a step back and think about how well qualified you are to make it. Do you have sufficient experience or might you be swayed by potentially irrelevant aspects, like the way someone has chosen to describe the options to you?
If you are not sure, then research suggests that simple strategies, such as considering why your first answer or choice might be wrong and then generating an alternative, can reduce the impact of unwanted biases.
Just like crossing the Grand Canyon on a tightrope, the path to good decision-making needs to be followed one careful step at a time.
Click on the links below for other articles in the series, How we make decisions:
How to help take control of your brain and make better decisions
Fair Call? What sport can show us about high-speed decisions
Editor’s note: Ben will be on hand for an Author Q&A session between 3 and 4pm ADST today (November 7). Post any questions about experience and decision making in the comments below.
Ben Newell receives funding from the Australian Research Council.
This article was originally published on The Conversation. Read the original article.