Sway: Value Attribution and Diagnosis Bias


Author's Note: All frameworks have moved to a new home at Strategy Umwelt. Please join me at this new platform for a revised list of mental models, strategy frameworks and principles.


Sway outlines how humans are prone to some deeply irrational behaviour that can cloud decision making. My previous post touched on Loss Aversion and the Power of Commitment. This post covers the undercurrents of Value Attribution and Diagnosis Bias.

Value Attribution

Put simply, Value Attribution describes our tendency to imbue people or objects with certain qualities based on perceived value, rather than on objective data.

A great example of this is the experiment run by the Washington Post featuring violinist Joshua Bell.

Will one of the nation's greatest violinists be noticed in a D.C. Metro stop during rush hour? Joshua Bell experimented for Gene Weingarten's story in The Washington Post: http://wpo.st/-vP (Video by John W. Poole).

From the book:

"Value attribution, after all, acts as a quick mental shortcut to determine what's worthy of our attention. When we encounter a new object, person, or situation, the value we assign to it shapes our further perception of it, whether it's our dismissal of a curiously inexpensive antique we find at a flea market or our admiration of a high-priced designer bag in a chic boutique. Imagine, for instance, stumbling upon a discarded armoire on the street. Do you use it for the rare treasure it might be? Or is your knee-jerk reaction that something must be wrong with it? In the same way, value attribution affects our perceptions of people. We may turn down a pitch or idea that is presented by the "wrong" person or blindly follow the advice of someone who is highly regarded".

A great example from the book takes place on the Coney Island Boardwalk in 1916, where a Polish immigrant named Nathan Handwerker decided to open a hot dog stand. In order to build up his customer base, he decided to undercut the competition by offering his dogs at half their price.

This had the opposite of the intended effect: he ended up with no customers at all. He then tried offering freebies like pickles or root beer, but this only exacerbated the problem.

Then he came up with a brilliant idea that skyrocketed his business: he paid doctors from a nearby hospital to eat at his stand in their smocks, visibly enjoying a hot dog. This immediately caused people to view the stand in a different light.

The Danger of Value Attribution

Where this undercurrent gets dangerous is in ongoing engagements. Once we attribute a certain value to something, it dramatically alters our perception of subsequent information. Not only that, it affects us even when the value is assigned completely arbitrarily.

So once we have attributed value to something, it is very difficult to view it in any other light. And this has the power to derail objective, professional judgement. 

Here's an example from the book that illustrates this. Researchers ran a test using a beverage called SoBe that was claimed to enhance intelligence.

A control group was given no SoBe. A second group was given SoBe and asked to sign a form agreeing to a charge of $2.89 (we dub them the "fancy group"). Lastly, a group was given SoBe, but the form they signed charged only $0.89 (we dub them the "cheap group").

Next, all three groups took a university test. The results were startling: the fancy group performed much better than the control group - but the cheap group did the worst of all!

The conclusion here is that the value the students attributed to the SoBe drink affected their test scores, not the beverage itself.

Diagnosis Bias

Our next undercurrent is Diagnosis Bias. This refers to our propensity to label people, ideas or objects based on our initial opinions of them - and our inability to reconsider these judgements once we have made them. We essentially put on 'blinders', no matter how much evidence contradicts our diagnosis.

Human beings tend not to be able to stay neutral for very long, which makes us even more susceptible to this bias.

From the book:

"Each day we are bombarded with so much information that if we had no way to filter it, we'd be unable to function. Psychologist Franz Epting, an expert on understanding how people construct meaning in their experiences, explained, "We use diagnostic labels to organise and simplify. But any classification that you come up with," cautioned Epting, "has got to work by ignoring a lot of other things - with the hope that the things you are ignoring don't make a difference. And that's where the rub is. Once you get a label in mind, you don't notice things that don't fit within the categories that do make a difference".

Again, due to the way our brain tries to find mental shortcuts, we stop seeing what is clearly in front of our faces in favour of the label itself.

There are three key 'diagnosis traps' we fall into when Diagnosis Bias kicks in, outlined below.

Diagnosis Trap 1: Dismissal of the Facts

Here's an example from the book.

An MIT class arrived for their lecture. A teacher came in to tell them that their professor was out of town, but that before they packed up their books, a substitute would be arriving to fill in. He then handed out a bio of the substitute.

What the students didn't know was that they were given two separate versions, with only one minor change. The last sentence for half the group stated "People who know him consider him to be a very warm person, industrious, critical, practical, and determined", while the other half's last sentence stated "People who know him consider him to be a rather cold person, industrious, critical, practical, and determined".

At the close of the session, they were handed a questionnaire to rate the substitute. The majority of the "very warm" group gave the teacher high praise, while the majority of the "rather cold" group did the exact opposite. A single word led the students to form a higher or lower opinion of him, irrespective of his actual performance.

Diagnosis Trap 2: Credence to Irrelevant Factors

A couple of examples in the book really highlight this trap.

Firstly, a South African consumer lending bank ran a split test on a home loan letter sent to potential customers. The test trialled different interest rates, comparisons with competitors, a giveaway (ten cell phones up for grabs), and the photo of a man's or a woman's smiling face. Rationality would suggest that the interest rate should win out (this is a home loan, after all), but the real winner was a picture of a smiling woman sent to male recipients - the one element that offered no rational benefit.

The second comes from basketball. By analysing huge data sets, researchers were able to determine that a player's perceived worth ultimately came down to one thing - his position in the draft. Even with equal or slightly better performance later in their careers, players picked earlier in the draft got more playing time, were less likely to be traded, and had longer careers. The initial bias would hold sway.

Diagnosis Trap 3: Labelling

The last trap is even more frightening. The process of diagnosis often involves labelling people, and when we brand or label people, they can take on the characteristics of the diagnosis (a self-perpetuating cycle). When the label is positive, this is known as the Pygmalion effect; when negative, the Golem effect. As a catch-all, it is known as the Chameleon effect.

Taking on characteristics assigned to us ends up reinforcing or reaffirming the label. We can't help applying this mirror, and we have a huge tendency to ignore objective data that contradicts our diagnosis.

A big example in the book is Bipolar Disorder in children. According to research cited in the book, Selective Serotonin Reuptake Inhibitors (SSRIs) like Prozac or Zoloft are no more effective than placebos in making patients feel better - they have the same therapeutic effect. Often it is the therapist's ability to connect with the patient that makes the difference - yet doctors continue to prescribe the drugs, despite the risk of serious side effects.

Worse, once labelled bipolar, the child starts to take on characteristics of the condition. They start to fit themselves into the mould created by the diagnosis, which can cause issues in treating the problem - the self-perpetuating cycle kicks in.

Our psychology and physiology are inextricably connected in extremely complex ways.

How To Avoid Them

When it comes to psychological undercurrents, the best way to counter them is often to avoid following our natural instincts. This can be extremely hard, so it's good to keep these strategies in mind.

Avoiding Value Attribution and Diagnosis Bias

The best strategy to employ here is to be mindful and observe things for what they are, not what they appear to be. Accept that your initial impressions could be wrong.

From the book:

"Whether we're shopping at a clearance outlet or a chic boutique, we sometimes need to fight our tendency to consciously dismiss an item because of its price. Instead, we should ask ourselves, "If I got this item as a gift, would I like it? If it cost $1 - or $1,000 - how would my perception of it shift?" Th more we become aware of the factors affecting the perceived value of a person or object, the less likely we are to be swayed by value attribution."

A good checklist is to practise what the book calls "propositional thinking". This includes:

  • Keep evaluations tentative
  • Become comfortable with complex, contradictory information
  • Consider problems from different angles
  • Introduce a self-imposed "waiting period" before making diagnostic judgements

Grab the book for some interesting case studies and research that delves deeper into this topic.


This post continues my series on Mental Models.