Problem Framing & Risk Preference
A few years back, the U.S. Preventive Services Task Force announced that it would no longer recommend routine mammograms for women between the ages of 40 and 49, a group that accounts for about one out of six breast cancers. “The recommendation is based on data that found that mammograms do reduce the risk of death in these women, but not enough deaths to recommend that all women 40 to 49 should be screened.” (Washington Post, Thursday November 19, 2009, p. A27)
That sounds risky! Breast cancer is the leading cause of cancer death in women ages 40 to 49, with more than 4,000 deaths expected in this age group the year this decision was published. The task force said that routine mammography would reduce deaths by about 15 percent. “However,” the panel concluded, “the harms associated with mammography outweigh its benefits in this age group.”
The panel members judged that the risk to women in this age group is acceptable. Many people disagree, even though they are looking at the same scientific data. They may simply have a different preference for risk – or they may be framing the problem differently.
Risk preference may be a personal thing (and one could talk at length about how to effectively graph an individual's risk preferences), but a decision-maker's risk preferences can change depending on the way the decision problem is posed – that is, on the “frame” in which the choice is presented.
Consider the following choice:
The United States is preparing for an outbreak of an unusual Asian strain of influenza. Experts expect 600 people to die from the disease. Two programs are available that could be used to combat the disease, but because of limited resources only one can be implemented.
Program A (Tried and True): 400 people will be saved
Program B (Experimental): There is an 80% chance that 600 people will be saved and a 20% chance that no one will be saved.
Which of the two programs would you prefer? Now consider the following two programs:
Program C (Tried and True): 200 people will die
Program D (Experimental): There is a 20% chance that 600 people will die, and an 80% chance that no one will die.
Would you prefer C or D?
You may have noticed that Programs A and C are exactly the same, as are Programs B and D. Many people prefer A on the one hand (sure thing), but D on the other (roll the dice). This is because people tend to be risk-averse when presented with a choice in terms of gains (lives saved), but the same people in the same situation tend to be risk-seeking when the choice is presented in terms of losses (lives lost).
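The equivalence is easy to verify by computing expected lives saved for each program (the probabilities and outcomes below are taken directly from the scenario above):

```python
# Expected number of lives saved (out of the 600 at risk) under each program.
# A program is a list of (probability, lives_saved) pairs.

def expected_saved(outcomes):
    """Expected lives saved over a list of (probability, lives_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

program_a = [(1.0, 400)]                # sure thing: 400 saved
program_b = [(0.8, 600), (0.2, 0)]      # gamble: 80% save all, 20% save none
program_c = [(1.0, 600 - 200)]          # "200 die" is the same as "400 saved"
program_d = [(0.2, 600 - 600), (0.8, 600 - 0)]  # "20% all die" = "80% all saved"

for name, prog in [("A", program_a), ("B", program_b),
                   ("C", program_c), ("D", program_d)]:
    print(name, expected_saved(prog))
# A and C both yield 400 expected lives saved; B and D both yield 480.
```

In purely expected-value terms A equals C and B equals D, so any reversal of preference between the two pairs is driven by the framing, not the numbers.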
The S-shaped curve below represents this asymmetric behavior.
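That curve can be sketched with the value function commonly used in prospect theory: concave for gains, convex and steeper for losses. The specific parameter values below (alpha = beta = 0.88, lambda = 2.25, the estimates Tversky and Kahneman published in 1992) are an illustrative assumption, not part of the text above:

```python
# A minimal sketch of the S-shaped prospect-theory value function.
# Parameters are the Tversky & Kahneman (1992) estimates (an assumption here).

ALPHA = 0.88    # curvature for gains (concave -> risk-averse over gains)
BETA = 0.88     # curvature for losses (convex -> risk-seeking over losses)
LAMBDA = 2.25   # loss aversion: losses loom larger than equal-sized gains

def value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0) from a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

# A gain of 100 and a loss of 100 are not mirror images: the loss
# feels more than twice as bad as the gain feels good.
print(value(100), value(-100))
```

The kink and extra steepness at the reference point (controlled by lambda) are what make the frame matter: describing the same outcome as "400 saved" or "200 dead" shifts the reference point, and with it which side of the curve the decision-maker is on.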
Daniel Kahneman and Amos Tversky discovered this “framing effect” and explained it with their Prospect Theory. Developed over a thirty-year period, the theory is highly important today in economics, and especially in financial economics. In 2002, Daniel Kahneman shared the Nobel Prize in Economics for this work; Amos Tversky had died in 1996, and because the prize is not awarded posthumously he did not get his share of the fame.
Would the Preventive Services Task Force panel have made a different recommendation on mammography if the facts about lives lost and lives saved had been framed differently? We don’t know. But each of us can be aware of how we frame our alternatives to decision-makers, and then check to make sure any decisions made are not biased by the framing effect.
Reference: Robert T. Clemen, Making Hard Decisions