Wednesday, March 23, 2011

Risk Communication Issues

Communication is intrinsic to risk management, yet it is all too easy to forget to communicate the results of our analysis adequately. This is especially true for complex issues such as terrorism, natural disasters or national security, where specialist knowledge is needed to understand the issues in any depth. There are, however, a couple of very simple things we can do to improve our risk communication.

It’s beyond the scope of this article to cover every element of risk communication, but it’s worth singling out at least one critical element: how we as risk professionals communicate the nature of risks to our leaders, to laypersons and to the general public.

“Badly” is, unfortunately, often the way in which we communicate risks.  Consider, if you will, that most people are more afraid of terrorism than of driving, yet United States statistics show that an average of 100 Americans are killed each year in terrorism-related events while 40,000 to 45,000 Americans are killed on the roads in the same period.  Somewhere between 50,000 and 100,000 Americans will die each year in hospital from documented and preventable medical errors [1], while roughly 400,000 die annually from tobacco-related illnesses.  Despite this, both the level of fear and the expenditure of funds to redress these risks are, broadly speaking, inversely proportional to the actual consequences.  Given that these statistics are relatively consistent across most developed nations, effective risk communication is clearly not one of humankind’s strong points.

We are not going to solve global risk management issues with a wave of a magic wand, but there are some things we can do.   We have any number of channels available to us, including one-on-one conversations, meetings, emails, newsletters and mass media.   The issue, however, is not how to communicate but rather what to communicate.  The key challenge lies in the way our brains are programmed to consider risks.  Our brains are finely tuned instruments for assessing immediate fight-or-flight risks, but our ability to consider more complex risks is a relatively recent invention of the mammalian neo-cortex.

Large numbers and abstract ideas are accordingly not what we do best.  Saying that next year roughly 40,000 out of 300 million people will die on the roads, 19,000 will be murdered and an average of 100 will die in terrorist attacks simply doesn’t register with us in any meaningful way.  The numbers are too large and too abstract for us to really comprehend. A better way to present complex risk information is to break it down into natural frequencies.
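
To make this concrete, here is a minimal Python sketch that restates the approximate annual figures quoted above as natural frequencies of the form “roughly 1 person in N”. The numbers are the article’s rough figures, not precise statistics.

```python
# A minimal sketch: restating abstract annual totals as natural
# frequencies ("roughly 1 person in N"), which are easier to grasp
# than large raw numbers. Figures are the approximate ones quoted
# in the text above.

US_POPULATION = 300_000_000

annual_deaths = {
    "road accidents": 40_000,
    "homicide": 19_000,
    "terrorism": 100,
}

for cause, deaths in annual_deaths.items():
    one_in_n = round(US_POPULATION / deaths)
    print(f"{cause}: roughly 1 person in {one_in_n:,} per year")
```

Stated this way, the contrast between roughly 1 in 7,500 (roads) and roughly 1 in 3,000,000 (terrorism) is far easier to grasp than the raw totals.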

To illustrate this concept, let’s examine a potentially fatal risk for which we have some existing data and research available.  Imagine that you are responsible for publishing public health risk information for counselors and doctors and have to produce a leaflet for patients who are about to take an HIV test. By way of background, I should add that false positives are not uncommon. When an HIV test produces a positive result, the blood sample is therefore normally retested once or twice in the lab to verify the result.  Despite this additional testing, a small number of cases (roughly 0.01%) can still yield false positives (or false negatives) for a variety of reasons, including medical conditions, accidental swapping of blood samples and data-entry error.  Most HIV information does not mention this seemingly minor false positive rate, and a study of 21 HIV/AIDS information leaflets in America found that precisely none of them mentioned even the possibility of a false positive. [2]
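
As an illustration of what such a leaflet might actually tell a low-risk patient, here is a hedged sketch in Python expressed as natural frequencies. The roughly 0.01% false positive rate is the figure quoted above; the 1-in-10,000 prevalence and 99.9% sensitivity are illustrative assumptions only, not figures from the studies cited here.

```python
# A hedged sketch of what a leaflet could tell a low-risk patient.
# ASSUMPTIONS: prevalence of 1 in 10,000 and sensitivity of 99.9%
# are illustrative only; the ~0.01% false positive rate is the
# figure quoted in the text above.

def positive_results(population, prevalence, sensitivity, false_positive_rate):
    infected = population * prevalence
    healthy = population - infected
    true_positives = infected * sensitivity
    false_positives = healthy * false_positive_rate
    return true_positives, false_positives

population = 10_000            # imagine 10,000 low-risk people being tested
prevalence = 0.0001            # assumed: 1 in 10,000 actually infected
sensitivity = 0.999            # assumed: test detects 99.9% of infections
false_positive_rate = 0.0001   # roughly 0.01%, as quoted above

tp, fp = positive_results(population, prevalence, sensitivity, false_positive_rate)
print(f"Out of {population:,} low-risk people tested:")
print(f"  roughly {tp:.0f} true positive and {fp:.0f} false positive results,")
print(f"  so a positive result is wrong about {fp / (tp + fp):.0%} of the time")
```

With these illustrative numbers, roughly half of the positive results in a low-risk group would be false positives, which is exactly the kind of information the leaflets failed to convey.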

America is not alone in this oversight, and another example of poor risk communication was documented in a 1998 German study of pre-test counseling for HIV tests. [3] Twenty counselors were assessed and, although they were very knowledgeable about most aspects of the topic, they exhibited significant gaps in the interpretation of test results.  Of the 20 counselors in the study who gave pre-test counseling to a client with no known risk behavior (e.g. male-to-male sex or IV drug use), 5 incorrectly claimed that false negatives never occur and 16 incorrectly claimed that false positives never occur.  The reasons for this inaccurate information included poor risk communication in their training, the illusion of certainty in testing and a failure to understand that the proportion of false positives is highest in low-risk patients.
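
That last point deserves a small worked example. The sketch below again uses the roughly 0.01% false positive rate quoted earlier, with prevalence and sensitivity values that are assumptions chosen purely for illustration; it shows why the share of positive results that are false is far higher in a low-risk group than in a higher-risk one.

```python
# A minimal sketch of why the proportion of false positives is highest
# in low-risk patients. ASSUMPTIONS: the prevalence values and the
# 99.9% sensitivity are illustrative only; the ~0.01% false positive
# rate is the figure quoted in the text above.

def false_positive_share(prevalence, sensitivity=0.999, false_positive_rate=0.0001):
    """Share of all positive results that are actually false positives."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return false_pos / (true_pos + false_pos)

groups = [
    ("low-risk group (assumed 1 in 10,000 infected)", 0.0001),
    ("higher-risk group (assumed 1 in 100 infected)", 0.01),
]

for label, prevalence in groups:
    print(f"{label}: about {false_positive_share(prevalence):.0%} of positives are false")
```

Under these assumptions, about half of the positives in the low-risk group are false, compared with only around one percent in the higher-risk group, which is why the counselors’ blanket claim that false positives never occur is most misleading precisely for low-risk clients.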

==========================

[1] Kohn, L. T., Corrigan, J. M., and Donaldson, M. S. (2000), To Err Is Human: Building a Safer Health System, National Academy Press, Washington, DC, USA
[2] Reported in Gigerenzer, G. (2002), Calculated Risks, Simon & Schuster, New York, USA
[3] Gigerenzer, G., Hoffrage, U., and Ebert, A. (1998), AIDS counseling for low-risk patients, AIDS Care, 10, 197–211
