Risk Perception and the Psychological Dimensions of Disaster

In this lecture, we'll review an important part of the human side of the hazard equation. For a hazard to exist and for it to have the potential of creating a disaster, there has to be an intersection between some kind of extreme event (or possibility for one) and people and their property in its path.

So, how do people arrange themselves and their "stuff" in space so as to be at risk, particularly when there is information available on the potential for exposure from science or historical memory?

This takes us into the murky reaches of human psychology: Do we perceive a hazard? Do we cognize its relevance for us as individuals? Do we know about things we could do to minimize our exposure? What motivates us to act (or not act) on that knowledge? What do people actually do during an emergency?

Literature on these subjects comes from a wide variety of disciplines, ranging from environmental perception and behavioral geography through cultural anthropology and sociology into public health and psychology proper. A lot of it has developed in the context of technological hazards and lifestyle risks and then cross-fertilized with the natural hazards work.

A major finding is that lay perceptions differ markedly from expert perceptions, with laypeople sometimes wildly overreacting to minor risks and at other times blithely ignoring serious ones. A seminal work here was Kasperson, Roger E., et al. 1988. The social amplification of risk: A conceptual framework. Risk Analysis 8, 2: 177-187.

Much effort has gone toward explaining these discrepancies between expert and lay perceptions.

Exasperated risk assessors sometimes state that the public is patently ignorant (the ignorance hypothesis) and that its involvement in decisions concerning risks is, therefore, essentially valueless, if not counterproductive. Exasperation is understandable, but ours is supposed to be a democratic society, and you cannot leave the public out of the governance of risk. A book presenting this argument and trying to explain its sources is Mooney, Chris, and Kirshenbaum, Sheril. 2009. Unscientific America: How Scientific Illiteracy Threatens Our Future. New York: Basic Books.

Another line of inquiry has focused on the heuristics people use to process information in general, as they apply to hazards. Findings suggest that the disparities between lay and expert perceptions are not completely random and unpredictable.

A potentially positive finding is that people's perceptions can become more accurate by dint of personal experience with a hazard that results in disaster.

Another finding relates to misconceptions of exposure. There is some evidence that the mediating processes between toxic releases and health effects are poorly understood in lay perception.

Related to this is a finding that people often make up their minds about an issue before being exposed to an adequate array of facts and arguments about it. Such premature decision-making often involves taking the position of a reference group they trust. Once the decision is made, people then often become very confident in their opinions. We see this all the time in virtually any political issue, but it applies equally to risk perception.

In defense of the public, there has been some intriguing research suggesting that the public may not be in fact so irrational and ignorant; rather, laypeople may simply be judging hazards along multiple axes, not just the quantifiable probability of mortality or morbidity, on which experts focus. A pioneering scholar in this context, who usually concentrates on technological or health hazards more than natural hazards, is Paul Slovic and his team at Decision Research in Oregon. An early piece laying out the argument is Fischhoff, Baruch; Slovic, Paul; and Lichtenstein, Sarah. 1982. Lay foibles and expert fables in judgments about risk. The American Statistician 36, 3, Part 2: 240-255. The current state of the literature on risk perception and its often puzzling connections with actual behavior is summarized in Wachinger, Gisela; Renn, Ortwin; Begg, Chloe; and Kuhlicke, Christian. 2013. The risk perception paradox -- implications for governance and communication of natural hazards. Risk Analysis 33, 6: 1049-1065. doi: 10.1111/j.1539-6924.2012.01942.x.

If the claim by some researchers (e.g., Johnson, Slovic, Shrader-Frechette, Margolis, Jasanoff) that the public, activist or uninvolved, may not be irrational is sound, one implication is that expert opinion may be flawed by too narrow a focus and, in its own way, as distorted as public (including activist) opinion. The dilemmas of scientific expertise in the minefields of political interests ... and the democratic oversight of risk and hazard ... are laid out in Jasanoff, Sheila. 2003. Accountability: (No?) accounting for expertise. Science and Public Policy 30, 3: 157-162.

Risk assessment scientists are not the only experts to suffer from faulty perceptions of public perceptions and behaviors in a disaster! Professional emergency management experts have often accepted the infamous disaster myths and expended resources preparing for them (a recent evaluation of the pervasiveness of disaster myths among safety professionals and the general public is provided in Nogami, Tatsuya. 2018. Disaster myths among disaster response professionals and the source of such misconceptions. Journal of Contingencies and Crisis Management 26, 4: 491-498. doi: 10.1111/1468-5973.12218).

One of the major, enduring impacts of a disaster, both for the public and for first-responders, is psychological. The intense, sudden, mortal fear of a rapid onset disaster, the grinding, ratcheting-up fear of a creeping disaster (think about your own dawning recognition that maybe COVID-19 was going to be a major disaster in your own personal life), the sight of grievously injured or dead victims, the loss of loved ones, survivor's guilt, the urgings of conscience to keep going and helping out far beyond physical endurance, the loss of things (homes, treasured possessions) that can never be exactly replaced -- all these induce a complex of psychological pain and perceptual and behavioral dysfunction that can endure for years and be retriggered by random weird things that remind you of the event.

Environmental perception, then, is something of a minefield for the hazards community. The public may well see things entirely differently from risk assessment scientists and crisis management practitioners, and that will affect their behavior before, during, and after a disaster. Many of the perceptual biases in disaster are rooted in media. The rôle of media in constructing hazard perception and representing disaster will be explored in a later lecture.

There may be long-term distortions of normal perception and behavior after a disaster, both in the public and in the professional emergency service personnel.

It is easy to be nonplussed by public perceptions and the perplexing behaviors they can set off (and important to recognize that we, too, are subject to these issues), but everyone in the hazards community needs to learn more about them and then find effective approaches to public education and risk communication that work with (and around) these oddities of perception. Successful crisis management may depend on it, not to mention effective disaster mitigation and preparation.

Another lecture will take up lessons learned in public education and risk communication.


Document maintained by Dr. Rodrigue
Last revision: 09/01/23
