Uh Oh Nano, Part 2.
(The following essay was written two years ago. It is being posted now in connection with research released this week in Nature Nanotechnology showing that people's cultural self-identification also contributes to their perceptions of risk, in this case the risks that may arise from nanotechnology. For a summary of that research and a citation to the original journal article, see http://www.sciencedaily.com/releases/2008/12/081207133749.htm. For an earlier post on nanotechnology, see the second post at http://onrisk.blogspot.com/search?updated-min=2007-01-01T00%3A00%3A00-05%3A00&updated-max=2008-01-01T00%3A00%3A00-05%3A00&max-results= )
Nanotechnology, the capacity to manipulate materials at atomic and molecular scales, holds promise bounded only by the human imagination. But if this promise is to be fully realized in ways that respect potential harms to human and environmental health and safety, serious and ongoing consideration needs to be given to the way the public will react to nanotechnology and all its specific applications. For despite its benefits, if public apprehension builds, the potential of nanotechnology could be severely limited. The findings of the field of research known as risk perception offer valuable insights into what that public reaction might be. Understanding those potential reactions will allow proponents of nanotechnology to respect public concerns, address them through more effective risk communication, and advance the prospects of the entire field.
The Greek Stoic philosopher Epictetus observed, “People are disturbed, not by things, but by the view they take of them.” Indeed, researchers including Paul Slovic, Baruch Fischhoff, and Sarah Lichtenstein have found that risks seem to have shared characteristics which, quite apart from the scientific facts and statistical probabilities, play a key role in making us more or less afraid. These affective, emotional characteristics are a fundamental part of how we frame our worries. They essentially form the subconscious backdrop against which we “decide” what to be afraid of and how afraid to be.
Perhaps the most important of these characteristics is the matter of trust. As leaders in the field of risk perception and risk communication have found, the more we trust – the less we fear. And the less we trust – the more afraid we are likely to be. Trust will almost certainly play a key role in the acceptance of, or resistance to, nanotechnologies.
Trust is determined by many factors. It is determined in part by who does the communicating. The facts being presented could be the same, but the trustworthiness of the communicator will help determine how worried the audience will feel. For example, people who learn about nanotechnology from the chemical industry, which ranks low in trust in many polls, are more likely to worry than people who learn about it from health care professionals such as doctors and nurses, who belong to more trusted professions.
Trust is also established through honesty. BSE affords a good example. In Japan, after the first indigenous infected cow was found, the government promised there would be no more. A second cow was found just days later. The government then said it had banned the feeding of ruminant protein back to healthy cows, which is how the disease spreads, only to have the press learn and report that the “ban” was merely voluntary. The press also reported that the government had kept secret an EU report rating Japan at high risk for BSE. These less-than-honest statements badly damaged trust and fueled much greater fear in Japan than in Germany, where, within about a month of the discovery of the first indigenous infected cow, two cabinet ministers were sacked and changes were proposed to make agricultural practices more natural and less mechanical. Beef sales rebounded quickly in Germany, unlike in Japan, partly because of the different degrees of honesty shown by the two governments.
Trust also grows from an organization’s actions. Again, BSE affords an example. In Canada and the U.S., after the first infected cows were found, the governments were able to point out that they had long ago instituted a feed ban and other restrictions to keep the risk low. However much citizens might distrust government in general, in this matter the responsible agencies had demonstrated their competence in keeping the risk low. That demonstration of competence probably played a role in the relatively minimal impact on consumer beef sales in each country.
Trust is also established when an organization respects the reality of the public’s fears, even when there is no scientific basis for those fears. In the U.S., authorities withdrew from the market muscle meat that had come from the slaughterhouse that processed the BSE-infected cow, despite the scientific consensus that muscle meat is not a vector for BSE. The U.S. Department of Agriculture said it was acting “out of an abundance of caution…”. In other words, it acknowledged and responded to the reality of the public’s fear, doing something that did not reduce the physical risk but did reduce public apprehension. The implicit message in such actions is that the agency was not being defensive, but was being responsive to public concern. That kind of action encourages people to trust an agency far more than the message “There are no scientific reasons for your fears. The facts as we see them say there is no risk. So we are not going to act.” This potentially damaging message has already been heard from a small number of scientists in the area of nanotechnology.
Trust is also established by sharing control. In the case of nanotechnologies, this could include shared control over the writing of public health and environmental risk regulations, or shared control over the development of societal and ethical guidelines. The more that people feel they have some control over their own health and future, the less afraid they will be. The same risk will evoke more worry if people feel they have less control.
Trust is also built by openness. In the development of nanotechnologies, this should include dialogue with various stakeholders, a fully open exchange of scientific data, open government regulatory development, and open discussion of societal and ethical issues, among other areas. The more that people feel they are being deceived, lied to, or manipulated, the more afraid of a risk they are likely to be. Openness reassures them that they can know what they need to know to keep themselves safe. An open process is inherently trust-building.
Trust will be difficult to establish as nanotechnologies develop, because the driving forces behind such development will be principally commercial, industrial, corporate, and governmental, and the politically and profit-driven sectors of society are, almost by default, perceived to be out for their own good more than for the common good. So special attention and effort must be devoted to establishing trust in everything a government, a business, or a scientist does while working on commercial nanotechnological research, development, or application.
But there are other risk perception characteristics that could bear on public acceptance of, or resistance to, nanotechnologies.
People tend to be more afraid of risks that are human-made than of risks that are natural. Nano is almost certainly going to be perceived as a human-made technology.
We tend to worry more about risks like nanotechnology that are hard to comprehend because they are scientifically complex, invisible, and not yet completely studied and understood. This could well prompt calls for stringent application of the Precautionary Principle, and proponents of nanotechnology would be well advised to give serious consideration to such calls until a reasonable amount of safety data has been developed.
We tend to worry more about risks that are imposed on us than about risks we knowingly choose to take. In some cases nanotechnologies will provide finished materials whose presence could be noted on a product’s label, alerting consumers and giving them a choice. But in many cases, nano substances will serve as intermediates, raw materials, or catalysts, substances that cannot be labeled and that could therefore evoke concern because people will be exposed to them without any choice.
We worry more about risks that are new than about the same risks after we’ve lived with them for a while. While carbon black and some other nano materials have been around for a while, many nano materials and products are new, with behavioral characteristics different from anything we’ve ever known. And of course the ability to precisely manipulate things on a nano scale is new. This too could feed greater public apprehension about the technology.
And we tend to worry more about risks from which we personally get less benefit, and vice versa. For some nanotechnologies, for some people, the personal benefits may well outweigh the risks. But when they don’t, or when the benefits principally accrue to someone else, fear and resistance could rise.
It is important to respect the reality and the fundamental roots of these perception factors. They cannot be manipulated away or circumvented with a clever press release, a website, or a few open public meetings and dialogues. Research on the brain has found that it is constructed in such a way that external information is sent to the subcortical structures that generate a fear response before that information reaches the parts of the brain that reason and think “rationally.” In short, we fear first and think second. No press release can undo that biology. Because of some of the characteristics of nanotechnology listed above (human-made, hard to understand, imposed, new, subject to distrust of industry), it is quite likely that if the first thing people hear about it carries some hint of threat or negativity, the initial reaction of many will be worry and concern.
Moreover, the brain is constructed such that circuits stimulating a ‘fear’ response are more numerous than those bringing rationality and reason into the cognitive process of risk perception. In short, not only do we fear first and think second, we fear more, and think less.
Again, this suggests that the first impressions many people have of nanotechnology are likely to be dominated by caution and concern.
Fortunately, both the biology and the psychology of risk perception have been fairly well characterized. Insights from those fields can guide the design of research to find out how people are likely to react to nanotechnology as it becomes more common, is introduced into their lives, and gets more and more attention in the press, a trend already beginning in many places. Understanding how people are likely to react is the first step toward designing risk management strategies, including risk communication, that address public concerns.
It is imperative that such research be done soon, so it can be used to develop risk management and risk communication strategies that will maximize public understanding of nanotechnology and public participation in the process of its development and implementation. With these steps, the potential of this remarkable field can be more fully realized while respecting public concerns and ensuring public and environmental health and safety.