Article

Rhetoric and Risk

Authors: Roy Schwartzman, Derek G. Ross, David M. Berube

Keywords: Association for Rhetoric of Science and Technology

How to Cite: Schwartzman, R., Ross, D. G., & Berube, D. M. (2011). “Rhetoric and Risk,” Poroi, 7(1). doi: https://doi.org/10.13008/2151-2957.1087

Challenges in Rhetoric of Science and Technology: ARST Report

Rhetoric and Risk

Roy Schwartzman

Dept. of Communication Studies, University of North Carolina at Greensboro, Greensboro, NC USA

Derek G. Ross

Dept. of English, Auburn University, Auburn, AL USA

David M. Berube

Dept. of Communication, North Carolina State University, Raleigh, NC USA

Poroi, 7, 1 (January 2011)

http://dx.doi.org/10.13008/2151-2957.1087

Demarcating the Domain of Rhetoric

The discoveries of science and technology are accelerating. The choice of how to regulate and react to scientific and technological innovations relies heavily on the notion of risk. The emergent nature of contemporary science and technology (i.e., complex systems that are not reducible to the simple physical and chemical processes from which they arose) confounds risk studies (Goodenough & Deacon, 2006). Indeed, whether to embark on a particular path of scientific inquiry or proceed with a technological development depends on the ability to calculate the amount of risk associated with the endeavor. We are, however, ill-equipped to meet the demands of risk analysis with certainty.

The greater the negative risk, the greater our reluctance to proceed. The very term “calculate,” however, invests risk with a far greater degree of objectivity and precision than is actually present in the conduct of science or policymaking. Sandman’s (1993) famous definition of risk as “the sum of hazard plus outrage” places emotion (albeit only one species of it) squarely alongside the calculus of threat probability--much as Aristotle, who brought emotion into the charmed circle of internally artful means of persuasion, declared rhetoric the counterpart (antistrophe) of dialectic. Research on the significance of affective forces in determining how people perceive risk (Slovic, 2000, 2010) creates ample opportunities for rhetorical studies to complement social scientific research on cognition of risk.

What Can Rhetorical Approaches to Risk Offer?

The field of communication is no stranger to the realm of risk, but most attention has focused on risk management, often approached as crisis communication (e.g., Venette, 2006). This area of study concentrates on how to package and present phenomena to audiences in ways that accomplish the rhetor’s intent, which is usually to steer audience perceptions in a particular direction or to protect the interests of stakeholders. Risk management thus operates within a compliance-gaining paradigm by focusing on how sources craft messages (Sellnow, Ulmer, Seeger, & Littlefield, 2009), prioritizing how risk is presented rather than how risk is constructed by various stakeholders. 

One important contribution that rhetorical analysis can make is to disaggregate the notion of “the public” as a unitary audience, usually contrasted with no less unitary “experts,” and to examine instead rhetorical influences in how various publics recognize and react to risks (Berube, 2007). For example, more research needs to be conducted on how particular publics—bounded groups defined by demographic variables—interface with the realms of justification they invoke when making or assessing arguments. How do the modes of justification (what counts as proof, what qualifies as convincing) correlate with demographically delineated publics? Do certain classifications of people (socioeconomic status, educational level, etc.) characteristically resort to certain types of rhetorical devices, interpretive methods, evaluative criteria, or argumentative tactics? What implications might such findings have for bridging the communicative chasms that separate various stakeholders in scientific and technological ventures?

Rhetorical issues leap to the foreground most blatantly in science and technology when technical issues enter a public forum beyond the scope of scientific discourse. Since consideration of risk affects all stakeholders involved in an issue, discussions of risk automatically extend communication beyond the discursive domains dominated by scientific researchers, engineers, and other technicians. Research in the public understanding of science addresses the confluences and confusions between these discursive domains, guided in part by discrimination of discursive spheres and detailed consideration of how they bump up against each other (Goodnight, 1982). Rhetorical analysis can contribute to knowledge about several factors that problematize the communication of risk.

A long history of research indicates that scientific experts define and evaluate levels of risk differently from non-experts (e.g., Fischhoff, Watson, & Hope, 1984). These disjunctions can cause communication breakdowns between different stakeholders (e.g., researchers, product developers, policymakers, consumers) who operate under different conceptions of risk (Slovic, 1987). A more “rhetoricized” approach to risk might take up some of the following issues.

The Inaccessibility of Science

The technical nature of much scientific discourse may render it incomprehensible to audiences of non-scientists. Scientific discourse may remain inaccessible to non-specialists in at least three ways. First, the prevalence of scientific jargon may restrict public access to the terminology needed for meaningful discussion with researchers. Second, much research deals with unobservable phenomena. Lacking the technological tools to observe relevant phenomena directly (for example, the configurations of nanoparticles visible only through enhanced images from electron microscopy), non-specialists may have to defer to the renditions produced by researchers, illustrators, and computer graphics artists (Landau et al., 2009; Lösch, 2006; Ruivenkamp & Rip, 2010). Those who control the imagery may unduly influence how phenomena are perceived. Third, many phenomena operate counterintuitively, as when nanoparticles behave quite differently in their quantum-mechanically governed world than macro-scale particles of the identical substance. If basic laws of the macro-world (e.g., those governing toxicity, conductivity, and gravitational attraction) no longer operate in customary ways, as indeed they do not in the nano-world, then fundamental assumptions such as the properties of chemical substances become contestable. Furthermore, counterintuitive facts and processes challenge the notion that common sense can guide argumentation, since basic notions such as cause and effect become questionable or conditional. Such counterintuitive points further distance scientific discourse from non-scientific discursive realms. The issue appears in its starkest form in debates over the evidence for evolution.

Since this impenetrability may fuel confusion or suspicion that researchers are trying to hide risks, translating technical vocabulary into more accessible terms can improve trust between different groups of stakeholders (Kasperson & Stallen, 1991). Rhetorical scholarship can address the complexities and nuances of these translations. What is lost, gained, or altered in the process of translating concepts from one audience to another?  Notably, rhetoricians can approach translation as multi-directional—not a “dumbing down” of scientific discourse, but a search for common argumentative, affective, and conceptual ground that will improve how various stakeholder communities understand each other.

The Fact of Uncertainty and the Difficulty of Gaining Trust  

A stakeholder public often has difficulty establishing trust in relation to science-based argumentation (Hipkins et al., 2002). While rhetoric can enable public understanding (Gross, 1994), the resulting understanding is always mediated by social context (Wynne, 1992, 1993; Lach & Sanford, 2010). It is also influenced by uncertainty, which is often the product, or even the construction, of the media (Mellor, 2010; Zehr, 2000). Additionally, at the individual level, the media, as articulated by Kurt Neuwirth (2009), “are thought to influence intrapersonal (one’s own salience judgments) agendas, perceived community (what others in the community consider important) agendas, and interpersonal (actual discussion of issues) agendas” (p. 400). As a result, a “simple” act of demonstration-based proof (apodeixis)—proving climate change is occurring, proving humans create pollution, proving a definite degree of risk exists (or does not)—becomes an enormous and convoluted task, immensely contingent upon a rhetor’s awareness of audience, situation, and sociopolitical context (deixis and pisteis), plus his or her skill in negotiating the shoals. In addition, consensus among scientists is rare. Expert disagreement may signify to non-experts that risks remain unknown. If this disagreement is attributed to scientific research per se, then risk levels may be judged as unknowable.

Finally, the determination to resolve uncertainty with more certainty may consign risk assessment and management to the dumpster. Scientific research seldom increases certainty. Risk analysis demands decision making in situations of high uncertainty, while the models we use are certainty-based (e.g., calculable probabilities, measurable exposure levels). Given the rate of change in science and technology, it is safe to presume that certainty will remain elusive and that risk calculi premised on discrete values may need to be replaced. Given the deterministic ends associated with risk analysis, this is one of the biggest challenges for risk science. How we package data and recommendations will be profoundly affected by the rhetorics of uncertainty.

Speculation and Sensationalism

Technical obfuscation represents only one potential source of miscommunication. Sweeping, dramatic, often minimally warranted claims may dismiss or exaggerate risks and divert attention from more nuanced risk assessment. For example, caricatures of nanotechnology as the solution to problems from the energy crisis to cancer treatments—often promulgated by investors and media outlets seeking remarkable, immediate results—may steal the discursive stage from the qualified, tentative, long-range claims made by researchers (Berube, 2006). What rhetorical resources might generate the interest of stakeholders while maintaining fidelity to the cautious, often long-range claims emerging from scientific research?

Insufficient Data in a Crisis

Emerging technologies may not be understood well enough for the long-term effects of their use to be studied quantitatively. By the time a study is completed, damage done by the technologies already deployed may be irreparable. Relatedly, “relevant information about environmental and health hazards often develops more rapidly than [agencies] are able to respond” (Tai, 2005, p. 1). The Deepwater Horizon disaster in the Gulf of Mexico illustrates this concern. The rapidly progressing nature of the disaster left no time for quantitative assessment of response measures. In a crisis, decision makers must base their choices on what they know, what they can learn, and what their discourse communities can provide. They must do all of this while communicating the potential value of their decisions to publics that, ultimately, are called upon to trust experts and agencies to work in their best interests. Establishing such trust, however, is problematic.

Skepticism About Experts

When non-specialists do not have access to the specialized methods and technical vocabulary of scientists, they often are asked to trust the researchers and technicians who presumably would act as stewards for various stakeholder interests (Jenkins-Smith & Silva, 1998). This trust operates along several dimensions. Social trust encompasses the traditional realm of perceived source credibility adapted from Aristotle: expertise, character, and caring. Social trust applies to particular people or organizations. Lay publics, however, often view technical experts skeptically (Sjöberg, 2002), sometimes suspecting that these specialists are biased toward or in cahoots with special interests and product developers. Epistemic trust applies to science as an institution, and systemic distrust of science may lead to construing risks of scientific research and technological applications as disproportionately high (Sjöberg & Herber, 2008). Simply adding layers of expert testimony does not automatically quell fear about risks. Rhetorically speaking, over-reliance on experts may amplify suspicions about credibility, particularly among audiences who harbor low epistemic trust toward scientists or engineers. A more productive approach to risk communication would be to involve a broader range of testimonies and sources of information, thus avoiding pitting one contingent of stakeholders against another.

Further research should address how the ethos of science is constructed, presented, and challenged in ways that amplify or attenuate perceptions of risk. How and for which publics does science seem to embrace antagonistic goals and values? Under what conditions and for whom does science don the mantle of authority?

The Narrative Arcs of Risk Communication

The rhetorical trajectory of risk operates along synchronic and diachronic axes. Synchronically, the construction or perception of something as a risk emerges relative to concurrent risks. The degree of risk should be measured not simply as the product of

(statistical probability of occurrence) × (quantifiable magnitude of effect)

but at least partially in relationship to the various narratives of risks (Alcabes, 2009) that circulate via avenues such as mass media (Neuwirth, 2009). Even when probability and magnitude of risk are objectively measurable, this information—if known—does not sufficiently explain human behaviors that may correspond poorly to such calculations (Luhmann, 1993). Rhetorical investigation reveals the rationales underlying the social construction of risk (Luhmann, 1993), clarifying the means of persuasion whereby degrees of risk are understood.
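To make concrete why that conventional product falls short, consider a hypothetical comparison (the figures below are invented purely for illustration and are not drawn from the studies cited here):

(probability 0.0001) × (magnitude 10,000) = 1 and (probability 0.5) × (magnitude 2) = 1

The two hazards carry identical expected magnitudes, yet publics seldom treat a rare catastrophic event and a frequent minor one as equivalent risks; the narratives circulating around each carry much of the explanatory weight the bare product leaves out.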

Especially when the measurable degree of risk is unknown or poorly understood, as in the case of emergent technological innovations, risk assessment becomes a relative judgment reliant on heuristics that establish comparative threat levels. Such judgments invoke factors such as personal values including religiosity (Ho, Scheufele, & Corley, in press), how innovations comport with pre-existing beliefs (Gardner, 2008), past experiences, activation of emotions (Slovic et al., 2004, 2007), and perceived proximity to the risk-inducing phenomena. The understanding of risk arises as an intuitive construct holistically formed (Marshall et al., 2007) from the entire spectrum of variables that (a) establish what counts as risk; (b) assess the relative degree of risk; and (c) determine what counts as an acceptable level of risk. Rhetorical analysis can shed important light on how the heuristic mechanisms for determining risk reinvigorate the study of argumentative fallacies—not simply as faulty arguments but as sense-making mechanisms employed when the resources (e.g., access to data, criteria for evaluating source credibility) for more traditionally scientific modes of “valid” inductive and deductive argumentation are rejected, unavailable, or unsatisfactory.

Diachronically, rhetorical analysis could trace the course of how risks are constructed. One area ripe for further rhetorical harvest concerns the ways that risk scenarios discredited by scientific researchers persist and evolve into concerns that occupy other stakeholders.  For example, why do horror stories of killer nanobots still play a role in popular portrayals of nanotechnology long after being scientifically debunked and even disavowed by the author of the original scenario (Schwartzman & Carlone, 2008)? Another avenue of diachronic research might examine the cultural transmission of narratives about science as an institution, illuminating how the persona of the scientist evolves over time in ways that foster deference or dread toward scientific knowledge and thus affect epistemic trust (e.g., the scientist as savior vs. the mad scientist).

Audience Differences

A public’s difficulty in making decisions based on its understanding of scientific or technological information does not necessarily result from insufficient understanding or misconceptions of science itself. Risk-related information is often scientific in nature, and such discourse is both culturally and politically mediated (Graham & Lindeman, 2005). Research suggests that individuals create “landscapes” based on “intersubjective, taken-for-granted symbols,” complex constructions grounded in their social groups and self-perceptions (Greider & Garkovich, 1994, p. 9). This socially constructed perception of risk-related activities is not based in ignorance (see work on the deficit model of public understanding of science, such as Bauer, Allum, & Miller, 2007; Sturgis & Allum, 2004; Maranta et al., 2003; Gross, 1994). Instead, this amalgam of intuitive approaches to risk assessment plays an important role in a public’s understanding of complex information (Allum et al., 2008; Sturgis & Allum, 2004; Gross, 1994).

Rather than dismiss these heuristics as cognitive distortions to be purged with a healthy emetic of scientific literacy, a rhetorical approach takes seriously the ways these decision-making tools are activated through communication and how they are deployed when people interface with scientifically justified knowledge claims. The issue may be less that of bringing laypeople up to speed with science and improving the accuracy of their perceptions (Wrench, 2007) than of developing deeper appreciation for the ways non-scientists formulate perceptions of science and technology beyond the constraints of logical argumentation and scientific method. For example, rhetorical analysis can probe heuristic alternatives to logical implicature, investigating modes of narrative or mythological implicature that constitute compelling plot lines, but do not qualify as authorized discourse by scientific standards.

References

Alcabes, P. (2009). Dread: How fear and fantasy have fueled epidemics from the black death to avian flu. New York: PublicAffairs.

Allum, N., Sturgis, P., Tabourazi, D., & Brunton-Smith, I. (2008). Public understanding of science cultures: A meta-analysis. Public Understanding of Science, 17, 35-54. http://dx.doi.org/10.1177/0963662506070159

Bauer, M. W., Allum, N., & Miller, S. (2007). What can we learn from 25 years of PUS survey research? Liberating and expanding the agenda. Public Understanding of Science, 16(1), 79-95. http://dx.doi.org/10.1177/0963662506071287

Berube, D. M. (2007). Rhetoric of “stakeholding.” In F. Allhoff, P. Lin, J. Moor, & J. Weckert (Eds.), Nanoethics: The ethical and social implications of nanotechnology (pp. 225-240). Hoboken, NJ: John Wiley.

Berube, D. M. (2006). Nano-hype: The truth behind the nanotechnology buzz. Amherst, NY: Prometheus.

Fischhoff, B., Watson, S. R., & Hope, C. (1984). Defining risk. Policy Sciences, 17, 123-139. http://dx.doi.org/10.1007/BF00146924

Gardner, D. (2008). Risk: The science and politics of fear. Toronto: McClelland and Stewart.

Goodenough, U., & Deacon, T. W. (2006). The sacred emergence of nature. In P. Clayton & Z. Simpson (Eds.), The Oxford handbook of religion and science (pp. 853-871). Oxford and New York: Oxford University Press.

Goodnight, G. T. (1982). The personal, technical, and public spheres of argument: A speculative inquiry into the art of public deliberation. Journal of the American Forensic Association, 18, 214-227.

Graham, M. B., & Lindeman, N. (2005). The rhetoric and politics of science in the case of the Missouri River system. Journal of Business and Technical Communication, 19, 422-448. http://dx.doi.org/10.1177/1050651905278311

Gross, A. (1994). The roles of rhetoric in the public understanding of science. Public Understanding of Science, 3, 3-23. http://dx.doi.org/10.1088/0963-6625/3/1/001

Greider, T., & Garkovich, L. (1994). Landscapes: The social construction of nature and the environment. Rural Sociology, 59(1), 1-24. http://dx.doi.org/10.1111/j.1549-0831.1994.tb00519.x

Hipkins, R., Stockwell, W., Bolstad, R., & Baker, R. (2002). Commonsense, trust, and science: How patterns of beliefs and attitudes to science pose challenges for effective communication. Ministry of Research, Science, and Technology, New Zealand Council for Education Research in association with ACNielsen.

Ho, S. S., Scheufele, D. A., & Corley, E. A. (in press). Value predispositions, mass media, and attitudes toward nanotechnology: The interplay of public and experts. Science Communication. http://dx.doi.org/10.1177/1075547010380386

Jenkins-Smith, H. C. & Silva, C. L. (1998). The role of risk perception and technical information in scientific debates over nuclear waste storage. Reliability Engineering and System Safety, 59, 107-122. http://dx.doi.org/10.1016/S0951-8320(97)00131-2

Kasperson, R. E., & Stallen, P. J. M. (1991). Communicating risks to the public: International perspectives. Dordrecht: Kluwer Academic.

Lach, D., & Sanford, S. (2010). Public understanding of science and technology embedded in complex institutional settings. Public Understanding of Science, 19(2), 130-146. http://dx.doi.org/10.1177/0963662508096783

Landau, J., Groscurth, C. R., Wright, L., & Condit, C. M. (2009). Visualizing nanotechnology: The impact of visual images on lay American audience associations with nanotechnology. Public Understanding of Science, 18(3), 325-337. http://dx.doi.org/10.1177/0963662507080551

Lösch, A. (2006). Anticipating the futures of nanotechnology: Visionary images as means of communication. Technology Analysis and Strategic Management, 18(3/4), 393-409. http://dx.doi.org/10.1080/09537320600777168

Luhmann, N. (1993). Risk: A sociological theory (R. Barrett, Trans.). New York: Aldine de Gruyter. http://dx.doi.org/10.1515/9783110870343

Maranta, A., Guggenheim, M., Gisler, P., & Pohl, C. (2003). The reality of experts and the imagined lay person. Acta Sociologica, 46(2), 150-165. http://dx.doi.org/10.1177/0001699303046002005

Marshall, R. D., Bryant, R. A., Amsel, L., Suh, E. J., Cook, J. M., & Neria, Y. (2007). The psychology of ongoing threat: Relative risk appraisal, the September 11 attacks, and terrorism-related fears. American Psychologist, 62(4), 304-316. http://dx.doi.org/10.1037/0003-066X.62.4.304

Mellor, F. (2010). Negotiating uncertainty: Asteroids, risk, and the media. Public Understanding of Science, 19(1), 16-33. http://dx.doi.org/10.1177/0963662507087307

Neuwirth, K. (2009). Risk, crisis, and mediated communication. In R. L. Heath & H. D. O’Hair (Eds.), Handbook of risk and crisis communication (pp. 398-411). New York: Routledge.

Ruivenkamp, M., & Rip, A. (2010). Visualizing the invisible nanoscale study: Visualization practices in nanotechnology community of practice. Science Studies, 23(1), 3-36.

Sandman, P. M. (1993). Responding to community outrage: Strategies for effective risk communication. Fairfax, VA: American Industrial Hygiene Association.

Schwartzman, R., & Carlone, D. (2008). A rhetorical reconsideration of knowledge management: Discursive dynamics of nanotechnology risks. In A. Koohang, K. Harman, & J. Britz (Eds.), Knowledge management: Foundations and principles (pp. 1-39). Santa Rosa, CA: Informing Science Press.

Sellnow, T. L., Ulmer, R. R., Seeger, M. W., & Littlefield, R. (2009). Effective risk communication: A message-centered approach. New York: Springer. http://dx.doi.org/10.1007/978-0-387-79727-4

Sjöberg, L. (2002). The allegedly simple structure of experts’ risk perception: An urban legend in risk research. Science, Technology, & Human Values, 27, 443-459. http://dx.doi.org/10.1177/016224302236176

Sjöberg, L., & Herber, M. W. (2008). Too much trust in (social) trust? The importance of epistemic concerns and perceived antagonism. International Journal of Global Environmental Issues, 8(1/2), 30-44. http://dx.doi.org/10.1504/IJGENVI.2008.017258

Slovic, P. (2010). The feeling of risk: New perspectives on risk perception. London: Earthscan.

Slovic, P. (2000). The perception of risk. London: Earthscan.

Slovic, P. (1987, April 17). Perception of risk. Science, 236(4799), 280-285. http://dx.doi.org/10.1126/science.3563507

Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2007). The affect heuristic. European Journal of Operational Research, 177(3), 1333-1352. http://dx.doi.org/10.1016/j.ejor.2005.04.006

Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2004). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis, 24(2), 1-12. http://dx.doi.org/10.1111/j.0272-4332.2004.00433.x

Sturgis, P. J., & Allum, N. C. (2004). Science in society: Re-evaluating the deficit model of public attitudes. Public Understanding of Science, 13(1), 55-74. http://dx.doi.org/10.1177/0963662504042690

Tai, S. (2005). Three asymmetries of informed environmental decisionmaking. Temple Law Review, 78(659), 1-79.

Venette, S. J. (2006). Special section introduction: Best practices in risk and crisis communication. Journal of Applied Communication Research, 34(3), 229-231. http://dx.doi.org/10.1080/00909880600769464

Wrench, J. S. (2007). The influence of perceived risk knowledge on risk communication. Communication Research Reports, 24(1), 63-70. http://dx.doi.org/10.1080/08824090601128182

Wynne, B. (1993). Public uptake of science: A case for institutional reflexivity. Public Understanding of Science, 2, 321-337. http://dx.doi.org/10.1088/0963-6625/2/4/003

Wynne, B. (1992). Misunderstood misunderstanding: Social identities and public uptake of science. Public Understanding of Science, 1, 281-304. http://dx.doi.org/10.1088/0963-6625/1/3/004

Zehr, S. (2000). Public representations of scientific uncertainty about global climate change. Public Understanding of Science, 9, 85-103. http://dx.doi.org/10.1088/0963-6625/9/2/301