
 Communication: Inquiry, Energy, and Risk - ARST Reports

Controversy, Conflict, and Conflicting Expertise: Report from the 2011 ARST Pre-Conference at NCA

Aalok Mehta

Annenberg School for Communication and Journalism, University of Southern California

Los Angeles, CA USA

Zoltan P. Majdik

Dept. of Communication, North Dakota State University

Fargo, ND USA

Carrie Anne Platt

Dept. of Communication, North Dakota State University

Fargo, ND USA

Poroi 8, 1 (April 2012)

http://dx.doi.org/10.13008/2151-2957.1115

In this report, we summarize several projects exploring how the public develops and deploys resistance to scientific and technical expertise.

Majdik and Platt’s “Fight or Flight: X-Ray Backscatter Scanning and the Question of Expertise” explores public reactions to the deployment of a new airport scanning technology. Experts had repeatedly assured the public that x-ray backscatter machines were safe, yet their rollout provoked sudden and vehement public counter-reactions. Notably, these protests occurred even though highly invasive security scanning had become a staple of air travel since 9/11. The authors investigate what this controversy meant for our understanding, use, and practice of expertise (see also Majdik & Keith, 2011a, 2011b).

Whereas Majdik and Platt address a controversy over competing visions of expertise, Mehta’s “Precaution Ascendant: Implications of Precautionary Rhetoric on Public Understanding of Science at the Local and National Scales” focuses on rhetorical strategies that leave no role at all, even in theory, for expertise. His analysis of a siting controversy in Los Angeles—the placement of a new subway station beneath one of the city’s densest streets—reveals a novel set of arguments used by Beverly Hills residents to oppose a placement that would involve minor tunneling under a local high school.

Controversies over expertise develop in situations where a complex topos is deployed on both sides of an argument. In the backscatter machine controversy, the topos of ‘safety’ was openly polysemous, as both sides made use of the term’s multiple meanings. Publics used “safety” as a commonplace around which to construct arguments against the machines. They argued, for example, that the radiation emitted by these scanners might be harmful (or that the opposite—a lack of harm—had not been adequately proven); they also argued that the new scanning technology did little to ensure the safety of aircraft operations. The Department of Homeland Security and the Transportation Security Administration, meanwhile, justified the use of the machines through the same central topos of safety, asserting that the machines ensure the safety of aircraft operations and that x-ray scans are safe for travelers.

Safety was also a commonplace in the Beverly Hills subway controversy. Discussions became heated when, in hearings with transportation planners and professionals, residents argued that they needed absolute guarantees not only of physical safety but also of the preservation of silence, freedom from vibration, and other existing aspects of their environment. In other words, Beverly Hills residents equated “safety” with a lack of disturbance. Debate thus revolved around defining and then deploying the topos of ‘safety’ either narrowly, via scientific reasoning, engineering, and planning, or more broadly.

While clashes over expertise are fought over substantive issues, they are also waged—sometimes exclusively—over how and where claims to expertise ought to be legitimized and grounded. Conflicts over expertise tend to turn on points of stasis concerning two dimensions of risk: empirical danger and the acceptability of danger. Empirical danger refers to the calculable likelihood of harm; acceptability of danger refers to the willingness of those exposed to continue an activity despite such risks (across multiple types of cost: physical, financial, political, and cultural) (Douglas & Wildavsky, 1983). The conditions of risk around which questions of expertise are contested thus have both a technical and a moral dimension. The use of ‘safety’ in both cases speaks to this division, as ‘safety’ implies both a calculable dimension of danger and a normative dimension of how acceptable that danger is, in particular situations, to the constituencies it affects.

This difference leads to two distinct rhetorical attempts to assert where expertise lies. Arguments for locating expertise are asserted either through epistemic registers focused on the empirical aspects of risk (“we know best what’s best for you, generally”) or through appeals to phronetic practice (reasoning through practical wisdom) focused on the moral and social dimensions of risk (“we know best what’s best and most acceptable for us, in this particular situation”). We find both of these approaches in the controversies we discuss. Advocates of the current systems—x-ray backscatter machines, subway expansion tunnel plans—appropriated the scope and defined the meaning of the ‘safety’ topos along epistemic registers that exist relative to established standards and accreditations. But experts did not have a monopoly on expertise. Opponents defined the topos along phronetic registers that sought to limit the use of ‘safety’ to the particularities of the situation, audience, and practices.

Struggles over expertise manifest themselves rhetorically in several ways. One is an invocation of conceptual systems versus material practices. In the x-ray backscatter study, for example, we find multiple instances where an assertion of expertise relative to epistemic registers is made by reference to conceptual systems (e.g., emphases on the statistical likelihood of “glitches” relative to normal functioning, or comparative analogies to other kinds of generally held knowledge or generally practiced activities). At the same time, those who argue for a kind of expertise that resides in phronetic practices refer to material practices (e.g., emphases on the physically or otherwise tangibly experienceable implications of practices within existing systems and plans, often reinforced by metaphors like “cancer cluster” that point to the physical manifestations of invisible radiation energy). Relatedly, we find in this latter category a distinct concern with questions of visibility. Because TSA agents at Logan Airport, the site of the so-called ‘cancer cluster,’ were not issued dosimeters, they expressed concern over their inability to ‘see,’ and thus to prove or disprove through demonstration-in-practice, the material existence of empirical danger.

The phronetic approach to expertise therefore involves the idea that, while there are no universally ‘right’ or ‘wrong’ answers about how expertise should be used in a given situation or who can be assumed to possess it, there are better and worse uses of expertise. Appeals to this type of expertise are characterized by a grounding of claims in material uses and practices: a making-available of materially grounded arguments to deliberations about the locus of expertise. Conversely, the same argumentative mechanisms are denied to the opponents of the x-ray scanners. Epistemically oriented expert systems counter their claims with conceptual arguments, focused on an ideal norm (as it exists within epistemic considerations of the case), that invalidate or minimize the importance of the opponents’ concerns.

We find similar rhetorical characteristics in the Beverly Hills case, albeit with an opposite critical outcome. In this case, opponents couched their claims in highly precautionary terms that assessed the placement of the tunnel only in terms of worst cases (Ewald, 2001). This criterion allowed opponents to suggest that the tunnel might make the high school a prime target for terrorist attacks, despite the large amount of earth buffering the school from the proposed tunnel and the nearby presence of more populated targets such as universities, business districts, and residential towers (Wilen, 2010). This line of argument assigns infinite, exclusive, and equal value both to the lives of students and to their educational experience, implying that even minor noise disruptions to the school during a brief period of construction would irreparably damage students and that no conceivable set of benefits could justify such disruptions. By conflating such claims with claims about physical safety, opponents could close off any consideration of the benefits a subway expansion would bring to populations beyond the Beverly Hills schools. In doing so, residents of Beverly Hills eliminated any role for expertise, by suggesting that possibility, not probability, is the key issue in siting debates. Remarkably, this occurred even in a situation that revolved around scientific and technical experts—for instance, environmental scientists assessing soil conditions and engineers outlining the refinement and safety of tunnel boring technology.

Precautionary arguments, Mehta argues, allowed residents of Beverly Hills to voice strong and virtually unified opposition to the subway placement while deflecting some accusations of NIMBY (Not In My BackYard)-ism, because such arguments let them use the language of science without actually allowing a contributory role for scientists. Such reasoning is particularly problematic, however, when applied to what are fundamentally local and temporally limited problems, since precaution arises from, and is largely a reaction to, extended, global problems such as nuclear proliferation or climate change (Wingspread Conference, 1998). In addition to being self-contradictory, such logic would also be corrosive to any debate involving scientific expertise—essentially, all public debates—if allowed to spread, because it would create a perverse incentive to hide information and would strongly disfavor open hearings such as those conducted by transportation and city planners.

In sum, the two cases reveal something about the rhetorical functions of expertise in public space: a shift in the understanding of expertise from a concept that exists only in reference to specialized knowledge to one that exists, and is demonstrated, in actual, particular, and situation-specific use and practice. This shift appears most vividly in instances where danger is invisible, such as those involving energy (x-rays) or future plans (in political, deliberative settings). Managing the risk, and thus the acceptability, of such dangers requires either trust in expert systems, where expertise exists relative to epistemic concepts (similar to how Giddens (1991, pp. 10–34) discusses it), or a rhetorical making-visible of dangers by embedding them in, or making them available through, concrete uses and practices.

References

Douglas, M., & Wildavsky, A. (1983). Risk and culture: An essay on the selection of technical and environmental dangers. Berkeley and Los Angeles: University of California Press.

Ewald, F. (2001). The return of Descartes’ malicious demon: An outline of a philosophy of precaution. In T. Baker & J. Simon (Eds.), Embracing risk (pp. 273–301). Chicago, IL: University of Chicago Press.

Giddens, A. (1991). Modernity and self-identity: Self and society in the late modern age. Stanford, CA: Stanford University Press.

Majdik, Z. P., & Keith, W. M. (2011a). The problem of pluralistic expertise: A Wittgensteinian approach to the rhetorical basis of expertise. Social Epistemology 25(3), 275–290. http://dx.doi.org/10.1080/02691728.2011.578307

Majdik, Z. P., & Keith, W. M. (2011b). Expertise as argument: Authority, democracy, and problem-solving. Argumentation 25(3), 371–384. http://dx.doi.org/10.1007/s10503-011-9221-z

Wilen, D. (2010, October 27). School board gears up for MTA battle. Beverly Hills Patch. Retrieved from http://beverlyhills.patch.com/articles/school-board-gears-up-for-mta-battle

Wingspread Conference on the Precautionary Principle. (1998, January 26). Precautionary principle. Retrieved from http://www.sehn.org/wing.html