An interview with Professor Sheila Jasanoff: On lessons from science, technology, and society

Joan Chang, Maanas Sharma, and Kushal Seetharam 

 

Edited by Manraj S. Gill*

Interview | Aug. 30, 2021

*Email: manraj@mit.edu

DOI: 10.38105/spr.n9a0lhvw2b

Figure 1: Professor Sheila Jasanoff.

Introduction

MIT Science Policy Review spoke with Professor Sheila Jasanoff about her pioneering work in Science and Technology Studies (STS), the role of the public in policymaking, and some of the important lessons and recommendations drawn from her work in STS. She is the Pforzheimer Professor of Science and Technology Studies at the Harvard Kennedy School, where she founded and directs the STS Program. Her work exploring the role of science and technology in law and public policy has been internationally recognized and the insights she shared are sure to benefit scientists interested in entering the policy field. This interview was edited for clarity.

Interview

Science Policy Review: You are one of the pioneers of the field of STS. What led you to create this field of study?

Sheila Jasanoff: When I was a professor at Cornell, I began looking at the regulation of chemicals in four countries. The surprise finding of that research was that it didn’t matter for policy that all these people were looking at the same science: They were making very different decisions based on the same published literature. One striking example was asbestos, one of the best known and most studied toxic substances. If you looked at regulations just in the US and the UK — sharing the same aims to protect public health and the same body of epidemiological research results — asbestos regulations were much more stringent in America than they were in Britain. It turned out that the interface between science and public policy was not the same in these countries. Then I started wondering: Well, how is it different, why is it different? And that’s what really led me into my research field of asking those broader questions: How does scientific knowledge link to decisions about public policy?

SPR: How has science policy improved since you started studying it?

SJ: The idea of what the ideal role of science in policy ought to be is one of those things that has been maturing. It used to be that people thought the ideal role was for the scientists to sit apart, create the best knowledge, and then feed it to the policymakers — to “Speak Truth to Power.” Behind that slogan was the belief that scientists alone are in the business of making the truth. Now, I think the scientific community has arrived at the position that they should be presenting options to policymakers, and the policymakers should choose among these options. That gets part of the way to a better relationship, but not all the way there.

SPR: Why is that? What is this approach lacking?

SJ: Part of the problem is whether the right questions are being asked in the first place. If you know you are asking the right question, you can create informed policy options. But if you are not asking the right questions, the ones people care about, then no matter what the science says, you won’t be producing the right answers or options for society.

So, let me take one example that is something I’m deeply involved in but that is also at the forefront of science and technology policy these days: Human genome editing. If you’ve decided, for instance, that we should be editing defects in the human genome, then you can start asking technical policy questions about how to do it safely and under what conditions you should carry out this activity. But if there is a widely accepted moral consensus that you should not be doing this in the first place, that this is a space we don’t understand well enough to intervene in — because who understands the reasons for the total complexity of genomic diversity across a human population, or because different religious traditions value humanness differently — then no amount of engineering to make sure that an intervention is safe will address the concerns of people who think the intervention is morally wrong to start with.

SPR: What is the greatest misconception you see in the field of science policy?

SJ: I think that there are certain things that one can learn. One of them is to not adopt this myth that science stands completely apart from society — that science is only about knowledge making and fact finding, that there is no politics to it. The slogan “Science is Real” can be very misleading. You are better off admitting that science happens within society, and proceeding from there. You must acknowledge that, sure, science is a very complex institutional space, but so is everything else — society is a very complex institutional space, and science is contained within it — and that therefore much more mutual exchange is needed for better understanding. I think that’s the sort of big and very basic lesson that American science policy could take away.

Also, I believe that at the moment, the pressure has to come from the younger generation to the elder, in a sense, because more senior people can get very comfortable doing what they’re doing, so the push to think in new ways has to come from somewhere else.

SPR: Some scientists and technologists argue that they already consult the community in their actions. What do you say to these people?

SJ: Sure, most people in the scientific advisory world today are aware that, for various reasons, they have to involve members of the public. That is a big shift from back when I got into the business and the main way that the public could get involved was through demonstrations — shut down this nuclear power plant, don’t let it be built, and so on and so forth. But if you look at the practices of how science tries to involve people, it’s rarely the kind of genuine two-way communication that I would like to see effectuated.

There still is an enormous reservoir of arrogance among technical people, the conviction that when push comes to shove, solutions lie in the technical domain, and the public needs to be better educated. But you hardly ever hear that education might need to flow in the other direction as well, that maybe people in science and technology (STEM) need to be better educated about democracy, law, and justice.

SPR: Could you expand on this idea?

SJ: We need a thicker idea of translation, not just between scientists in general and a monolithic public, but a deeper recognition that science and scientists are not unitary — for example, a computer scientist, a wet lab geneticist, and a nuclear engineer do very different things — and, similarly, that there is no single general public.

I like to call this the Second Enlightenment. If the first enlightenment was clearing out superstitions about nature, the second enlightenment is about clearing out superstitions concerning science. We’re certainly not talking about people being anti-technology or anti-science. Rather, one has to see that skepticism toward science is often a resistance against a form of elite control. Once you’ve started understanding that, then phenomena such as climate skepticism or vaccine hesitancy become a democracy problem and not a science problem. And so we must train people to express what is really bothering them about science or technology, and what kinds of lives they want to lead — and that is a discussion for all of us to have together, not just people who command particular kinds of technological skills and capabilities.

So, you know — what kinds of societies should we have — it’s a really complicated question. This is the kind of problem we are trying to address in STS. It will never have simple, correct, mathematically clean answers. But how do you engage in those conversations so that the fixation on getting the right answers gives way to humility about asking the wrong questions?

I would begin one step back by challenging the question itself — whether we’re talking about training for scientists or training for citizens. And maybe not even having a separate educational category called STEM to start with. We now have decades of studies of science and technology as social institutions — how knowledge is made, how knowledge is used and validated, when it’s worth listening to an expert consensus and when it’s not. Let’s teach that to people on both sides of the aisle, scientists and non-scientists, humanists and non-humanists. It’s not just about training scientists to learn about society; it’s about training everybody to think of a good society in which science and technology are and will remain fundamentally important factors.

But I also don’t want to leave this conversation by suggesting that I think there’s a magic bullet solution. I think that this is a matter of ongoing conversation and self-reflection.

SPR: What do you recommend that people, particularly young scientists interested in policy, do to work towards this type of society?

SJ: I’ve been teaching STS since 1978, but I still haven’t resolved this question of what needs to be taught and what doesn’t. Nonetheless, in an undergraduate class I teach at Harvard — Numbers in Policy and Society — we’ve developed a crib sheet of the basic concepts you need to know about the connections between science and society. My hope is that my students will be creative enough to figure out how to work with those tools and go somewhere with them, and it will not be this simplistic idea that “I am the technical person and I’ll educate the public”; it will be more sophisticated than that. Having a cadre of young scientists who are in the business of monitoring the conditions of a society in which science and technology play an important part — that is the sort of capacity that I’m trying to build.

Open Access


This MIT Science Policy Review article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

Joan Chang

Pardee RAND Graduate School, Santa Monica, CA

Maanas Sharma

School of Science and Engineering, Dallas, TX

Kushal Seetharam

Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA