The issues transcend scientific expertise, says Matthias Kaiser.
The issue of good governance in pathogen research was highlighted by the recent controversy about whether to publish research on the H5N1 bird flu virus engineered to be transmissible between mammals. Our EU-funded project Value Isobars examined the principled issues behind this sort of debate.
Unpacking the issue
At the outset, the problem seems simple enough. Virtually all parties seem to agree on the importance of protecting two central values: the freedom of academic research, including the importance of public knowledge; and the importance of preventing bio-terrorism. The problem is often presented as a trade-off between these two considerations: how much scientific freedom do we have to sacrifice in order to prevent bio-terrorism?
However, on closer inspection the problem dissolves into a number of more complex, related problems. Firstly, the public may question the framing of the issue: who says that this is the only option, and do we have reason to trust them? Secondly, scientific freedom as a central value for governance is usually justified as a means to an end. Thus the trade-off mentioned above is actually the weighing of intended benefits against the unintended risks of the research. Thirdly, even if we agree on the need for some action to intensify bio-security, it is by no means certain what the most appropriate governance tools might be.
Very few doubt the good intentions of the researchers involved here. Experts agree that the question is not if we are to experience a flu pandemic, but when. It is this certainty that justifies the pathogen research. Some of the main uncertainties relate to the misuse of the research results: who are the malevolent parties, and do they actually have the capacity to utilize the scientific knowledge for their purposes? Scenarios are set up, ranging from rogue states to skilled terrorists to the amateur garage-scientist. The question then boils down to good governance of science in the light of very large system uncertainties and high value stakes. Arguably this transcends the role of scientific expertise and opens the discussion for a democratic debate involving the wider public.
But what governance tools should one employ, given that one sees the need to counteract possible bio-terrorism? Oversight by national or international bodies is one such tool, as in the current case. This typically raises the question of professional, institutional or national bias in the proposed measures. In Value Isobars, we stressed the potential role of a voluntary code of some kind: the question is whether, once adopted by the scientists, it would have enough bite to enforce a security culture that provides some assurance against misuse.
Trust and the public
There is little reason to believe that knowledge of any kind, once it is in the possession of a few, will not eventually seep out to others. Thus, censoring pathogen research or restricting its access to publication channels is not a good idea. Furthermore, we would be discouraging the kind of research that we may soon need in the light of expected pandemics. Yet an attitude of laissez-faire might not be appropriate either, since good science acknowledges uncertainty and prepares for both intended and unintended outcomes.
We should instead build up trust that those who actively engage in this research do so fully aware of their social responsibility, and that they enforce an internal security culture.
For this we need to engage larger segments of the public. We need our science to prepare for the future, but we cannot expect zero-risks or elimination of uncertainty. A voluntary code broadly discussed and then adopted by the scientists might provide assurance to the public that all that can be done to avoid harm is done.