Today I want to write about another topic both industry and NGOs frequently get excited about: the precautionary principle. The precautionary principle matters because it prevents us from making foolish mistakes that we may later regret.
The European Environment Agency has published two reports called “Late lessons from early warnings”, describing instances where humankind got it wrong and could have reacted earlier by listening to warnings issued by scientists (see here and here). Asbestos is a frequently quoted case. The two volumes give plenty of examples, but all of them rest on the fact that we are always wiser in hindsight.
One could equally produce a book called “Late lessons from early opportunities” and find an equal number of examples where we got it wrong, either because we didn’t want to take the risk or because politicians and business leaders followed other imperatives. The history of standards provides many such examples, and I highly recommend reading this article on why certain countries drive on the left-hand side and others on the right, and what Napoleon has to do with all of this.
However, these hindsight stories, instructive as they may be, do not help us when we need to take a decision now. With the almost exponential expansion of our knowledge and the accelerating pace of technological development in an ever more complex and interconnected world, politicians face such problems more and more often. Take genome editing, for instance. It’s a technology with huge potential, e.g. to create drought-resistant crops or prevent genetic diseases. However, as with all technologies, it has its pitfalls, touches on ethical issues and could be used for doing bad things. So what should we do? Ban it? Regulate it to death, as we have done with GMOs? Let the scientists do what they want? I wouldn’t want to be a politician in such cases. Even scientists would find it hard to predict all the consequences of deciding either way.
It’s a typical example where the precautionary principle, as it was originally intended, can help us. The purpose of the precautionary principle is not to block progress. Its intention is to allow us to go ahead, but with precaution: take the time to reflect, discuss ethical implications, gather additional evidence and, if required, regulate. Some researchers may see this as a bureaucratic burden, but it is a societal discussion we need to have (which does not mean discussing it forever, though).
It is naïve to believe that inventions can somehow be de-invented. Once the knowledge is there, it’s there. It’s like toothpaste: once it has left the tube, we won’t get it back in, no matter how hard we try. Of course, everybody will agree that developing the atomic bomb was a bad idea. But it’s there. So what humankind has done is regulate it, e.g. by preventing its proliferation. And we have been fairly successful: in the 70 years since the bomb was dropped on Nagasaki, we have apparently managed to prevent it from being used again in a war. This does not mean the risk is gone, but we have it under control. We have taken precautionary measures.
We need to be clear that any discovery, invention or technological progress entails risks. No risk, no innovation. So we should apply the precautionary principle only when there is serious concern in the scientific community that we might get it wrong, with irreversible consequences. Global warming is one of these cases. If we apply the precautionary principle too often, or misuse it for political purposes, it loses its teeth and we risk missing opportunities that others may seize.
However, whether to take precautionary measures is a political decision, not a scientific one. It’s a decision to which scientific evidence contributes significantly, but which is also based on values and belief systems. The question we always need to ask is: Is the risk of getting it wrong so high that it is worth disregarding all the opportunities? Finding this fine line is not easy, and the answer will differ from technology to technology, between cultural contexts, and over time as our body of knowledge develops. We need to ask the question again and again as we move forward.
But politicians should never say: the scientists told us to do so. Neither should scientists try to impose their views on politicians. We should give the best possible advice and voice concerns where needed, but we have to accept that it is up to politicians whether to follow the advice or not. This also means, of course, that if we screw it up, the responsibility lies with the politician, not with the scientist. There will be more late lessons for sure. But hopefully also an increasing number of instances where we got it right.