On the 15th of September, the European Commissioner for Research, Science and Innovation, Carlos Moedas, gave a speech on independent science advice to policy. In this speech he made a very interesting statement: “Policy decisions taken with the support of independent scientific advice will be decisions our citizens can trust”.
I fully agree with him that evidence-informed decisions can be trusted. That does not mean, however, that they are trusted. Eurobarometer surveys carried out by the European Commission in 2010 and 2013 (Eurobarometers 340 and 401) consistently showed that almost two thirds of the EU population think that science “makes our ways of life change too fast”, while three quarters of EU citizens feel that science and technology “could be used by terrorists” or have “unforeseen side effects that are harmful to human health and the environment”. 52% of the EU population even think that, because of their knowledge, scientists have a power “that makes them dangerous”. These statements do not necessarily mean that people are turning anti-science – overall, citizens overwhelmingly support science and technology. However, they do indicate a growing uneasiness among citizens about science and technology dictating political choices. Consequently, public support for evidence-informed policies cannot be taken as a given.
This issue is closely related to the acceptance of new technologies. Nanotechnology is a perfect example. Less than five years ago, cosmetics were still advertised with claims like “containing innovative nanosilver”. I don’t know whether anybody noticed, but all these claims have disappeared from the adverts. On the contrary, you can now find advertisements claiming that products are “non-nano” or “nano-free”. This suggests that public opinion on nanotechnology has reached a tipping point, where putting pro-nano claims on a product is seen as counter-productive to its success on the market. Whenever claims like “free of…” appear in the media, alarm bells should ring in scientific, business and policy circles.
It is true that citizens’ concerns about the safety of technologies are often not rational from a scientific point of view and are based more on gut feelings. This is particularly true for everything we cannot capture with our senses, i.e. anything we cannot see, hear, feel, taste or smell. Examples include radioactivity, electromagnetic radiation (such as WiFi), viruses, genetic modification and nanoparticles. I do not want to discredit these feelings: they are quite natural. It’s an instinct that helped us survive when we were still living in caves. Just go into a completely dark room, close the door, and you will know what I mean.
At the end of the day, everything comes down to trust. Do we trust scientists? Do we trust industry? Do we trust politicians? Do we trust the media? Luckily, among these groups scientists still rank highest on the trust scale, which is good. But that alone is not sufficient if we want public buy-in to a new technology or to a policy decision based on evidence. If we want progress – and this inevitably means taking risks – we need to build trust in evidence-informed policy-making and, hence, in science and technology. Let’s not forget that we have a huge challenge ahead: just look at the recently adopted Sustainable Development Goals and their ambitious targets and you will understand that humanity simply cannot afford the luxury of debating a technology for 20 years, as we did with GMOs.
So what needs to be done to build this trust?
First and foremost, we need to be open and transparent. Wikipedia is a perfect example. Everybody trusts Wikipedia and uses it on a daily basis. That trust runs so deep that it has even undermined Encyclopaedia Britannica and similar encyclopaedias, with all their thousands of reviewing university professors. Why do we trust Wikipedia? Because it is open and has transparent rules. This gives us confidence that the collective brain of the millions out there will sort out the wrongdoers (who of course also exist on Wikipedia). We need this transparency in science, but even more so in industry and politics.
Second, we need to communicate. Unfortunately, we do not educate our students in science communication. As a result, science communication largely depends on the few scientists who have a talent and passion for communicating, on press offices that sometimes tend to overstate scientific achievements, and on science journalists who have a tough time holding on to their jobs. It must become a community-wide effort. Every single scientist is called upon to communicate what he or she does – and to do so in a language lay people understand. So don’t be afraid of calling microbes “bugs”, even if your scientific peers may raise their eyebrows.
Third, we need to engage in societal dialogue. It’s not enough to send out nice press releases: scientists need to go into town-hall meetings, be they physical or virtual, and face the citizens. In doing so, it is of utmost importance that we show empathy for citizens’ concerns. It is not helpful to bang the evidence on the table and declare that it is all there. In fact, there is nothing wrong with showing that we, as scientists, are also citizens with hopes and fears; we are excited or worried about things like everybody else. Societal dialogue can achieve true miracles. For instance, it is the reason why Sweden is the only country in the world that has really solved the problem of siting a nuclear waste repository. The nuclear industry, scientists and politicians went out and talked with citizens, and in the end – combined with the political promise of hundreds of jobs to be created – there was even a competition between municipalities eager to host the repository.
We need to think about how to organize public dialogue better and learn from best practice (another example is Sciencewise in the UK). The most crucial point of this dialogue is timing. It must happen when a technology is sufficiently developed to give an idea of its benefits and risks, but still early enough that people do not feel faced with a fait accompli, where it is just a matter of convincing them. People must feel that their opinion really matters and that they can influence developments (which also means: before industry invests at scale in deploying a technology). Another key issue is who organizes and moderates the debate. In my view, it should be neither science, nor industry, nor NGOs, nor politics, nor churches, as they will all be seen as having a stake. All of them must certainly participate, but they will not be seen as neutral. Science museums could play such a role. As cultural organizations they are highly trusted – after all, parents and grandparents love to go to science museums with their kids. They are perceived as competent on the matter, yet as being at arm’s length from the researchers. It also fits their agendas, as museums increasingly move from exhibiting artefacts and experiments to organizing dialogue. Last but not least, they have the organizational capacity and the facilities to run such a dialogue.
And fourth, we need to talk about ethics. Many of citizens’ concerns are not about technical issues: they are about ethics. About scientists who change the genomes of viruses in ways that could make them more deadly, about industry polluting the environment to maximize profit, about media providing biased reports, about politicians not being honest about their motivations. People are sick of bankers collecting bonuses while their banks are saved with taxpayers’ money, of multinational companies evading taxes while making billions in profit, of car manufacturers cheating regulatory authorities. We need to have a serious debate about this.