Part 1: Does Europe-based Precautionary Principle protect consumers from reckless corporations or retard innovation and undermine public trust in science? – Genetic Literacy Project


Trustbusters is a three-part series on how experts and authorities are failing to manage public trust in research and technology. The demise of trust (in authorities, science, institutions, expertise…) is leading to a breakdown in societal and institutional structures, political deadlock and a decline in scientific literacy.

Part 1 looks at how a policy tool (the EU’s use of the precautionary principle) has destroyed trust in science.

The second part looks at how our misunderstanding of the difference between misinformation and disinformation mirrors the mistrust/distrust distinction and how our reaction to suspected disinformation is further affecting public trust.

The last part simply asks if the arrogance portrayed by certain scientists and science communicators is to blame for a public distrust of technological innovations.


Guilty until proven innocent

While there are many versions of the precautionary principle, the one used most widely (and by the European Commission in legislation like REACH and the Sustainable Use of Pesticides Directive) is the European Environment Agency (EEA) interpretation, commonly known as the reversal of the burden of proof. Rather than testing whether a substance or process may cause harm or be of concern, this version demands that everything be proven safe before being allowed onto the market. Of course, “safe” and “certain” are emotional concepts open to wide personal interpretation (and scientists would never use this language, opting instead for a continuous process of attaining “safer” and “more reliable”).


The EEA definition leaves researchers in a “guilty until proven innocent” situation where the demand for innocence is impossible to meet. This has spawned a cottage industry of “forever doubt” activist fear campaigns (endocrine disruption, GMOs, the insect apocalypse) in which campaigners use old (poor) data to disregard better research and persist with their “just not good enough” strategy. So long as the benefits of such technologies can be explained away, precaution remains the (lucrative) activist weapon of choice. Other longstanding campaigns (like acrylamide, dioxins or EMFs) did not meet the “forever doubt” criteria because the benefits of the technologies or processes are too high or the exposures too banal, highlighting the foolishness of the reversal-of-the-burden-of-proof approach.

The precaution game is rigged as there is no such thing as 100% safe. Scientists have been trained to develop risk management methods to ensure that products and processes are safer (a continuous process of developing ever better results while lowering exposure – risk – to as low as reasonably achievable – ALARA). Safe is an ideal (an emotional state) that we should work towards but will never reach. There are always eventualities or unforeseen possibilities, however remote. When the precautionary principle demands that we prove a technology or product is safe before it is allowed onto the market, the answer to any additional research data will always be “Sorry, but this is still not enough”.

Any application of law is based on certain values. Precaution as “guilty until proven innocent” could easily be applied to ban coffee, automobiles, solar panels and organically-grown produce. But the judge and jury are very selective (ie, political) in their use of this principle, basing their application on a rather arbitrary distinction of natural versus synthetic. Natural foods that are known endocrine disruptors are tolerated and even fêted for their health properties while a plastic or synthetic pesticide that cannot prove (with certainty) that it does not have trace endocrine disrupting properties must face an immediate removal from markets.

Precaution is a political construct, activist-led, at ease with its internal hypocrisies and not at all scientific. So why is the European Environment Agency claiming that its interpretation of the precautionary principle is the only way to ensure trust in emerging technologies?

The denial trap

One of the clever tricks of these forever doubt activists is the denial trap. So long as researchers are being led on the “prove to me it’s safe” goose chase, they are not framing the narrative, not demonstrating the benefits and not promoting the values of their innovation. They are caught up in a denial trap, continuously trying to prove that their technology is safe rather than demonstrating what it can do or developing even safer iterations. And if you are spending all of your time trying to prove that something is not harmful (ie, safe), and you cannot, what are you actually saying that is positive about your innovation?

GM technology is a prime example of the denial trap – a hostile NGO hoodwink. When these innovative seed technologies emerged in the public eye in the 1990s, the promise and the benefits were enormous. But the innovators got caught up defending their innovations in a forever doubt vortex.

And after five years of the denial trap, the skittish European Commission decided that we did not need this technology and declared an effective moratorium, and European research and innovation were set back a generation. With new plant breeding technologies, the anti-tech, anti-industry activists are pulling out the same playbook, and guess what? The scientific community has fallen right back into their trap. Trust was destroyed 25 years ago and it is not coming back any time soon.

Proving that a technology or substance is safe is a game scientists will never win. Safe enough is just never enough. The only way out of the denial trap is to focus on the benefits. At the beginning of the millennium, there was concern about the health risks of mobile phones (radiation exposures could cause brain tumours, leukaemia, other cancers … especially among young people). There were viral videos purporting to show three phones, ringing at the same time, popping a popcorn kernel. We did not really need this new technology (then) and there were strong voices of concern about other EMF risks (EMF frequencies, microwaves, phone masts, 3G…). Rather than falling into the denial trap, the mobile phone industry continued to improve its technologies (make them safer) and continued to stress the benefits. The EMF activists lost the narrative and ran out of cash.

The taint of precaution

Precaution acts like a plague on any innovative technology. If activists paint any product or substance as uncertain or potentially unsafe to human health or the environment, the stench emanates via the media and the innovation is forever tainted. Once a substance is blacklisted, once the public has doubt forever etched on its mind, once scientists fall into the denial trap, an innovation or technology is as good as gone.

Perception of a risk is nine tenths of the law in the court of precaution, and facts matter little in such situations. If people think chemicals leaching out of plastics in food packaging make them sterile, they will spend the extra money for glass, demand plastic-free alternatives and then, in predictable zealot fashion, demand a total ban so that others have to pay more to justify their irrational fear. No studies will ever replicate the original purpose-built studies, but that was never necessary in a precaution-based docilian world where the public has come to expect 100% safe (without actually understanding what this implies).

Tainting a substance or technology as potentially unsafe has created opportunities not only for the legions of activist zealots and NGO fundraisers but also for the competition. Building a market and surviving in a competitive environment is difficult work for many companies. It could be made easier if your competition is facing the forces of forever doubt and caught in the precautionary denial trap. If a substance is under suspicion, the supply chain will quickly retool with alternatives so that they will not get left behind once (when) the precautionary principle is applied. And once these downstream users have an alternative to the substance stuck with the stench of precaution, they may just play their card and promote the forces of forever doubt. Fait accompli.

Br-Br-Br-Bromine! Brominated flame retardants are a good case study in how the stench of precaution, when allowed to waft through the supply chain, can lead actors in competitive industries to promote this forever doubt for their own advantage. Bromine-based flame retardants worked well to ensure that plastics (petroleum-based substances) would not burn. This was a valuable safety feature added to plastics across a wide range of electronics products, including computers (which would heat up and, if not protected, burn spectacularly … as countless YouTube videos attest). An alternative to brominated flame retardants was aluminium, which is not flammable but is very expensive and energy intensive.

In the 1990s and early 2000s, a series of studies started to raise doubts about the safety of brominated flame retardants, and the industry fell into the denial trap. More studies found the substance in the environment (it was persistent, which was its value as a flame retardant), and NGOs like Greenpeace started to run (very expensive) biomonitoring tests showing bromine accumulating in blood and tissue samples (at very small, non-hazardous levels). Fearing being blacklisted with a tainted substance, the supply chain started to look for alternative substances that would not burn. Greenpeace launched the famous Green my Apple campaign along with its annual green electronics guides condemning any OEMs that dared to use PVC or brominated flame retardants. Sure enough, in the mid-2000s, computer companies, led by Apple, started coming out with (much more expensive) aluminium computers. Bromine (and many plastics) are now becoming legacy substances, condemned not by science and facts but by the taint of precaution brought about by some ruthless, stealth lobbyists from the aluminium industry who bought off some gullible activist scientists and NGOs.
We should not overlook how the original campaigns, regulatory pressure and doubt-driven research against bromine came from Norway and Washington State. What do these two areas have in common? They are home to the largest aluminium companies and smelters. The aluminium lobbyists also tried that same trick in trying to legislate out polyethylene-based automotive fuel tanks (another deception I observed first hand) but the car industry was too strong and not as stupid.

Uncertainty – The handmaiden of distrust

When there is doubt, distrust is not far behind. Precaution does not build trust but rather preys on a lack of trust populations may have with their governments, scientists and industries. For activists working for a cause (or an alternative technology) raising alarm bells and causing distrust is an easy process – find a fear, raise doubt and communicate the hell out of it. Then inform a frightened public that the technology is not necessary or that alternatives exist (in the case of nuclear, Greenpeace’s alternative was two more decades of coal … sweet!). Our trust perceptions are emotional and, like any emotion, they do not pay much heed to facts or reason.

Precaution cannot succeed in situations where the public trusts the substance or process or appreciates the benefits. There are certain elements of trust (covered before) that can precaution-proof any substances or processes:

Familiarity. One of the key elements of trust is familiarity. If we have been using something for generations (like toasting bread), then any precautionary fear campaign (like acrylamide) will be met with derision and laughter. If I am not familiar with a scary-sounding chemical name, and I am told it is not necessary, then any whiff of doubt will have policy-makers reaching for their precautionary pillbox.

Agency. Another element of trust is agency. If I have the ability to take control (if I am empowered with a decision), I trust the process more. This is why I feel safer driving a car than sitting on an airplane, even though the statistics prove that belief irrational. With many technologies today, the public feels helpless – that they are being exposed to substances in their environment without their consent. Even if the exposure is harmless, taking precaution is equivalent to taking control (and is an easy political win for skittish government officials).

Kinship. Trust is often based on a kinship – an identification with some population, region or cultural practice. Trust is relational. We trust those like us and the rise of social media communities (tribes, echo-chambers) has been like catnip to precaution-mongers. When enough people in my community scream “We don’t want this!” then a precautionary voice is born. There are far fewer outspoken scientists, farmers or energy experts so the voice of the mob, with their precautionary torches, will easily win out over the voice of facts and reason.

Precaution does not build trust. So when David Gee, the activist architect of the European Environment Agency’s formulation of the precautionary principle (and previously a director at Friends of the Earth), claimed that precaution is necessary to rebuild trust in science, we can easily see how he was, well, full of shit. Gee’s objective was just the opposite – to put scientific innovations in impossible situations, pitted against populations demanding safety and certainty while rejecting anything that was not familiar, within their control or like them. After the risk crises of the 1990s (GMOs, MMR, EDCs, acrylamide, dioxins…), the precautionary principle was the coffin that sealed away any hope of improved trust in research and technology.

And after two decades of this miserable policy tool, we are reaping the rewards of precaution – the public rejection of beneficial technologies like vaccines, biotech and medicines.

Precaution’s monster – The anti-vaxxer

Those in Brussels who support the precautionary principle are the proud parents of the anti-vaxxer. They created the conditions for this monster to flourish and are now reaping their rewards.

Vaccine hesitancy has thrived on distrust of science, governments and industry. The new mRNA vaccines, developed to help battle the spread and reduce the impact of COVID-19, arrived as an unfamiliar technology carrying fears of uncertain consequences. Anti-vaxxers would rather take control (agency) of preventing potential harm from the virus (by whatever often useless means those in their communities propose) than trust a new technology.

And as more people question the safety of vaccines, officials and science communicators find themselves, that’s right, falling into the denial trap. Vaccines don’t cause miscarriages; they don’t cause heart attacks; they don’t cause blood clots … and so the game goes on. Those who don’t engage with the anti-vax forever doubt formulations then commit the sin of arrogance: calling out those who are vaccine hesitant as lunatics and a threat to society. Hardly a means to build trust in research and technology. The science communications solution – create a vaccine mandate to ostracise those who are afraid and uncertain. How do you spell “stupid”?

The unprecautionary EU precautionary principle formulation

A vaccine is precautionary by its very nature – under the triple-negative interpretation, vaccines are safer than the alternative (a virus or a disease), which has a higher likelihood of causing serious harm. How did this then get turned on its head? The “safer with technology” approach is not the precautionary principle that the European Environment Agency has successfully campaigned for (which instead demands certainty that a substance like a COVID-19 vaccine is 100% safe). Vaccines can never be 100% safe, so we are using the worst interpretation of precaution at the worst possible time. When there was a question of a few potential cases of blood clotting from the AstraZeneca COVID-19 vaccine, European Commissioner for the Economy Paolo Gentiloni stressed the need for certainty and praised the EU’s application of the precautionary principle to suspend its use (until the European Medicines Agency came in and restored scientific logic). It was too late though and, predictably, public trust in the AstraZeneca jab cratered. The impossible conditions created by precaution as the reversal of the burden of proof have led to a public health crisis far greater than any potential unknown hazards. This has once again become a fruitful activist campaign tool to sow distrust in science and technology. Without the activist, politicised version of precaution, the voice of the anti-vaxxer would fall silent and the benefits of the vaccine would promise a safer outcome.

David Gee, who spent his entire life campaigning against chemicals and mobile phones, created a policy-tool monster to fix the game in Brussels, and it is now causing severe harm to a population that bought into his fear-driven nonsense. So David, 20 years on, amid this carnage of distrust, fear and needless death, where is the trust you promised your precious little principle would deliver?

David Zaruk is a professor at Odisee University College where he lectures on Communications, Marketing, EU Lobbying and PR. He also provides training courses and is a regular keynote speaker. In the past, he has been employed by Solvay, Cefic and Burson-Marsteller, retiring from “active work” in 2006.

A version of this article was originally posted at The Risk-Monger. You can follow David, The Risk-Monger, on Twitter @Zaruk