Posted: Jul 22, 2008
Late lessons from early warnings for nanotechnology
(Nanowerk Spotlight) One term you hear quite often in discussions about the potential risks of nanotechnology is the 'precautionary principle'. This moral and political principle, as commonly defined, states that if an action or policy might cause severe or irreversible harm to the public or the environment, then in the absence of a scientific consensus that harm would not ensue, the burden of proof falls on those who advocate taking the action. The principle aims to guide the protection of public health and the environment in the face of uncertain risks: the absence of full scientific certainty shall not be used as a reason to postpone measures where there is a risk of serious or irreversible harm.
In 2001, an expert panel commissioned by the European Environment Agency (EEA) published a report, Late Lessons from Early Warnings: The Precautionary Principle 1896–2000, which explored 14 case studies on controversial topics such as asbestos, Mad Cow Disease, growth hormones, PCBs and radiation – all of which demonstrated how not heeding early warnings had led to a failure to protect human health and the environment. The report's stated goal was to gather "information on the hazards raised by human economic activities and its use in taking action to protect better the environment and the health of the species and ecosystems that are dependent on it".
The expert group that compiled the EEA report identified 12 'late lessons' on how to avoid past mistakes as new technologies are developed.
"These lessons bear an uncanny resemblance to many of the concerns now being raised about various forms of nanotechnology" Steffen Foss Hansen tells Nanowerk. "A comparison between the EEA recommendations and where we are with nanotechnology shows we are doing some things right, but we are still in danger of repeating old, and potentially costly, mistakes."
Hansen, a researcher in the Department of Environmental Engineering and NanoDTU Environment at the Technical University of Denmark (DTU), together with Anders Baun from DTU, Andrew Maynard from the Project on Emerging Nanotechnologies and Joel A. Tickner from the Department of Community Health and Sustainability at the University of Massachusetts, just published a commentary in Nature Nanotechnology in which they explore these 12 lessons in the context of nanotechnology ("Late lessons from early warnings for nanotechnology").
These are the 12 lessons outlined by the EEA:
1) Acknowledge and respond to ignorance, uncertainty and risk in technology appraisal.
2) Evaluate alternative options for meeting needs, and promote robust, diverse and adaptable technologies.
3) Provide long-term environmental and health monitoring and research into early warnings.
4) Ensure use of 'lay' knowledge, as well as specialist expertise.
5) Identify and work to reduce scientific 'blind spots' and knowledge gaps.
6) Account fully for the assumptions and values of different social groups.
7) Identify and reduce interdisciplinary obstacles to learning.
8) Maintain regulatory independence of interested parties while retaining an inclusive approach to information and opinion gathering.
9) Account for real-world conditions in regulatory appraisal.
10) Identify and reduce institutional obstacles to learning and action.
11) Systematically scrutinize claimed benefits and risks.
12) Avoid 'paralysis by analysis' by acting to reduce potential harm when there are reasonable grounds for concern.
Hansen, Baun, Maynard and Tickner argue that the question seems not to be whether we have learnt the lessons – as outlined by the EEA report – but whether we are applying them effectively enough to prevent nanotechnology being one more future case study on how not to introduce a new technology.
In their commentary, the scientists go through the 12 EEA lessons and, where appropriate, apply them to the current issues, regulatory and commercial activities, and scientific knowledge surrounding current nanotechnology developments. Hansen summarizes his and his colleagues' concerns in three key points:
1) We seem to ignore some valuable lessons from the past and thus are in danger of repeating old, and potentially costly, mistakes.
2) The global response to these warning signs has been patchy. Although we have received the early warning signs, we risk delaying future commercial and social benefits of nanotechnology by not addressing the EHS and regulatory issues wholeheartedly and efficiently.
3) Many governments still call for more information as a substitute for action.
In an ideal world, politicians and regulators would look to scientists as the ultimate authorities when making regulatory decisions about technology risks, and science would be able to provide useful and unambiguous answers (for the sake of this argument, let's ignore for a moment the tremendous influence of industry lobbies and the distorting effects of political ideology). Here, the article provides a striking example of what it calls 'institutional ignorance' – instances where research throws up useful information that is then ignored or overlooked by regulators:
"They [the authors of the EEA report] cite cases where regulators made inappropriate appraisals because of the blinkers imposed by their specific disciplines – such as the preoccupation of medical clinicians with acute effects when dealing with radiation and asbestos. There is a real danger of similar errors being made with nanotechnology, which crosses many fields of expertise. One needs to draw on physics, chemistry, computer sciences, health and environmental sciences to understand nanomaterial properties and risks. Consequently, a number of multidisciplinary centers for nanoscience and nanomanufacturing have been established around the world, but only a few of these address health, environmental, and social aspects. Setting aside resources to create an infrastructure that gets people working together across disciplines is critical."
Hansen and his co-authors argue that, "despite a good start", it seems that we have become distracted because
nanotechnology is being overseen by the same government organizations that promote it;
research strategies are not leading to clear answers to critical questions;
collaborations continue to be hampered by disciplinary and institutional barriers; and
stakeholders are not being fully engaged.
"In part this is attributable to bureaucratic inertia," says Hansen, "but comments from some quarters – such as 'risk research jeopardizes innovation' or 'regulation is bad for business' – only cloud the waters when clarity of thought and action are needed."
The authors concede that the picture is not as bleak as it could be: "Although progress towards developing sustainable nanotechnologies is slow, we do seem to have learnt some new tricks: asking more critical questions early on; developing collaborations that cross discipline, department and international boundaries; beginning the process of targeting research to developing relevant knowledge; engaging stakeholders; and asking whether existing oversight mechanisms are fit for purpose. But are we doing enough?"
Interestingly, they conclude their commentary with a cautionary, tentative 'perhaps': "If we are to realize the commercial and social benefits of nanotechnology without leaving a legacy of harm, and prevent nanotechnology from becoming a lesson in what not to do for future generations, perhaps it is time to go back to the classroom and relearn those late lessons from early warnings."
Shouldn't it be a basic requirement for our regulators and science community to heed lessons from the past?