Nanotechnology research is outpacing an outdated peer-review publishing process

(Nanowerk Spotlight) Consider this: in fields like the nanosciences and nanotechnology, knowledge doubles in as little as five years, making parts of a student's education obsolete even before graduation. But while knowledge is growing exponentially, the established mechanism for getting it into the public domain has not changed much. This raises the question of whether the traditional model of publishing scientific papers is still adequate and able to keep up with the fast pace of the scientific world. It can take up to two years from the time a study is conducted to the publication of its findings in a peer-reviewed journal. By then, the underlying research might already be out of date.
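To put these two figures together – purely as a back-of-the-envelope illustration, using the five-year doubling time and the two-year publication delay cited above – exponential growth implies that by the time such a paper appears, the body of knowledge has already grown by a factor of 2^(2/5) ≈ 1.32, i.e. by roughly a third.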
Take, for example, a recent paper published in Environmental Science & Technology. Titled "Health and Safety Practices in the Nanomaterials Workplace: Results from an International Survey", it contributes to the growing body of discussion on this important issue.
This paper reports the findings of an international survey of nanomaterials firms and laboratories regarding their environmental health and safety programs, engineering controls, personal protective equipment, exposure monitoring, waste disposal, product stewardship, and risk beliefs.
But let's look at the timeline in detail: the paper was published online on April 1, 2008. The underlying survey, though, was conducted between June and September 2006 – almost two years ago. The sample of organizations to be contacted was drawn from an online database that contained 1,700 entries for nanotechnology organizations at the time (in comparison, the Nanowerk Link Directory as of today contains 3,324 entries). Since then there have been many developments in this area that were unknown to the participants at the time of the survey (see for instance "Protecting nanotechnology workers", or this overview: "Implementing successful voluntary nanotechnology environmental programs appears to be a challenge").
Without going into detail here, or criticizing the authors, the question has to be asked: how relevant can a reader assume this paper's conclusions to be in April 2008 – two years after the study was conceived and conducted – in a field that moves as fast as nanotechnology?
Of course it is naive to believe that researchers are always interested in publishing their research findings, or encouraged to do so, as soon as they have something to report. In today's research environment, where military research grants (at least in the U.S.), corporate funding and commercialization play such an important role, it is often much more important for researchers and their universities to secure patent claims first (some background: "Growing nanotechnology problems: navigating the patent labyrinth"). And scientists working for companies or military research outfits often forgo publication altogether for the sake of competitive advantage or military secrecy.
Nobody would dispute, though, that it is in everybody's interest for research on environmental, health and safety aspects, such as the study mentioned above, to get out as soon as possible so that it can be discussed, further developed, and acted upon.
So the real questions for the whole scientific paper publishing problem are these:
1) How can the publication cycle for nanoscience and nanotechnology research findings be sped up so that these papers reflect the current state of research more accurately and with as little delay as possible?
2) Are there alternative ways of getting the information out faster, i.e. new ways to publish raw data or partial findings prior to a full paper?
3) Are peer-reviewed scholarly papers actually still necessary at all?
In peer-reviewed journals, the peer-review process itself takes up a considerable amount of time and is the single biggest factor slowing down publication. While passing peer review is often considered in the scientific community to be a certification of validity, the process is not without its problems, and the question is whether, in its current form, it is more of a benefit or a burden.
Scholarly writers know that the peer-review system of scientific publications is not foolproof. Drummond Rennie, MD, deputy editor of the Journal of the American Medical Association, has written: "There seems to be no study too fragmented, no hypothesis too trivial, no literature too biased or too egotistical, no design too warped, no methodology too bungled, no presentation of results too inaccurate, too obscure, and too contradictory, no analysis too self-serving, no argument too circular, no conclusions too trifling or too unjustified, and no grammar and syntax too offensive for a paper to end up in print." Peer review determines where rather than whether a paper should be published, Rennie says. (Source: "The Maharishi Caper: Or How to Hoodwink Top Medical Journals")
In practice, in some research fields much of the communication about new results already takes place through preprints posted to electronic servers such as arXiv.org, where they are available for free download. However, such preprints are often also submitted to refereed journals, and in many cases have, at the time of electronic posting, already passed through peer review and been accepted for publication.
Another aspect that will shape how scientific research is published in the future is that scientific publishing has features that sharply differentiate it from popular book publishing: research papers are written by specialists for specialists, and the authors receive no direct financial remuneration for them. They hand them to publishers only in order to disseminate the information to other scientists. The journal publishers, on the other hand, are in it for profit, and they charge quite hefty amounts for these journals, or even for individual papers.
Several initiatives to provide open access to peer-reviewed research articles and their preprints, such as arXiv, eScholarship Repository, or PLoS, could change this model dramatically. But as long as most scientists play by the established rules, it is hard to see how this change will happen.
One of the arguments by proponents of this 'open access' approach is that if scientific research is funded with public money then its results should be freely available in the public domain. But the problem is not just the way scientific findings are published but more fundamentally the difficulty of accessing data that has been published, be it in peer-reviewed journals or elsewhere.
Initiatives like Science Commons are tackling this problem. As the group writes on their website: "There are petabytes of research data being produced in laboratories around the world, but the best web search tools available can't help us make sense of it. Why? Because more stands between basic research and meaningful discovery than the problem of search.
"Many scientists today work in relative isolation, left to follow blind alleys and duplicate existing research. Data is balkanized – trapped behind firewalls, locked up by contracts or lost in databases that can't be accessed or integrated. Materials are hard to get – universities are overwhelmed with transfer requests that ought to be routine, while grant cycles pass and windows of opportunity close."
The result of efforts such as Science Commons' would be a much speedier translation of data into discovery – unlocking the value of research so more people can benefit from the work scientists are doing.
Some scientists are discussing whether a collaborative content-management approach would work for scientific research, the idea being that groups of scientists develop content repositories for their field of expertise and constantly update them with their latest research. The resulting discussions among group members could well replace the existing peer-review process. One could even argue that this would be better than the current peer-review process, since the peers would be self-selecting rather than handpicked by a journal editor, and the process would be completely transparent.
Commonly termed a 'wiki', such a content-management approach is a web-based effort to create a community of contributors who work on specific issues, adding and modifying content as they see fit. While the collaborative encyclopedia Wikipedia is one of the best-known wikis, scientific wikis could be set up as semi-closed communities involving researchers from a particular set of fields: only community members would be allowed to contribute and modify content, but that content would be available for all to see.
Going even further, and probably the most revolutionary proposal, would be the opening up of raw research data for discussion and use by the scientific community. Such an approach has strong parallels to the 'open source' movement in the software industry, where developers freely distribute source code and allow its use and modification. There are significant downsides, though, that need to be addressed: how relevant is copyright? What about the risk of plagiarism? Who benefits from patents arising from the work (or rather, who gets to patent new discoveries)?
If the underlying field of research is changing so fast that the traditional process of publishing findings cannot keep pace, it appears to be only a matter of time before the whole system undergoes a radical change.
By Michael Berger – Michael is the author of three books published by the Royal Society of Chemistry:
Nano-Society: Pushing the Boundaries of Technology,
Nanotechnology: The Future is Tiny, and
Nanoengineering: The Skills and Tools Making Technology Invisible
Copyright © Nanowerk LLC
