The human body is, so far, the ultimate 'wet computer' - a highly efficient, biomolecule-based information processor that relies on chemical, optical and electrical signals to operate. Researchers are trying various routes to mimic some of the body's approaches to computing. Prominent among them is DNA computing, a form of computing that uses DNA and molecular biology instead of traditional silicon-based computer technologies (see our Spotlight: "Molecular automaton plays tic-tac-toe"). Not limiting themselves to DNA, "gooware" computer scientists attempt to exploit the computational capabilities of molecules in general. In doing so, they expect to realize faster (massively parallel), smaller (nanoscale), and more cost- and energy-efficient information processing devices that are very distinct from today's silicon-based computers.
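The "massively parallel" promise of DNA computing is usually illustrated with Leonard Adleman's 1994 experiment, which solved a small Hamiltonian-path problem by letting vast numbers of DNA strands encode every candidate path at once and then chemically filtering out the invalid ones. The toy graph below is a hypothetical example (not Adleman's original instance), but the brute-force generate-then-filter logic is a reasonable in-silico sketch of the idea:

```python
from itertools import permutations

# A toy directed graph (hypothetical example, not Adleman's original instance)
edges = {("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")}
nodes = ["A", "B", "C", "D"]

def hamiltonian_paths(start, end):
    """Enumerate every vertex ordering (each one a candidate 'strand') and
    keep only those whose consecutive pairs are all edges -- the in-silico
    analogue of generating all candidate DNA molecules in one reaction and
    then chemically filtering out the ill-formed ones."""
    hits = []
    for perm in permutations(nodes):
        if perm[0] == start and perm[-1] == end and \
           all((a, b) in edges for a, b in zip(perm, perm[1:])):
            hits.append(perm)
    return hits

print(hamiltonian_paths("A", "D"))  # the surviving path(s) from A to D
```

In the wet-lab version, the enumeration step happens "for free" in a single ligation reaction, while the filtering conditions correspond to separation steps such as gel electrophoresis and affinity purification - which is where the massive parallelism comes from.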
It has been 25 years since the scanning tunneling microscope (STM) was invented, followed four years later by the atomic force microscope - the point at which nanoscience and nanotechnology really started to take off. The various forms of scanning probe microscope based on these inventions have become the workhorses of nanoscience and nanotechnology research, essential to many areas of today's work. Given this 25-year development timeframe, it is surprising that even today there is no generally accepted standard for scanning probe microscopy (SPM). There is no unified SPM terminology, nor is there a standard for data management and treatment, making the access and processing of SPM data collected by different types of instruments an error-prone exercise. SPM standardization has only recently begun as part of an effort by the International Organization for Standardization (ISO), the world's largest developer of industrial standards. Meanwhile, the development of SPM instruments and analysis software continues, further enlarging the already large family of scanning probe microscopy techniques.
The semiconductor industry is on its way to 32 nm processor technology, expected to be commercialized around 2009, and the day might be near when transistors reach the limits of miniaturization at atomic levels, putting an end to currently used fabrication technologies. Apart from the issues of interconnect density and heat dissipation, which some researchers hope to address with carbon nanotube-based applications, there is the fundamental issue of quantum mechanics, which will kick in once chip design gets down to around 4 nm - the point at which semiconductor dimensions become so small that quantum effects dominate circuit behavior. Computer designers usually regard this as a bad thing because it allows electrons to leak to places where they are not wanted. In particular, the tunneling of electrons and holes - so-called quantum tunneling - will become too great for the transistor to perform reliable operations; the two states of the switch could become indistinguishable. Quantum effects can, however, also be beneficial. A group of researchers has now shown that a single bit of data might be stored on, and retrieved from, a single atom. Just don't expect this in your computer anytime soon, though.
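To see why a few nanometers matter, consider the standard WKB estimate for an electron tunneling through a rectangular barrier, T ≈ exp(-2κL) with κ = √(2mΔE)/ħ. The sketch below is illustrative only - the 1 eV barrier height is an assumed round number, not a figure from the research discussed here - but it shows how leakage grows by many orders of magnitude as a barrier thins from 4 nm to 1 nm:

```python
import math

# Physical constants (SI units, CODATA values)
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # 1 electronvolt in joules

def tunneling_probability(barrier_ev, width_nm):
    """WKB estimate T ~ exp(-2*kappa*L) for an electron hitting a
    rectangular barrier `barrier_ev` (eV) above its energy and
    `width_nm` (nm) wide."""
    kappa = math.sqrt(2.0 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2.0 * kappa * width_nm * 1e-9)

# Leakage grows exponentially as the barrier shrinks from 4 nm to 1 nm
for width in (4.0, 2.0, 1.0):
    print(f"{width:.0f} nm barrier: T ~ {tunneling_probability(1.0, width):.2e}")
```

Because the transmission falls off exponentially with barrier width, thinning the insulating region by a factor of four raises the leakage probability by roughly thirteen orders of magnitude for this assumed barrier - which is why a switch's "on" and "off" states start to blur at these scales.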
In our May 7 Spotlight "The potential and the pitfalls of nanomedicine" we took a general look at the potential implications of nanomedicine and addressed some ethical issues that arise as the technology develops. In part two of this article we now take a closer look at emerging nanomedical techniques such as nanosurgery, tissue engineering, nanoparticle-enabled diagnostics, and targeted drug delivery. Again, the ethical issues inherent in these emerging medical technologies need to be considered. There are established principles for the ethical assessment of existing, conventional medical technologies, and a new research article examines whether and how these principles can be extended to nanomedicine.
There are quite a number of terms such as bionics, biomimetics, biognosis, biomimicry, or even 'bionical creativity engineering' that refer to more or less the same thing: the application of methods and systems found in nature to the study and design of engineering systems and modern technology. A relatively new entry in this list is 'nanomimetics', an area of biomimetic nanotechnology that tries to duplicate what nature has been doing for billions of years on this planet - creating and manipulating complex nanoscale structures. Nanoscientists and nanotechnology researchers use the terms 'self-assembly' and 'bottom-up fabrication' in their efforts to copy the best nanotechnologist around - Mother Nature. Taking another small step in this direction, researchers have now reported the accurate replication of fragile biological nanoscale shapes normally associated with self-assembly by using a robust top-down lithographic technology. The ability to replicate biological shapes with nanoscale precision could have profound implications in tissue engineering, cell scaffolding, drug delivery, sensors, imaging, and immunology.
In 2005, researchers in the Netherlands developed the concept of a "molecular printboard" (named for its parallels with a computer motherboard) - a monolayer of host molecules on a solid substrate on which guest molecules can be attached with control over position, binding strength, and binding dynamics. Molecules can be positioned on the printboard using supramolecular contact printing and supramolecular dip-pen nanolithography. In this way, nanoscale patterns can be written and erased on the printboard. This technique, which combines top-down fabrication (lithography) with bottom-up methods (self-assembly), has now been applied to proteins. The resulting "protein printboards", which allow the capture and immobilization of proteins with precise control over specificity, strength and orientation, enable the fabrication of protein chips for applications in proteomics. Such chips could play a major role in unraveling the human protein map, just as DNA chips were instrumental in mapping human DNA.
Throughout human history, technologies have usually involved some kind of "top-down" approach, whether chipping a stone axe from a larger rock or using micro- or nanolithography to etch smaller structures from larger entities. In contrast, the self-assembly of nano-objects exemplifies the fundamentally new "bottom-up" technological approach, which may soon provide novel fabrication processes and products with drastically improved properties. In particular, the self-assembly of colloidal nanocrystals makes it possible to obtain structures with a high level of ordering and permits the construction of patterns for use in optoelectronics, photonics and biosensing. What makes nanocrystals so attractive to researchers is the fact that the properties essential to the arrangement process - including their size, shape, surface protection, stabilization and charge - can be controlled along with the electronic structure of each nanocrystal. As an example, the researchers developed a "lab-in-a-drop" technique in which a variety of nanostructures with desired properties can be produced.
Nanofluidic channels, confining and transporting tiny amounts of fluid, are the pipelines that make the cellular activities of organisms possible. For instance, nanoscale channels carry nutrients into cells and waste out of them. Researchers are trying to mimic Nature by constructing nanochannels in order to manipulate single molecules in predominantly biomedical applications. Although nanochannels adjustable in size are prevalent in Nature, it is challenging to fabricate them artificially because of conflicting requirements: rigid structural integrity on one hand (to prevent collapse), and reconfigurability of nanometer-sized features on the other (to allow adjustability). Recent work at the University of Michigan addresses these issues and introduces methods to rapidly prototype structurally stable yet reconfigurable nanochannels. By fabricating tuneable elastomeric nanochannels for nanofluidic manipulation, the researchers were able to properly balance the competing needs for flexibility and rigidity.