Diesel-burning engines are a major contributor to environmental pollution. They emit a mixture of gases and fine particles containing some 40 chemicals, most of them toxic, including benzene, butadiene, dioxin and mercury compounds. Diesel exhaust is listed as a known or probable human carcinogen by several state and federal agencies in the United States. Wouldn't it be nice if we could render diesel soot harmless before it gets released into the environment? Wouldn't it be even nicer if we could use this soot to manufacture something useful? Japanese scientists have come up not only with a unique technique for effectively collecting diesel soot but also with a method for using this soot as a precursor for the production of single-walled carbon nanotubes. How is that for a practical example of green nanotechnology?
One of the problems nanoscientists encounter in their forays into the nanoworld is the issue of accurate temperature measurement. Ever since Galileo Galilei invented a rudimentary water thermometer in 1593, accurate temperature measurement has been a challenging research topic, and thermosensing technologies have become a field in their own right. Now that the technology has reached the nanoscale, temperature gradients are becoming essential in areas such as thermoelectricity, nanofluidics, computer chip design, and hyperthermal cancer treatment. Currently there is no established method for measuring temperature gradients at the nanoscale. Most of today's probes are traditional bulk probes, the kind that gets inserted into a sample and measures its temperature. Liquid crystal films, which change color with temperature, also have at least microscale thickness and lateral dimensions. A recent review addresses these issues and gives an overview of current and future developments in nanoscale temperature measurement technologies.
One of the most common methods of film manufacture is blown film extrusion. The process, by which most commodity and specialized plastic films are made for the packaging industry, involves extrusion of a plastic through a circular die, followed by "bubble-like" expansion. The resulting thin tubular film can be used directly or slit to form a flat film. Nanoscientists have now found a way to use this very common and efficient industrial technology to potentially solve the problem of fabricating large-area nanocomposite films. Currently, the problems with making thin film assemblies are either the production cost of complex techniques like wet spinning or the unsatisfactory results of unevenly distributed, clumping nanoparticles within the film. The new bubble film technique yields well-aligned, controlled-density nanowire and carbon nanotube (CNT) films over large areas. These findings could finally open the door to affordable and reliable large-scale assembly of nanostructures.
The human body is, so far, the ultimate 'wet computer' - a highly efficient, biomolecule-based information processor that relies on chemical, optical and electrical signals to operate. Researchers are trying various routes to mimic some of the body's approaches to computing. Prominent among them is DNA computing, a form of computing that uses DNA and molecular biology instead of traditional silicon-based computer technologies (see our Spotlight: "Molecular automaton plays tic-tac-toe"). Not limiting themselves to DNA, "gooware" computer scientists attempt to exploit the computational capabilities of molecules in general. In doing so, they expect to realize faster (massively parallel), smaller (nanoscale), and more cost-efficient (energy-saving) information processing devices that are very different from today's silicon-based computers.
It has been 25 years since the scanning tunneling microscope (STM) was invented, followed four years later by the atomic force microscope, and that's when nanoscience and nanotechnology really started to take off. Various forms of scanning probe microscopes based on these discoveries are essential for many areas of today's research. Scanning probe techniques have become the workhorse of nanoscience and nanotechnology research. Given this 25-year development timeframe, it is surprising that even today there is no generally accepted standard for scanning probe microscopy (SPM). There is no unified SPM terminology, nor is there a standard for data management and treatment, making access and processing of SPM data collected by different types of instruments an error-prone exercise. SPM standardization has only recently begun as part of an effort by the International Organization for Standardization (ISO), the largest developer of industrial standards. Meanwhile the development of SPM instruments and analysis software continues, expanding the already large family of scanning probe microscopy techniques.
The semiconductor industry is on its way to 32 nm processor technology, expected to be commercialized around 2009, and the day might be near when transistors reach the limits of miniaturization at atomic levels, putting an end to the currently used fabrication technologies. Apart from the issues of interconnect density and heat dissipation, which some researchers hope to address with carbon nanotube-based applications, there is the fundamental issue of quantum mechanics, which will kick in once chip design gets down to around 4 nm. At that scale, semiconductor dimensions become so small that quantum effects dominate circuit behavior. Computer designers usually regard this as a bad thing because it allows electrons to leak to places where they are not wanted. In particular, the tunneling of electrons and holes - so-called quantum tunneling - will become too great for the transistor to perform reliable operations. The result would be that the two states of the switch could become indistinguishable. Quantum effects can, however, also be beneficial. A group of researchers has now shown that a single bit of data might be stored on, and again retrieved from, a single atom. Just don't expect this in your computer anytime soon, though.
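To see why shrinking dimensions make this leakage unavoidable, a rough textbook estimate helps (this WKB-style relation is an illustrative sketch, not taken from the research discussed here). The probability that an electron tunnels through an energy barrier of height $V_0 - E$ and width $d$ falls off exponentially with the width:

```latex
T \;\approx\; \exp\!\left(-\frac{2d}{\hbar}\sqrt{2m\,(V_0 - E)}\right)
```

For a barrier on the order of 1 eV, the decay constant $2\sqrt{2m(V_0 - E)}/\hbar$ works out to roughly 10 per nanometer, so every nanometer shaved off an insulating layer increases the tunneling transmission by several orders of magnitude - which is why leakage that is negligible at tens of nanometers becomes dominant as dimensions approach 4 nm.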
In our May 7 spotlight "The potential and the pitfalls of nanomedicine" we took a general look at the potential implications of nanomedicine and addressed some ethical issues that arise as the technology develops. In part two of this article we now take a closer look at emerging nanomedical techniques such as nanosurgery, tissue engineering, nanoparticle-enabled diagnostics, and targeted drug delivery. Again, the ethical issues inherent in these emerging medical technologies need to be considered. There are established principles for the ethical assessment of existing, conventional medical technologies, and a new research article examines if and how these principles can be extended to nanomedicine.
There are quite a number of terms such as bionics, biomimetics, biognosis, biomimicry, or even 'bionical creativity engineering' that refer to more or less the same thing: the application of methods and systems found in nature to the study and design of engineering systems and modern technology. A relatively new entry in this list is 'nanomimetics', an area of biomimetic nanotechnology that tries to duplicate what nature has been doing for billions of years on this planet - creating and manipulating complex nanoscale structures. Nanoscientists and nanotechnology researchers use the terms 'self-assembly' and 'bottom-up fabrication' in their efforts to copy the best nanotechnologist around - Mother Nature. Taking another small step in this direction, researchers have now reported the accurate replication of fragile biological nanoscale shapes normally associated with self-assembly by using a robust top-down lithographic technology. The ability to replicate biological shapes with nanoscale precision could have profound implications in tissue engineering, cell scaffolding, drug delivery, sensors, imaging, and immunology.