Showing Spotlights 1953 - 1960 of 2275 in category All (newest first):
One of the problems nanoscientists encounter in their forays into the nanoworld is the issue of accurate temperature measurement. Ever since Galileo Galilei invented a rudimentary water thermometer in 1593, accurate temperature measurement has been a challenging research topic, and thermosensing technologies have become a field in their own right. Now that this technology has reached the nanoscale, the measurement of temperature gradients is becoming essential in areas such as thermoelectricity, nanofluidics, computer chip design, and hyperthermal treatment of cancer. Currently there is no established method for measuring temperature gradients at the nanoscale. Most of today's probes are traditional bulk probes, the kind that is inserted into a sample to measure its temperature. Liquid crystal films, which change color depending on temperature, also have at least microscale thickness and lateral dimensions. A recent review addresses these issues and gives an overview of current and future developments in nanoscale temperature measurement technologies.
Jun 1st, 2007
One of the most common methods of film manufacture is blown film extrusion. The process, by which most commodity and specialized plastic films are made for the packaging industry, involves extruding a plastic through a circular die, followed by "bubble-like" expansion. The resulting thin tubular film can be used directly or slit to form a flat film. Nanoscientists have now found a way to use this very common and efficient industrial technology to potentially solve the problem of fabricating large-area nanocomposite films. Currently, the problems with making thin-film assemblies are either the production cost of complex techniques like wet spinning or the unsatisfactory results of unevenly distributed and clumping nanoparticles within the film. The new bubble film technique yields well-aligned, controlled-density nanowire and carbon nanotube (CNT) films over large areas. These findings could finally open the door to affordable and reliable large-scale assembly of nanostructures.
May 31st, 2007
The human body is, so far, the ultimate 'wet computer' - a highly efficient, biomolecule-based information processor that relies on chemical, optical and electrical signals to operate. Researchers are trying various routes to mimic some of the body's approaches to computing. Prominent among them is DNA computing, a form of computing that uses DNA and molecular biology instead of traditional silicon-based computer technologies (see our Spotlight: "Molecular automaton plays tic-tac-toe"). Not limiting themselves to DNA, "gooware" computer scientists attempt to exploit the computational capabilities of molecules in general. In doing so, they expect to realize faster (massively parallel), smaller (nanoscale), and more cost-efficient (energy-saving) information processing devices that are very distinct from today's silicon-based computers.
May 29th, 2007
It has been 25 years since the scanning tunneling microscope (STM) was invented, followed four years later by the atomic force microscope - and that is when nanoscience and nanotechnology really started to take off. Various forms of scanning probe microscopes based on these inventions are essential to many areas of today's research; scanning probe techniques have become the workhorse of nanoscience and nanotechnology research. Given this 25-year development timeframe, it is surprising that even today there is no generally accepted standard for scanning probe microscopy (SPM). There is no unified SPM terminology, nor is there a standard for data management and treatment, making access and processing of SPM data collected by different types of instruments an error-prone exercise. SPM standardization has only recently begun as part of an effort by the International Organization for Standardization (ISO), the largest developer of industrial standards. Meanwhile the development of SPM instruments and analysis software continues, further enlarging the already large family of scanning probe microscopy techniques.
May 25th, 2007
The semiconductor industry is on its way to 32 nm processor technology, expected to be commercialized around 2009, and the day might be near when transistors reach the limits of miniaturization at the atomic level and put an end to currently used fabrication technologies. Apart from the issues of interconnect density and heat dissipation, which some researchers hope to address with carbon nanotube-based applications, there is the fundamental issue of quantum mechanics that will kick in once chip features get down to around 4 nm. At that scale, semiconductor dimensions become so small that quantum effects dominate circuit behavior. Computer designers usually regard this as a bad thing because it allows electrons to leak to places where they are not wanted. In particular, the tunneling of electrons and holes - so-called quantum tunneling - will become too great for the transistor to perform reliable operations. The result would be that the two states of the switch become indistinguishable. Quantum effects can, however, also be beneficial. A group of researchers has now shown that a single bit of data can be stored on, and retrieved from, a single atom. Just don't expect this in your computer anytime soon, though.
May 24th, 2007
In our May 7 Spotlight "The potential and the pitfalls of nanomedicine" we took a general look at the potential implications of nanomedicine and addressed some ethical issues that arise as the technology develops. In part two of this article we take a closer look at emerging nanomedical techniques such as nanosurgery, tissue engineering, nanoparticle-enabled diagnostics, and targeted drug delivery. Again, the ethical issues inherent in these emerging medical technologies need to be considered. There are established principles for the ethical assessment of existing, conventional medical technologies, and a new research article examines whether and how these principles can be extended to nanomedicine.
May 23rd, 2007
There are quite a number of terms - bionics, biomimetics, biognosis, biomimicry, or even 'bionical creativity engineering' - that refer to more or less the same thing: the application of methods and systems found in nature to the study and design of engineering systems and modern technology. A relatively new entry on this list is 'nanomimetics', an area of biomimetic nanotechnology that tries to duplicate what nature has been doing on this planet for billions of years - creating and manipulating complex nanoscale structures. Nanoscientists and nanotechnology researchers use the terms 'self-assembly' and 'bottom-up fabrication' in their efforts to copy the best nanotechnologist around - Mother Nature. Taking another small step in this direction, researchers have now reported the accurate replication of fragile biological nanoscale shapes normally associated with self-assembly, using a robust top-down lithographic technology. The ability to replicate biological shapes with nanoscale precision could have profound implications for tissue engineering, cell scaffolding, drug delivery, sensors, imaging, and immunology.
May 22nd, 2007
In 2005, researchers in the Netherlands developed the concept of a "molecular printboard" (named for its parallels with a computer motherboard) - a monolayer of host molecules on a solid substrate to which guest molecules can be attached with control over position, binding strength, and binding dynamics. Molecules can be positioned on the printboard using supramolecular contact printing and supramolecular dip-pen nanolithography; in this way, nanoscale patterns can be written and erased on the printboard. This technique, which combines top-down fabrication (lithography) with bottom-up methods (self-assembly), has now been applied to proteins. The resulting "protein printboards", which allow the capture and immobilization of proteins with precise control over specificity, strength and orientation, enable the fabrication of protein chips for applications in proteomics. They could play a major role in unraveling the human protein map, just as special chips were instrumental in mapping human DNA.
May 21st, 2007