In an earlier Spotlight we reported on NIOSH's Nanotechnology Research Center (NTRC) and its efforts concerning the occupational safety and health perspectives of engineered nanomaterials (Nanotechnology in the workplace). Today, we are looking at the specific steps undertaken by companies active in the field. "We were receiving a steady stream of questions from industry and academia regarding what we knew about the hazards of nanomaterials," Charles L. Geraci, Branch Chief at the National Institute for Occupational Safety and Health (NIOSH) and Co-Coordinator of the NIOSH Nanotechnology Field Team, tells Nanowerk. "People were coming to NIOSH for recommendations; we knew we needed to have a better understanding of the nature of workplace exposure during research, production and use." But, in a new and relatively little-studied area of industry, where does one find these answers? NIOSH already had a strong research program to address questions in the lab, says Geraci, but field data was needed to have a complete picture. "In our minds, the best way to achieve this was to do what NIOSH does best: get in the field and gather data through observation and measurement." In 2006, the concept of a field team dedicated to this effort was developed.
Nanoimprint lithography (NIL) has developed into a key technique for the fabrication of polymer nanopatterns and three-dimensional (3D) nanostructures. At its core, NIL is a simple nanometer-scale pattern transfer process in which a master mold with a desired pattern is pressed into an imprint resist, typically a polymer, which is then cured by heat or light. The attractiveness of NIL comes from its capability for patterning with high resolution, high fidelity, high throughput, and low cost. Using NIL, nanometer-sized patterns can easily be formed on various substrates, including silicon wafers, glass plates, flexible polymer films, and even nonplanar substrates. The limitation of conventional NIL techniques lies in the resulting patterned 2D layers; the formation of 3D micro- and nanostructures by stacking these 2D layers cannot be achieved by conventional NIL. That's why researchers came up with reverse nanoimprint lithography, a technique to transfer patterned 2D layers and form multistacked 3D micro- and nanostructures on the substrate. While this works in principle, the achievable yields are very low due to the difficulty of detaching the master mold from the 3D structure. Researchers in South Korea have now demonstrated the first successful fabrication of multistacked 2D nanopatterned slabs on various substrates, including flexible polymer film. This means real 3D nanostructures such as photonic crystals can be fabricated at reasonable cost.
Individual nanoscale building blocks, such as nanoparticles, nanosheets, nanowires or nanotubes, display unique and unusually impressive mechanical properties. These mechanical properties of nanomaterials cannot be extrapolated from their bulk properties, and scientists are still busily exploring the nanoscale behavior of various materials. Once the nanoscale properties of a material are known, the next problem is how to practically exploit certain properties and transfer them back into a macroscale structure of a bulk material. Materials engineers would love to transfer the exceptional mechanical properties, such as tensile strength and Young's modulus (a measure of stiffness that reflects the resistance of a material to elongation), of nanoscale materials into nanocomposites that could then be used to build much tougher, lighter and more flexible materials than anything we know today - say a paper-thin sheet of nanocomposite material that is transparent, flexible, yet as strong as steel. So far, effective load transfer and homogeneous dispersion appear to be the key issues in taking advantage of the extraordinary properties of nanomaterials for mechanical reinforcement applications. New research coming out of the University of Michigan has resulted in nanocomposite materials with a very high content of inorganic component and nearly perfect stress transfer. The stiffness and tensile strength of these multilayer composites are an order of magnitude greater than those of analogous nanocomposites, at a processing temperature that is much lower than those of ceramic or polymer materials with similar characteristics.
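For readers unfamiliar with the term, Young's modulus is simply the textbook ratio of tensile stress to strain in the elastic regime (this definition is general, not specific to the Michigan work):

\[
E = \frac{\sigma}{\varepsilon} = \frac{F/A}{\Delta L / L_0}
\]

where \(F/A\) is the applied force per unit cross-sectional area and \(\Delta L / L_0\) is the fractional elongation of the sample. A stiffer material needs more stress for the same strain, hence a larger \(E\).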
Modern nanotechnology researchers not only borrow extensively from nature to develop new materials and fabrication techniques, they also manage to transfer proven, and sometimes ancient, technologies into their nanotechnology laboratories. We've written about this before in our stories about welding ("Bronze Age technique works just fine in the nanotechnology era") and metal forging ("From Bronze Age shack to nanotechnology lab - metal forging techniques reach the bottom"). Today, our story deals with yet another ur-technology: spinning. Spinning is the process of creating yarn (or thread, rope, cable) from various raw fiber materials. The first spinning wheel was invented in India probably some 2,500-3,000 years ago, although some claim that the Chinese used similar devices as long as almost 5,000 years ago to spin silk threads. While spinning is one of the core technology foundations of our civilization, researchers have now begun to apply cotton-spinning techniques to fabricate carbon nanotube (CNT) "yarn."
Shuttles - whether the space shuttle, an airport shuttle bus, or a loom shuttle - basically do one thing: they transport cargo (astronauts, passengers, thread) from one point to another on a controlled route. Although not always called shuttles, the basic concept is critical to modern transportation systems and is used by nearly every society. The concept of the shuttle has been used for centuries, from Egyptian barges to Roman roads and canals. Even before these inventions, however, nature employed molecular shuttles in biological organisms. In molecular shuttles, kinesin proteins propel cargo (such as organelles) along hollow tubes called microtubules. Cells use these motors to transport cargo to highly specific destinations, in order to regulate levels of macromolecules and processes, much like a train along a track. Using biological motors to transport and precisely distribute cargo requires a clear understanding of how molecular shuttles pick up and deliver specific payloads. However, scientists are challenged by the need to better control the interactions along the route, so that cargo remains in place when not needed but can be picked up and transported to a specific location when it is. Researchers in Switzerland have now built nanoscale cargo loading stations and shuttles, an important step towards assembly lines for nanotechnology.
The need for 3D visualization and analysis at high spatial resolution is likely to increase as nanoscience and nanotechnology become increasingly important, and nanotomography could play a key role in understanding structure, composition and physico-chemical properties at the nanoscale. Scientists from the Electron Microscopy Group at the University of Cambridge in the UK report that nanotomography is becoming an important tool in the study of the size, shape, distribution and composition of various materials, including nanomaterials.
In our Spotlight from a few weeks ago - Nanopyramids - temporary resting places for light - we wrote about things like Q-factors, qubits, stopping light and other fascinating concepts and emerging techniques that could lead to quantum computing. Some of the feedback we received could be summarized with "Huh? Q-what...?" So today we'll take another look at the Quality (Q) factor of photonic-crystal nanocavities and the context in which it is relevant. These 'nanocages' for light are currently the focus of much interest in nanotechnology research in photonics because they can strongly confine photons in a tiny space. Just as semiconductor crystals control the flow of electrons (the basis for all electronics), photonic crystals are unique materials used to construct photonic devices and circuits for manipulating light, i.e. photons. A prominent example of a photonic crystal is the naturally occurring gemstone opal. Photons (behaving as waves) propagate through it - or not - depending on their wavelength. Wavelengths of light (streams of photons) that are allowed to travel through the crystal are known as "modes"; disallowed bands of wavelengths are called photonic band gaps. What's so interesting for researchers is that, once you are able to fully control and manipulate photons, you could not only vastly improve existing applications like optical data storage, high-precision sensing and telecommunications, but also develop exotic technologies like quantum computing.
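For those still asking "Q-what?": the quality factor is a standard, general definition from resonator physics (not specific to the nanocavities discussed here), measuring how long a cavity stores a photon relative to its oscillation period:

\[
Q = \omega_0 \frac{U}{P_{\text{loss}}} = \omega_0 \tau \approx \frac{\lambda_0}{\Delta\lambda}
\]

where \(\omega_0\) is the resonance frequency, \(U\) the stored energy, \(P_{\text{loss}}\) the dissipated power, \(\tau\) the photon lifetime in the cavity, \(\lambda_0\) the resonance wavelength and \(\Delta\lambda\) the linewidth of the resonance. A higher Q therefore means photons are trapped longer and the resonance is spectrally sharper, which is exactly why high-Q nanocavities are attractive for strongly confining light.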
Every few months you can read about a recall of food items, be it fresh spinach or bottled infant formula, due to contamination with salmonella, E. coli or some other foodborne pathogen. A pathogen is an organism that causes disease in another organism. The Centers for Disease Control and Prevention (CDC) estimates that foodborne diseases cause 76 million illnesses, 325,000 hospitalizations, and 5,000 deaths each year. In 2000, the U.S. Department of Agriculture (USDA) estimated the costs associated with five major bacterial foodborne pathogens to be $6.9 billion. The Food and Drug Administration's 2005 Food Code states that the estimated cost of foodborne illness is $10-$83 billion annually. These staggering numbers, not to mention the potential use of foodborne pathogens in terrorist attacks, are driving the development of biosensors as important analytical tools for the rapid detection of pathogens in the field, and they are increasingly playing a key role in controlling disease outbreaks. Immunosensors (biosensors that use antibodies as biological recognition elements) are of great interest because of their broad applicability (any compound can be analyzed as long as specific antibodies are available) and high sensitivity. High sensitivity, of course, is an important attribute in designing biosensors, but a large variance due to stochastic interaction between biomolecules, biosensor imperfections, and environmental variability (e.g., pH of the analyte) directly affects the reliability of the measured signal in almost all sensors. Researchers have now taken forward error-correction (FEC) techniques, already applied successfully to improve the reliability of communication and storage systems such as CDs, and applied them to biosensors.
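To give a flavor of what forward error-correction means in practice, here is a minimal sketch of the simplest FEC scheme, a (3,1) repetition code with majority-vote decoding. This is purely illustrative of the FEC idea: the researchers' biosensor work uses its own coding scheme, and the function names below are hypothetical.

```python
def encode(bits):
    """(3,1) repetition code: send each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote over each group of three received bits,
    correcting any single-bit error per codeword."""
    out = []
    for i in range(0, len(coded), 3):
        group = coded[i:i + 3]
        out.append(1 if sum(group) >= 2 else 0)
    return out

# A noisy "sensor channel" flips one bit in the first codeword;
# majority voting still recovers the original reading.
message = [1, 0, 1, 1]
received = encode(message)
received[1] ^= 1          # single-bit error
assert decode(received) == message
```

The redundancy costs bandwidth (three transmitted bits per data bit) but lets the receiver correct errors without asking for a retransmission, which is the property that makes FEC attractive for one-way, noise-prone measurement channels like a biosensor readout.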