Two of the major challenges of our modern, mobile society are the shrinking of available fossil energy resources on one hand and the climate change associated with global warming on the other. Continuing population growth, multiplied by rising consumption and living standards, especially in developing countries, will require more and more oil, coal and natural gas to 'power' humanity. Notwithstanding efforts like the Kyoto Protocol - which was never ratified by the U.S. and imposed no binding emission targets on China, two of the major CO2 polluters - an ever-increasing rate of fossil fuel usage means that rising CO2 emissions are likely to accelerate the climate change that is already in progress. Transportation, in particular passenger cars, is one of the areas where new technology could lead to environmentally beneficial change. Never mind that GM is still selling 15,000-20,000 Hummers a year, or that Tata is planning to sell millions of its new Nano car. One of the much-touted technological solutions is to substitute fossil-hydrocarbon-based energy with energy from carbon-free sources such as the sun, nuclear fission, or the hot interior of the Earth, and to use hydrogen as an energy carrier. Hydrogen can be produced from water using energy from carbon-free sources and can serve as fuel in fuel cells to generate electricity, either in stationary installations or on board vehicles. Considerable research effort is going into the evaluation of various nanostructures, such as carbon nanotubes, to find the most suitable hydrogen storage materials.
Toxicology is an interdisciplinary research field concerned with the study of the adverse effects of chemicals on living organisms. It applies knowledge, methods and techniques from fields such as chemistry, physics, materials science, pharmacy, medicine and molecular biology. Toxicology established itself over the last 25-30 years as a testing science in the course of industrial nations' efforts to regulate toxic chemicals. Particle toxicology, as a subdiscipline, developed in the context of lung disease arising in mining-industry workers from inhalation exposure to dust particles; it later expanded to the area of air pollution. With the rapid development of nanotechnology applications and materials, nanotoxicology is emerging as an important subdiscipline of both nanotechnology and toxicology. Most, if not all, toxicological studies on nanoparticles rely on the methods, practices and terminology developed in the analysis of micro- and ultrafine particles and mineral fibers. Together with recent studies on nanoparticles, this has provided an initial basis for evaluating the primary issues in a risk assessment framework for nanomaterials. However, current toxicological knowledge about engineered nanoparticles is extremely limited, and traditional toxicology does not allow for a complete understanding of how the size, shape, composition and aggregation state of nanostructures govern their interactions with biological systems. An understanding of the relationship between the physical and chemical properties of nanostructures and their in vivo behavior would provide a basis for assessing toxic response and, more importantly, could lead to predictive models for assessing toxicity.
Most people, when they hear the word semiconductor, will think of its role in computers. However, semiconductors also absorb light: some absorb in the visible and thus appear colored, e.g. gray silicon, while others absorb in the UV, such as titanium dioxide, and thus appear white (in microparticulate form) or colorless (in nanoparticulate form). This light-absorbing feature is used to drive electrons around a circuit in photovoltaic cells, such as the silicon solar cell, but it can also be used to drive chemical reactions at the surface. A good example of the latter is the use of thin (15 nm) titanium dioxide film coatings on self-cleaning glass. Upon absorbing the UV component of sunlight, these films are able to reduce oxygen present in air to water and to oxidize any organic material on their surface to its mineral constituents, thereby keeping the surface clean. Researchers in the UK have used this oxidation feature to develop an irreversible solvent-based blue ink which, upon activation with UV light, loses all its color and becomes oxygen sensitive; it regains its original color only upon exposure to oxygen. A major application area for this oxygen ink is food packaging, where it could be used to detect a modified atmosphere inside food containers.
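The visible-versus-UV distinction above comes down to simple arithmetic: a semiconductor absorbs photons whose wavelength is shorter than hc/E_gap. A minimal sketch, using common literature band-gap values (roughly 1.1 eV for silicon, 3.2 eV for anatase titanium dioxide) that are assumptions here, not figures from the article:

```python
# Why silicon looks dark but titanium dioxide looks white or colorless:
# a semiconductor absorbs photons with energy above its band gap, i.e.
# wavelengths shorter than lambda = h*c / E_gap.

H_C_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def absorption_onset_nm(band_gap_ev: float) -> float:
    """Longest wavelength (in nm) the material can absorb."""
    return H_C_EV_NM / band_gap_ev

si_onset = absorption_onset_nm(1.1)    # ~1127 nm: all visible light absorbed
tio2_onset = absorption_onset_nm(3.2)  # ~387 nm: only UV absorbed

print(f"Si onset:   {si_onset:.0f} nm (absorbs visible light -> looks dark)")
print(f"TiO2 onset: {tio2_onset:.0f} nm (visible passes -> looks white/clear)")
```

Since visible light spans roughly 400-700 nm, silicon's onset lies well beyond it (it absorbs the whole visible range), while titanium dioxide's onset sits just inside the UV, which is why sunlight's UV fraction is enough to drive the self-cleaning chemistry.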
One statement of the second law of thermodynamics is that the efficiency of any heat engine or other thermodynamic process is always less than 100%. There will always be some type of friction or other inefficiency that generates waste heat. The useful work that a heat engine can perform will therefore always be less than the energy put into the system. Engines must be cooled, as a radiator cools a car engine, because they generate waste heat. While there is no way around the second law of thermodynamics, the performance of today's power generation technology is quite appalling. The average efficiency of fossil-fired power generation today is about 35% for coal, 45% for natural gas and 38% for oil. By the way, be skeptical when people tell you that nuclear power is good in the fight against global warming - nuclear power plants have a worse thermal efficiency (30-33%) than fossil-fired plants. Approximately 90% of the world's power is generated in such a highly inefficient way. In other words: some 15 billion kilowatts of heat is dumped into the atmosphere during power generation (talk about fueling global warming...). This is roughly the same as the total power consumption of the world in 2004. Reducing these inefficiencies would go a long way toward solving the coming energy and climate problems. Thermoelectric materials - which can directly convert heat into electricity - could potentially convert part of this low-grade waste heat. The problem is that good thermoelectric materials are scarce, and so far solid-state heat pumps have proven too inefficient to be practical. Two papers in this week's Nature describe how silicon devices could in principle be adapted and possibly scaled up for this purpose.
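The waste-heat problem follows from simple arithmetic: a plant with thermal efficiency eta rejects (1 - eta) / eta units of heat for every unit of electricity it delivers. A minimal sketch, using the approximate efficiencies quoted above:

```python
# How much heat is rejected per unit of electricity delivered, for a
# plant with thermal efficiency eta. Efficiency values are the rough
# averages quoted in the text above.

def waste_heat_per_unit_electricity(efficiency: float) -> float:
    """Units of heat rejected per unit of electrical output."""
    return (1.0 - efficiency) / efficiency

for fuel, eta in [("coal", 0.35), ("natural gas", 0.45),
                  ("oil", 0.38), ("nuclear", 0.32)]:
    ratio = waste_heat_per_unit_electricity(eta)
    print(f"{fuel:12s} eta={eta:.0%}: {ratio:.2f} units of heat "
          f"per unit of electricity")
```

The coal figure, for example, works out to about 1.86 units of heat dumped per unit of electricity generated, which is why even a few percentage points of efficiency gain, or partial recovery via thermoelectrics, matter so much at grid scale.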
Transistors are the key elements of many types of electronic (bio)sensors. Since the discovery that individual carbon nanotubes (CNTs) can be used as nanoscale transistors, researchers have recognized their outstanding potential for electronic detection of biomolecules in solution, possibly down to single-molecule sensitivity. To detect biologically derived electronic signals, CNTs are often - but not always - functionalized with linkers such as proteins and peptides to interface with soluble biologically relevant targets (the linkers need not be conductive as long as they can localize the target molecule in the close vicinity of the tube). Although CNT transistors have been used as biosensors for some years now, the theoretically possible single-molecule sensitivity has not yet been reached. One factor hampering the full exploitation of these promising nanosensors is that the sensing mechanism is still not well understood. A variety of different sensing mechanisms have been suggested previously, but various studies contradict one another and the mechanism has remained under debate. Researchers in The Netherlands - through modeling and specific control experiments - have now succeeded in identifying the sensing mechanism. They found that the majority of their experiments can be explained by a combination of electrostatic gating and Schottky barrier effects. Because these two mechanisms have different gate-potential dependences, the choice of gate potential can strongly affect the outcome of real-time biosensing experiments.
If you grew up in the '80s, chances are you are familiar with the addictive game 'Pipe Dream', in which a plumber tries to lay pipe before the water flowing through it can overwhelm him. Get ready to play this at the nanoscale (well, kind of). Although the possibility of connecting carbon nanotubes has been an intriguing one for nanotechnology researchers, realizing this feat had proven difficult. Since a carbon nanotube (CNT) is essentially a rolled-up sheet of graphene - the single-atom-thick carbon layer that stacks to form graphite - the tubes have been believed to be hard and brittle. Therefore there was not much interest in trying to shape or form them. Scientists in Japan have taken a novel approach to this problem and indeed succeeded in shaping and connecting carbon nanotubes like water pipes. Their simple method will allow longer and multi-branched CNTs with serial junctions to be made by repeated joining, and may find use in a variety of applications. Not only does the bottom-up engineering of nanotube structures become possible (e.g. simply increasing their aspect ratio), but it could pave the way to an entirely new class of bottom-up-engineered nanostructures and integrated carbon nanotube devices.
Ever since the nanoworld got excited over carbon nanotubes, there has been great interest, and progress, in the development of new nanotubes based on metal oxides, sulfides, nitrides, elemental species and other materials. The characteristic all these tubular structures have in common is a hollow morphology, with a circular, square-like or hexagonal cross-section. In a standard tubular structure, the cavity is located at the center and extends over the entire length, so that the tube cavity and the tube wall share the same symmetry axis. Structures in which the internal cavity deviates strongly from the center of symmetry towards one side are rather rare. Researchers have now synthesized novel, unconventional nanotubes that are distinctly different from any previously reported nano- and microtubes. These tubes display flattened, thin belt- or ribbon-like morphologies, which are not common for any known tubular structure. This may represent a new, interesting growth phenomenon for tubular crystal structures.
A memory chip is an integrated circuit made of millions of transistors and capacitors. In the most common form of computer memory, dynamic random access memory (DRAM), a transistor and a capacitor are paired to create one memory cell, which represents a single bit of data. The capacitor holds the bit of information, either a 0 or a 1. The transistor acts as a switch that lets the control circuitry on the memory chip read the capacitor or change its state. Because each bit stored in a chip is controlled by one transistor, memory capacities tend to expand at the same pace as the number of transistors per chip - which still follows Moore's Law and therefore currently doubles roughly every 18 months. The problem is that the capacitor - consisting of two charged layers separated by an insulator - can shrink only so far. The thinner insulators get, the more they allow charge to tunnel through. Tunneling increases the leakage current, and therefore the standby power consumption; eventually the insulator breaks down altogether. Researchers have therefore been trying to develop electromechanically driven switches that can be made small enough to serve as an alternative to transistor-switched silicon-based memory. Electromechanical devices are suitable for memory applications because of their excellent ON-OFF ratios and fast switching characteristics. With a mechanical switch there is physical separation of the switch from the capacitor, which makes the data leakage problem much less severe. Unfortunately, electromechanical switches have so far involved larger cells and more complex fabrication processes than silicon-based arrangements, and have therefore not offered a path to scaling beyond semiconductor transistors. Researchers have now reported a novel nanoelectromechanical (NEM) switched capacitor structure based on vertically aligned multiwalled carbon nanotubes (CNTs), in which the mechanical movement of a nanotube relative to a carbon-nanotube-based capacitor defines the ON and OFF states.
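The leakage problem described above can be illustrated with a toy model: treat the stored charge as decaying exponentially through a leakage path, and ask how soon the cell voltage drops below what the sense amplifier can still read as a '1'. All numbers here are hypothetical, chosen only to show the mechanism, not taken from the work described:

```python
import math

# Toy model of why DRAM needs refreshing: the cell capacitor leaks
# charge through the access transistor and insulator, so the stored
# voltage decays roughly exponentially with time constant tau = R*C.

def cell_voltage(v0: float, t_s: float, tau_s: float) -> float:
    """Stored voltage after t_s seconds under simple RC leakage."""
    return v0 * math.exp(-t_s / tau_s)

V0 = 1.0         # volts written for a logical '1' (hypothetical)
THRESHOLD = 0.5  # minimum voltage readable as '1' (hypothetical)
TAU = 0.2        # leakage time constant in seconds (hypothetical)

# Time until the bit becomes unreadable: solve V0*exp(-t/tau) = THRESHOLD.
t_fail = TAU * math.log(V0 / THRESHOLD)
print(f"Cell must be refreshed within {t_fail * 1000:.0f} ms")

# A thinner insulator means more tunneling, i.e. a smaller time constant,
# so the refresh deadline tightens in direct proportion:
t_fail_thin = (TAU / 10) * math.log(V0 / THRESHOLD)
print(f"With 10x higher leakage: {t_fail_thin * 1000:.1f} ms")
```

The point of the sketch is the scaling trend: every reduction in insulator thickness raises the leakage current, shortens the time constant, and forces more frequent refreshes, which is exactly the standby-power penalty a mechanically switched capacitor is meant to avoid.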