Diatoms are a major group of hard-shelled algae and one of the most common types of phytoplankton. A characteristic feature of diatom cells is that they are encased within a unique cell wall made of silica. Silicate materials are widespread in nature and closely tied to the evolution of living organisms. Diatom walls show a wide diversity in form, some quite beautiful and ornate, but usually consist of two symmetrical sides with a split between them - hence the group name, from the Greek for "cut in two". There is great potential for the use of diatoms in nanotechnology. This potential lies in the pores and channels, which give rise to a greatly increased surface area, and in the silica structure, which lends itself to chemical modification. In addition, there is a huge variety in the sizes and shapes of diatoms available, providing scope for the selection of a particular species tailored to a particular requirement. Researchers in the UK have demonstrated that the silica walls of diatoms can be used for the attachment of active biomolecules, such as antibodies, using either primary amine groups or the carbohydrate moiety. These modified structures could therefore be used to build antibody arrays or in techniques such as immunoprecipitation.
One statement of the second law of thermodynamics is that the efficiency of any heat engine or other thermodynamic process is always less than 100%. There will always be some form of friction or other inefficiency that generates waste heat. The useful work that a heat engine can perform will therefore always be less than the energy put into the system. Engines must be cooled, as a radiator cools a car engine, because they generate waste heat. While there is no way around the second law of thermodynamics, the performance of today's power generation technology is quite appalling. The average efficiency today for fossil-fired power generation is 35% for coal, 45% for natural gas and 38% for oil. By the way, be skeptical when people tell you that nuclear power is good in the fight against global warming - nuclear power plants have a worse thermal efficiency (30-33%) than fossil-fired plants. Approximately 90% of the world's power is generated in this highly inefficient way. In other words: every year some 15 billion kilowatts of heat are dumped into the atmosphere during power generation (talk about fueling global warming...). This is roughly the same amount as the total power consumption of the world in 2004. Reducing these inefficiencies would go a long way toward solving the coming energy and climate problems. Thermoelectric materials - which can directly convert heat into electricity - could potentially recover part of this low-grade waste heat. The problem is that good thermoelectric materials are scarce, and so far solid-state heat pumps have proven too inefficient to be practical. Two papers in this week's Nature describe how silicon devices could in principle be adapted and possibly scaled up for this purpose.
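As a quick sanity check on the figures above, here is a minimal sketch (in Python) of how much of the fuel's energy each plant type rejects as waste heat. The efficiencies are the ones quoted in the text; the 32% used for nuclear is an assumed midpoint of the 30-33% range, not a value from the article:

```python
# Fraction of fuel energy rejected as waste heat at the quoted efficiencies.
# Values are the article's figures; 0.32 for nuclear is an assumed midpoint.
efficiencies = {
    "coal": 0.35,
    "natural gas": 0.45,
    "oil": 0.38,
    "nuclear": 0.32,
}

for fuel, eta in efficiencies.items():
    waste_fraction = 1.0 - eta  # second law: useful work < energy input
    print(f"{fuel:12s} {eta:.0%} efficient -> {waste_fraction:.0%} of fuel energy lost as heat")
```

Even the best case here (natural gas) throws away more than half of the fuel's energy, which is where the "fueling global warming" remark comes from.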
Just because hydrogen is a clean fuel doesn't mean that hydrogen production is a clean process. As more and more companies and investors jump onto the 'cleantech' bandwagon, hydrogen occupies an important place in this vision of a sustainable, carbon-free, and non-polluting energy future. If you look closer, though, you'll find that you are not always told the full story about "clean" hydrogen. The U.S. Department of Energy's Hydrogen Energy Roadmap foresees up to 90% of hydrogen production coming from fossil fuels - coal, gas and oil. In other words, a clean fuel is produced from the same dirty fuels that are causing all the problems we are facing today (read more in our recent Spotlight: Nanotechnology could clean up the hydrogen car's dirty little secret). Hydrogen can be produced in a clean way, of course, but the greatest challenge to clean hydrogen production is cost - so far, the cheapest way to produce hydrogen is from fossil fuels. And as long as the political will and the resulting large-scale funding aren't there, this won't change. Unfortunately, large-scale deployment of artificial water-splitting technologies looks unlikely given the need for large amounts of expensive precious metals - such as platinum, which currently costs about $45,000 per kilogram and will become scarce at some point in the future - required to catalyze the multi-electron water-splitting reactions. Intriguingly, there are mechanisms of biological hydrogen activation found in nature, and researchers have identified several microbes that can activate the dihydrogen bond through the catalytic activity of hydrogenases (enzymes that play a vital role in anaerobic metabolism). Scientists hope that these proteins could one day serve as catalysts for hydrogen production and oxidation in fuel cells.
So far, their efforts have been hampered by the difficulty of incorporating these enzymes into electrical devices because the enzymes do not form good electrical connections with fuel cell components. New research now demonstrates the first successful electrical connection between a carbon nanotube and hydrogenase.
Earlier this year, the Science Policy Council of the U.S. Environmental Protection Agency (EPA) issued the final version of its Nanotechnology White Paper. The purpose of this White Paper is to inform EPA management of the science issues and needs associated with nanotechnology, to support related EPA program office needs, and to communicate these nanotechnology science issues to stakeholders and the public. While this has been the most publicly visible EPA activity with regard to nanotechnology, it is less widely known that the EPA has, since 2002, been spending more than $25 million through its Science to Achieve Results (STAR) grants program on 86 projects researching the environmental aspects of nanotechnology. The projects are broadly grouped into two main categories: 1) nanotechnology applications - examining beneficial uses - where the areas of research include green manufacturing, contamination remediation, sensors for environmental pollutants, and waste treatment; and 2) nanotechnology implications - examining the potentially adverse health effects on humans and the environment - where research is grouped into five categories: aerosol, exposure assessment, fate and transport, life-cycle analysis, and toxicity.
Freshwater looks like it will become the oil of the 21st century - scarce, expensive and fought over. While over 70 per cent of the Earth's surface is covered by water, most of it is unusable for human consumption. According to the Government of Canada's Environment Department (take a look at their Freshwater Website - a great resource for facts on all aspects of water), freshwater in lakes, rivers and underground aquifers represents only 2.5 per cent of the world's total water supply. Unfortunately, in addition to being scarce, freshwater is also very unevenly distributed. The United Nations has compared water consumption with its availability and has predicted that by the middle of this century between 2 billion and 7 billion people will be faced with water scarcity. It gets worse: in the developing countries, 80 per cent of illnesses are water-related. Due to the shortage of safe drinking water in much of the world, there are 3.3 million deaths every year from diarrheal diseases caused by E. coli, salmonella and cholera bacterial infections, and from parasites and viral pathogens. In fact, between 1990 and 2000, more children died of diarrhea than all the people killed in armed conflicts since the Second World War. The use of nanotechnologies in four key water industry segments - monitoring, desalinization, purification and wastewater treatment - could play a large role in averting the coming water crisis. But hoping that the 'magic' of nanotechnology will solve all water problems is naive - the basic problems of accessibility to technologies, affordability, and fair distribution still need to be solved.
A revolutionary new environmental biotechnology - the microbial fuel cell - turns the treatment of organic wastes into a source of electricity. Fuel cell technology, despite its recent popularity as a possible solution for a fossil-fuel-free future, is actually quite old. The principle of the fuel cell was discovered by German scientist Christian Friedrich Schoenbein in 1838 and published in 1839. Based on this work, the first fuel cell was developed by Welsh scientist Sir William Robert Grove in 1843. The operating principle of a fuel cell is fairly straightforward: it is an electrochemical energy conversion device that converts the chemical energy of a fuel (on the anode side) and an oxidant (on the cathode side) directly into electricity. Today there are many competing types of fuel cells, distinguished by the kind of fuel and oxidant they use, and many combinations are possible. For instance, a hydrogen fuel cell uses hydrogen as fuel and oxygen as oxidant; other fuels include hydrocarbons and alcohols. An interesting - but not yet commercially viable - variant is the microbial fuel cell (MFC), in which bacteria oxidize compounds such as glucose, acetate or wastewater. Researchers in Spain have fabricated multi-walled carbon nanotube (MWCNT) scaffolds with a micro-channel structure in which bacteria can grow. This scaffold structure could be used as electrodes in microbial fuel cells.
Finding out how much power all the computers in the U.S., not to mention the world, are using seems to be an impossible task. We tried. The latest data from the Department of Energy (DoE) for household computer use is from 2001; for office use, from 1999. This is strange, because when you do some back-of-the-envelope calculations you arrive at some pretty staggering numbers. An estimated 1 billion computers in 2008 will use some 200 billion kWh of electricity (that's roughly what all households in New York City combined use over five years), generating about 127 million tonnes of CO2 in the process. And that's just for desktop and laptop computers, not including peripherals or the billions of chips used in other electronic devices. Researchers are now proposing to build a fully mechanical computer based on nanoelectromechanical systems (NEMS) components that would use considerably less energy. Inspired by a classical mechanical computer design from 200 years ago, the main motivation behind constructing such a computer is threefold: (1) mechanical elements are more robust to electromagnetic shocks than current dynamic random access memory (DRAM) based purely on complementary metal oxide semiconductor (CMOS) technology, (2) the power dissipated can be orders of magnitude below CMOS, and (3) the operating temperature of such a nanomechanical computer (NMC) can be an order of magnitude above that of conventional CMOS. Today, such a mechanical computer is only a hypothetical device. However, any effort to reduce the power consumption of computers, rather than increase it as happens with every new chip generation, seems worthwhile.
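The back-of-the-envelope numbers above can be reproduced in a few lines. The per-machine consumption of 200 kWh/year and the grid CO2 intensity of roughly 0.635 kg per kWh are not stated in the text; they are the values implied by its totals, so treat them as assumptions of this sketch:

```python
# Reproduce the article's back-of-envelope estimate for computer energy use.
# Assumed inputs (implied by the article's totals, not stated in it):
#   ~200 kWh/year per computer, ~0.635 kg CO2 per kWh of grid electricity.
computers = 1_000_000_000          # estimated computers in use, 2008
kwh_per_computer = 200             # assumed average annual consumption
co2_kg_per_kwh = 0.635             # implied grid-average CO2 intensity

total_kwh = computers * kwh_per_computer
co2_tonnes = total_kwh * co2_kg_per_kwh / 1000  # kg -> tonnes

print(f"Electricity: {total_kwh / 1e9:.0f} billion kWh/year")
print(f"CO2:         {co2_tonnes / 1e6:.0f} million tonnes/year")
```

Running this gives 200 billion kWh and 127 million tonnes of CO2, matching the figures quoted above.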
Back in January, when the U.S. president announced his hydrogen fuel initiative and proposed to spend a total of $1.7 billion over the next five years to develop hydrogen-powered fuel cells, hydrogen infrastructure and advanced automotive technologies, he said that it would be practical and cost-effective for large numbers of Americans to choose clean, hydrogen fuel cell vehicles by 2020. According to the U.S. Department of Energy's (DOE) Hydrogen Program, the government's goal is to achieve "technology readiness" by around 2015 in order to allow industry to make decisions on commercialization by then. That's only eight years to go. Given where the technology is today, this goal seems very ambitious, to say the least. Nanotechnology could help speed up the journey to the hydrogen society, but it will take some sensational breakthroughs along the way. The three key areas for the vehicles (we will not touch on the infrastructure issues here) are clean - the emphasis is on clean - hydrogen production, hydrogen storage, and the fuel cell itself. We'll take a look at how nanotechnology will play a role in each of these areas.