
Nanotechnology Spotlight

Behind the buzz and beyond the hype:
Our Nanowerk-exclusive feature articles


Showing Spotlights 1329 - 1336 of 1700 in category (newest first):


Water, nanotechnology's promises, and economic reality

Freshwater looks like it will become the oil of the 21st century - scarce, expensive and fought over. While over 70 per cent of the Earth's surface is covered by water, most of it is unusable for human consumption. According to the Government of Canada's Environment Department (take a look at their Freshwater Website - a great resource for facts on all aspects of water), freshwater - found in lakes, rivers and underground aquifers - represents only 2.5 per cent of the world's total water supply. Unfortunately, in addition to being scarce, freshwater is also very unevenly distributed. The United Nations has compared water consumption with its availability and has predicted that by the middle of this century between 2 billion and 7 billion people will face water scarcity. It gets worse: in developing countries, 80 per cent of illnesses are water-related. Owing to the shortage of safe drinking water in much of the world, there are 3.3 million deaths every year from diarrheal diseases caused by E. coli, salmonella and cholera bacterial infections, and from parasites and viral pathogens. In fact, between 1990 and 2000, more children died of diarrhea than all the people killed in armed conflicts since the Second World War. The use of nanotechnologies in four key water industry segments - monitoring, desalination, purification and wastewater treatment - could play a large role in averting the coming water crisis. But hoping that the 'magic' of nanotechnology will solve all water problems is naive - the basic problems of access to technologies, affordability and fair distribution still need to be solved.

Posted: Aug 15th, 2007

Microbotics - nanoparticles hitching a ride on bacteria

Vaccination has resulted in the eradication of smallpox and the control of measles, rubella, tetanus, diphtheria, and other infectious diseases in many areas of the world (at least where vaccines are available and affordable; providing vaccines to many parts of the developing world is still one of the basic medical needs that remains far from being met). The basic idea of vaccination (the word comes from the Latin vacca - cow - because the first vaccine was derived from a virus affecting cows) is to inject weakened or killed forms of pathogens such as bacteria or viruses into the body so that the immune system develops antibodies against them; if the same type of microorganism enters the body again, it will be destroyed by those antibodies. About 25 years ago, the basic idea of vaccination gave rise to bactofection - the technique of using bacteria as non-viral gene carriers into target cells. The DNA cargo is transported inside the bacteria and, once it arrives at the target location, the bacteria are broken up in order to release the therapeutic gene or protein. A novel technique takes advantage of the invasive properties of bacteria to deliver nanoparticles into cells. Here, the gene or cargo is not carried inside the bacteria but remains on the surface, conjugated to nanoparticles. Consequently, this approach requires neither bacterial disruption for delivery nor any genetic engineering of the bacteria for different cargoes.

Posted: Aug 14th, 2007

Nanoscopy - nanoscale resolution in light microscopy

In the early 1870s, the German physicist Ernst Karl Abbé formulated a rigorous criterion for resolving two objects in a light microscope. According to his equation, the best resolution achievable with visible light is about 200 nanometers. This theoretical resolution limit of conventional optical imaging was the primary factor motivating the development of more recent, higher-resolution scanning probe techniques. The interaction of light with an object generates what are called 'near-field' and 'far-field' light components. The far-field light propagates through space in an unconfined manner and is the visible light used in conventional light microscopy. The near-field (or evanescent) light consists of a nonpropagating field that exists near the surface of an object, at distances of less than a single wavelength of light. So-called near-field microscopy beats light's diffraction limit by moving the source very close to the subject being imaged. When the first theoretical work on a new technique called "scanning near-field optical microscopy" (SNOM or NSOM) appeared in the 1980s, Abbé's classical diffraction limit was overcome, and resolution down to the single-molecule level became feasible. However, light microscopy is still the only way to observe the interior of whole, or even living, cells. The use of fluorescent dyes makes it possible to selectively image individual cell components, for example proteins. Today, the wavelength dogma has been overcome with the development of the stimulated emission depletion (STED) microscope. Now, the German team that developed STED is reporting layer-by-layer light microscopic nanoscale images of cells, without having to prepare thin sections, using a technique called optical 3D far-field microscopy. They use a chemical marker for fluorescence nanoscopy that relies on single-molecule photoswitching.
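Abbé's criterion, cited above, can be written out explicitly. The numerical values below are illustrative assumptions (green light and a good oil-immersion objective), not figures from the Spotlight itself:

```latex
\begin{align*}
% Abbé diffraction limit: smallest resolvable distance d between
% two point objects, for wavelength \lambda and numerical aperture
% NA = n \sin\alpha (n: refractive index, \alpha: half-angle of
% the objective's light cone)
d &= \frac{\lambda}{2\, n \sin\alpha} = \frac{\lambda}{2\,\mathrm{NA}} \\
% Assuming green light (\lambda \approx 550 nm) and a high-end
% oil-immersion objective (NA \approx 1.4):
d &\approx \frac{550\ \mathrm{nm}}{2 \times 1.4} \approx 200\ \mathrm{nm}
\end{align*}
```

This is why roughly 200 nm is quoted as the best resolution achievable with visible light, and why structures much smaller than that require near-field or switching-based tricks to image.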

Posted: Aug 13th, 2007

From waste to power in one step

A revolutionary new environmental biotechnology - the Microbial Fuel Cell - turns the treatment of organic wastes into a source of electricity. Fuel cell technology, despite its recent popularity as a possible solution for a fossil-fuel-free future, is actually quite old. The principle of the fuel cell was discovered by German scientist Christian Friedrich Schoenbein in 1838 and published in 1839. Based on this work, the first fuel cell was developed by Welsh scientist Sir William Robert Grove in 1843. The operating principle of a fuel cell is fairly straightforward. It is an electrochemical energy conversion device that converts the chemical energy of a fuel (on the anode side) and an oxidant (on the cathode side) directly into electricity. Today there are many competing types of fuel cells, depending on what kind of fuel and oxidant they use, and many combinations of fuel and oxidant are possible. For instance, a hydrogen fuel cell uses hydrogen as fuel and oxygen as oxidant; other fuels include hydrocarbons and alcohols. An interesting - but not yet commercially viable - variant of the fuel cell is the microbial fuel cell (MFC), in which bacteria oxidize compounds such as glucose, acetate or wastewater. Researchers in Spain have fabricated multi-walled carbon nanotube (MWCNT) scaffolds with a micro-channel structure in which bacteria can grow. This scaffold structure could be used as electrodes in microbial fuel cells.
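For the hydrogen fuel cell mentioned above, the anode and cathode reactions make the operating principle concrete. This is standard textbook electrochemistry, not chemistry from the Spotlight's MFC work:

```latex
\begin{align*}
\text{Anode (fuel oxidation):}\quad   & \mathrm{H_2 \;\longrightarrow\; 2\,H^+ + 2\,e^-} \\
\text{Cathode (oxidant reduction):}\quad & \mathrm{\tfrac{1}{2}\,O_2 + 2\,H^+ + 2\,e^- \;\longrightarrow\; H_2O} \\
\text{Overall:}\quad                  & \mathrm{H_2 + \tfrac{1}{2}\,O_2 \;\longrightarrow\; H_2O},
\qquad E^{\circ} \approx 1.23\ \mathrm{V}
\end{align*}
```

In a microbial fuel cell the same architecture applies, but bacteria catalyze the anode-side oxidation of organic compounds (glucose, acetate, wastewater) instead of hydrogen, releasing electrons to the electrode.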

Posted: Aug 9th, 2007

Nanotechnology researchers go ballistic over graphene

Carbon comes in many different forms, from the graphite found in pencils to the world's most expensive diamonds. In 1980, we knew of only three basic forms of carbon, namely diamond, graphite, and amorphous carbon. Then fullerenes and carbon nanotubes were discovered, and all of a sudden that was where nanotechnology researchers wanted to be. Recently, though, there has been quite a buzz about graphene. Isolated only in 2004, graphene is a flat, one-atom-thick sheet of carbon. Existing forms of carbon basically consist of sheets of graphene, either stacked on top of each other to form a solid material like the graphite in your pencil, rolled up into carbon nanotubes (think of a single-walled carbon nanotube as a graphene cylinder), or folded into fullerenes. Physicists had long considered a free-standing form of planar graphene impossible; the conventional wisdom was that such a sheet would always roll up. Researchers initially used such high-tech gadgets as pencils and sticky tape to strip chunks of graphite down to layers just one atom thick; the process has since been refined to involve more expensive instruments such as electron beam and atomic force microscopes. Despite being isolated only three years ago, graphene has already appeared in hundreds of papers. The reason scientists are so excited is that two-dimensional crystals (graphene is called 2D because it extends in only two dimensions - length and width; as the material is only one atom thick, the third dimension, height, is considered to be zero) open up a whole new class of materials with novel electronic, optical and mechanical properties.

Posted: Aug 8th, 2007

Feeling your way through the nanoworld

"Children begin to learn by seeing, hearing, tasting and, above all, by touching. In a very similar approach, we are currently learning to orient ourselves in the nanoworld by 'feeling' materials - not with our fingers, but with microscopes that allow us to probe these materials with atomic resolution." (Robert W. Stark, LMU Munich, in "Getting a feeling for the nanoworld"). Researchers' ability to engineer materials with superior electronic, thermal, magnetic, and mechanical properties depends on tools that can identify and characterize material components and their spatial arrangement at the nanoscale. Equally important, understanding structure-function relationships in biological systems also demands tools that can probe structural properties with molecular resolution. The atomic force microscope (AFM) is the most widely used tool for imaging matter at the nanoscale. Owing to its mechanical mode of operation, the AFM can in principle also perform nanomechanical measurements, an aspect that researchers have explored over the past two decades. However, current state-of-the-art techniques are very slow (it takes about one second for the AFM tip to approach, push into and retract from the surface of a material), and they apply rather large forces during the measurement process that damage the tip and the sample. Researchers at Harvard and Stanford universities have developed a specially designed AFM cantilever tip, the torsional harmonic cantilever (THC), which offers orders-of-magnitude improvements in temporal resolution, spatial resolution, indentation and mechanical loading compared to conventional tools. With high operating speed, increased force sensitivity and excellent lateral resolution, this tool facilitates practical mapping of nanomechanical properties.

Posted: Aug 7th, 2007

The long road to molecular electronics could be paved with DNA

One of the many fascinating concepts in nanotechnology is the vision of molecular electronics. If realized, the shift in size from even the smallest computer chip today would be staggering - a quantum leap, so to speak (literally). Look at it this way: a single drop of water contains more molecules than the billions and billions of silicon chips ever produced. The molecular electronics engineers of tomorrow might use individual molecules to perform the functions in an electronic circuit that are performed by semiconductor devices today. Don't get your hopes up, though, that your next iPod will be truly nano. Scientists today are still struggling with the most basic requirements for molecular electronics, for instance how to precisely and reliably position individual molecules on a surface. DNA-based nanostructuring is one approach that could lead to promising results. It has already been shown that DNA can be used to structure nanoscale surfaces. Now, a team in Germany has demonstrated that nanoscale objects of very different sizes can be deposited simultaneously and site-selectively onto DNA-displaying surfaces, based on sequence-specific DNA-DNA duplex formation.

Posted: Aug 6th, 2007

Surely you're joking, Mr. Feynman!

Having just re-read Richard Feynman's 20-year-old autobiography Surely You're Joking, Mr. Feynman! (Adventures of a Curious Character), I thought it would make for a great little Nanowerk Spotlight leading into the weekend - and it won't be about nanotechnology. Feynman's 1959 lecture "There's Plenty of Room at the Bottom" is probably the most famous and most quoted physics speech ever, and it is the one thing that most non-scientists associate with his name. Feynman, who received the Nobel Prize in Physics in 1965 for his work on quantum electrodynamics, participated in the Manhattan Project and was a member of the panel that investigated the Space Shuttle Challenger disaster in 1986. He taught physics, first at Cornell and later at the California Institute of Technology. In typical Feynman fashion, a major factor in his decision to choose Caltech over other institutions was a desire to live in a mild climate, a goal he settled on while putting snow chains on his car's wheels in the middle of a snowstorm in Ithaca, New York. What makes this book such a gem is the weird and wacky collection of anecdotes Feynman serves up as he leads us through his childhood, education and career. Whether he is learning to pick locks and crack safes, playing the bongo drums in an orchestra, getting a commission to paint a naked female toreador, or competing in a samba competition during Carnival in Rio, the book is not about physics but about the physicist. Underneath all these hilarious stories, though, are recurring leitmotifs of curiosity, tenacity, and total disrespect for ideas that have no grounding in science. For everyone who quotes Feynman's speech, or who reads it, this autobiography goes a long way toward explaining the unconventional mind behind his revolutionary ideas.

Posted: Aug 3rd, 2007