The human body is the ultimate 'wet computer' - a highly efficient, biomolecule-based information processor that relies on chemical, optical and electrical signals to operate. Researchers are trying various routes to mimic some of the body's approaches to computing, and research into molecular logic gates in particular is a fast-growing and very active area. The common logic gates used in conventional silicon circuitry can already be mimicked at the molecular level, and chemists have reported that molecular logic gates have the potential for calculation on the nanometer scale, which is unparalleled in silicon-based devices. The general character of binary logic allows electrical signals to be substituted by chemical and optical ones, which, for example, opens access to a vast pool of photoactive molecules for the purpose of molecular logic. Molecular logic gate structures based on fluorescence changes have been studied intensively using various inputs, such as pH, metal ions, and anions. Now, South Korean scientists have developed the first solution-based molecular logic gates, using solutions of fluorescent sensor molecules and - for the first time - proteins. In their microfluidic device, input solutions are routed into a central loop filled with a fluorescent sensor solution; there the solutions mix and, in certain combinations, switch the fluorescence 'output' on or off.
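The idea of chemical inputs switching a fluorescence output on or off maps directly onto Boolean truth tables. The sketch below is an illustrative model of that mapping only; the gate names are the standard Boolean ones, and the assignment of chemical inputs (e.g. H+ or a metal ion) to a particular gate is hypothetical, not the specific system the researchers reported.

```python
# Illustrative model of two-input molecular logic gates whose 'output' is
# fluorescence (True = on, False = off). The chemical-input-to-gate mapping
# here is a hypothetical example, not the reported Korean system.

def fluorescence_output(gate, input_a, input_b):
    """Return the fluorescence state for two chemical inputs (present/absent)."""
    gates = {
        "AND":     input_a and input_b,        # glows only with both inputs
        "OR":      input_a or input_b,         # glows with either input
        "XOR":     input_a != input_b,         # glows with exactly one input
        "INHIBIT": input_a and not input_b,    # input B quenches the output
    }
    return gates[gate]

# Truth table for a hypothetical AND gate driven by, say, H+ and a metal ion:
for a in (False, True):
    for b in (False, True):
        print(f"A={a!s:5} B={b!s:5} -> fluorescence {fluorescence_output('AND', a, b)}")
```

An INHIBIT gate, for instance, captures the common case of a fluorophore whose emission is switched on by one analyte but quenched whenever a second analyte is present.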
Talking about the threat of terrorists using bioweapons is a great tool for scaring people. Using any kind of pathogen (bacterium, virus or other disease-causing organism) as a weapon certainly is a terrifying scenario; think about the near-panic the 2001 anthrax attacks in the United States caused. Letters containing anthrax spores were mailed to several news media offices and two U.S. Senators, killing five people and infecting 17 others. Can you imagine what panic would result from an attack that kills 5,000 people and causes 76 million illnesses? Well, as a matter of fact, foodborne diseases cause approximately 76 million illnesses, 325,000 hospitalizations, and 5,000 deaths in the United States each year. Known pathogens account for an estimated 14 million illnesses, 60,000 hospitalizations, and 1,800 deaths. The Food and Drug Administration's 2005 Food Code states that the estimated cost of foodborne illness is $10-$83 billion annually. So while the U.S. spends billions of dollars securing its borders, it loses many more billions, not to mention thousands of lives, every year by not being able to keep its spinach and hamburgers safe. Apparently, talking about terrorism is much better political theater (and makes for catchier Nanowerk Spotlight titles) than discussing E. coli outbreaks. However, be it because of potential terrorists or actual contaminated food, research into microbial detection and decontamination processes has increased significantly in recent years. Traditional methods of identifying and subsequently removing a pathogen are slow and cumbersome. Now, using nanotechnology, researchers have designed a novel biosensing system that can identify E. coli in just five minutes and remove up to 88% of the target bacteria.
Nanotechnology's poster child, the carbon nanotube (CNT), has been explored for use in many technical applications. Increasingly, researchers are also looking at the unique biological properties of CNTs for potential biomedical uses. For instance, the interaction between DNA and CNTs has been explored, and DNA-functionalized nanotubes hold significant promise as nucleic acid sensors. Nanotubes have also been considered for use as scaffolds for cells in tissue engineering. No matter what their intended function, any material used in medicine must exhibit - among other compatibility factors - biocompatibility, non-toxicity and non-carcinogenicity. And here the jury is still out as far as CNTs are concerned. One limiting factor of toxicological studies so far has been the use of animal tissue rather than living specimens. Researchers have now succeeded in detecting single-walled CNTs (SWCNTs) inside living animals - with surprisingly benign results - paving the way for future research on the effects and fate of nanotubes inside living organisms.
Proteins, large organic compounds made of amino acids, provide many of the most basic units of function in living systems. They make up about half of the dry mass of animals and humans. There may be as many as 1 million different types of proteins in the human body (it is estimated that the human proteome comprises an average of 5-7 protein isoforms per open reading frame in the human genome, plus a further 600,000-odd immunoglobulins present in serum at any given moment) - nobody really knows. The word protein comes from the Greek prota, meaning 'of primary importance', and proteins may indeed become of great importance in nanoscale fabrication as well. Proteins have an amazing number of functions inside our bodies: Enzymes serve as catalysts to break down food into various components; transport proteins such as hemoglobin transport molecules (e.g. oxygen); storage proteins store molecules (e.g. iron is stored in the liver as a complex with the protein ferritin); structural proteins such as keratin or collagen are needed for mechanical support in tissues like cartilage and skin but also hair and nails; proteins are the major component of muscles, and for instance actin and myosin are key to contracting muscle fibers; hormones control the growth of cells and their differentiation; antibody proteins are needed for immune protection; and toxins are, well, toxic, but in minute amounts could have beneficial medical properties. Scientists believe that this variety of natural protein functions - actuation, catalysis, structure, transport and molecular sequestering - could serve as valuable and versatile building blocks for the synthesis of functional materials. Researchers have now found that nanometer-scale changes in protein conformation can be translated into macroscopic changes in material properties. The result is a new class of dynamic, protein-based materials.
Cancer is an enormous socio-economic problem. According to the National Cancer Institute (NCI), it is estimated that in 2007 there will be over 1.4 million new cases of cancer (of any type) and over 550,000 deaths from cancer in the United States (you can download a detailed Cancer Statistics 2007 Presentation; ppt download, 808 KB) from the American Cancer Society. This makes cancer the second deadliest disease category, after heart diseases. But while the mortality rates for heart diseases have dropped by more than half from 1950 to 2004, and other major disease categories show similar trends, cancer death rates have stayed pretty much the same. Shocking but true: if you are a male living in the U.S., your lifetime probability of developing some type of cancer is 1 in 2. If you are female, your probability is 1 in 3. Equally dismal are the economic costs associated with this disease: direct cancer-related costs (treatment, care and rehabilitation) reached $74 billion in the U.S. in 2005 and are growing fast, while the overall economic costs (including loss of economic output due to days off and premature death) are estimated to be over $200 billion per year (2005 data). This Spotlight will discuss existing and new approaches to fight cancer and their limitations. The goal is to stimulate readers to support and participate in interdisciplinary research and teaching efforts toward relieving suffering and death due to cancer. Fighting cancer involves three phases: (i) detection, (ii) treatment, and (iii) monitoring. Success depends on matching science to the actual practical needs. We'll take a look at efforts underway - nanotechnology efforts in particular - in each of these three phases and comment on some of the practical problems encountered in fighting cancer. We also speculate about some unconventional research that might be successful in fighting cancer in the future.
Nanotechnology-enabled synthetic biology could one day lead to an artificial construct that operates like a living cell. That day might be a considerable distance off, given the difficulties scientists are still having in even understanding the organizing principles and workings of a cell, not to mention duplicating cell components and assembling them into a working whole. The large discrepancy between the functional density (i.e., the number of components or interconnections of components per unit volume) of cells and engineered systems highlights the inherent challenges posed by such a task. Just take a 'simple' bacterium like Escherichia coli (which has a cross-sectional area of approximately 2 square micrometers). The E. coli cell has a chromosome of some 4.6 million base pairs (the equivalent of a 9.2 megabit memory) that codes for as many as 4,300 different polypeptides under the inducible control of several hundred different promoters. In a few years' time, the most advanced silicon chips will come close to this performance (on the other hand, you have several trillion E. coli in your gut; you would need to swallow a lot of computer chips to match this combined 'computing' power). Another way to look at the synthetic cell challenge is to regard the cellular environment as a highly complex synthetic medium, in which numerous multistep reactions take place simultaneously with an efficiency and specificity that scientists are not capable of duplicating at this scale. Researchers in The Netherlands have now succeeded in constructing nanoreactors that can be used to perform one-pot multistep reactions - another step towards the goal of artificial cell-like devices, but more promising in the short term for screening and diagnostic applications.
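The '9.2 megabit memory' comparison is simple back-of-the-envelope arithmetic: DNA has a four-letter alphabet, so each base pair carries two bits of information. A quick sketch of the calculation:

```python
# Back-of-the-envelope arithmetic behind the genome-memory comparison:
# each base pair is one of four bases (A, C, G, T), i.e. log2(4) = 2 bits.
base_pairs = 4_600_000           # approximate E. coli chromosome length
bits_per_base_pair = 2           # 4 possible bases per position -> 2 bits
genome_bits = base_pairs * bits_per_base_pair
print(genome_bits / 1e6)         # 9.2 megabits, as stated in the text
```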
What do humans have in common with the pinky-sized tropical zebrafish that zip around in many hobbyists' home aquariums? Well, surprising as it may be, quite a lot actually. Zebrafish share the same set of genes as humans and have similar drug target sites for treating human diseases. For this reason, scientists, when turning to a model organism to help answer genetic questions that cannot be easily addressed in humans, often choose the zebrafish (Danio rerio) - and save a few mice in the process. Zebrafish are small, easy to maintain, and well-suited for whole-animal studies. Furthermore, their early embryonic development is completed rapidly, within five days, with well-characterized developmental stages. The embryos are transparent and develop outside of their mothers, permitting direct visual detection of pathological embryonic death and mal-development phenotypes, and the study of real-time transport and effects of nanoparticles in vivo. Therefore, zebrafish embryos offer a unique opportunity to investigate the effects of nanoparticles upon intact cellular systems that communicate with each other to orchestrate the events of early embryonic development. In a new study, researchers explore the potential of nanoparticles as in vivo imaging and therapeutic agents and develop an effective and inexpensive in vivo zebrafish model system to screen the biocompatibility and toxicity of nanomaterials. Such real-time studies of the transport and biocompatibility of single nanoparticles in the early development of embryos will provide new insights into molecular transport mechanisms and the structure of developing embryos at nanometer spatial resolution in vivo, and allow assessment of the biocompatibility of single-nanoparticle probes in vivo.
In old movies, saying "the rabbit died" was a popular way for a woman to reveal she was pregnant. The belief was that the doctor would inject the woman's urine into a rabbit: if the rabbit died, she was pregnant. The rabbit test actually originated with the discovery that the urine of a pregnant woman - which contains the hormone human chorionic gonadotropin (hCG) - would cause corpora hemorrhagica in the ovaries of the rabbit. These swollen masses on the ovaries could only be detected by killing the rabbit in order to examine its ovaries. So, in reality, every rabbit died, whether the woman was pregnant or not. Fortunately (for rabbits in particular), immunoassays - which can detect hormones (such as hCG), antibodies and antigens in the blood - were developed in the 1950s. Radioimmunoassays were first used to detect insulin in blood, but were later used for a variety of diagnostic tests. The technique is extremely sensitive and specific, but the necessary radioactive substances make it risky and expensive. In the 1960s, immunoassay technology was greatly enhanced by replacing radioisotopes with enzymes for color generation, which eliminated the risk and a great deal of the expense. Today, most immunoassays are enzyme-linked immunosorbent assays, or ELISAs. Because it can evaluate the presence of antigen or antibody in a sample, ELISA is commonly used to test for HIV, Hepatitis B, and West Nile Virus. ELISA has also been used in the food industry to detect potential food allergens such as milk, nuts, and eggs. Although there are numerous variations of ELISA, the test basically involves an antigen attached to a solid surface. When the antibody is washed over the surface, it will bind to the antigen. The antibody is then linked to an enzyme - usually a peroxidase (an enzyme that causes oxidation) - which reacts with certain substrates, resulting in a change in color that serves as a signal.
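The ELISA readout described above - antibody binding to surface antigen, followed by an enzyme-driven color reaction - can be sketched with a simple binding-equilibrium model: the color signal grows with the fraction of antigen sites occupied by antibody. This is a minimal illustration only; the function names, the dissociation constant, and the absorbance scale are all hypothetical placeholders, not values from any real assay.

```python
# Minimal sketch of the ELISA logic described in the text: antibody binds
# surface-bound antigen, and the enzyme-generated color (absorbance) is
# taken as proportional to occupied antigen sites. All numbers hypothetical.

def bound_fraction(antibody_conc, kd):
    """Langmuir-type binding: fraction of surface antigen sites occupied."""
    return antibody_conc / (antibody_conc + kd)

def elisa_signal(antibody_conc, kd=1.0, max_absorbance=2.0):
    """Color signal, proportional to the bound fraction, saturating at max."""
    return max_absorbance * bound_fraction(antibody_conc, kd)

# A sample with more target antibody develops a stronger color readout:
print(elisa_signal(0.1))    # dilute sample, weak signal
print(elisa_signal(10.0))   # concentrated sample, near-saturated signal
```

The saturating form captures why ELISA readouts are calibrated against a standard curve rather than read as a linear measure of concentration.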
The evolution of immunoassays has continued with developments such as the fluorimetric immunoassay (which has replaced the rabbits in pregnancy tests). Now, scientists at the Chinese Academy of Sciences have discovered a way to improve the process even more by eliminating one of the steps in certain immunoassays.