Nanotechnology-enabled synthetic biology could one day lead to an artificial construct that operates like a living cell. That day might be a considerable distance off, given the difficulties scientists are still having in even understanding the organizing principles and workings of a cell, not to mention duplicating cell components and assembling them into a working whole. The large discrepancy between the functional density (i.e., the number of components or interconnections of components per unit volume) of cells and engineered systems highlights the inherent challenges posed by such a task. Just take a 'simple' bacterium like Escherichia coli (which has an approx. 2 square micrometer cross-sectional area). The E. coli cell has a 4.6-million base-pair chromosome (the equivalent of a 9.2-megabit memory) that codes for as many as 4,300 different polypeptides under the inducible control of several hundred different promoters. In a few years' time, the most advanced silicon chips will come close to this performance (on the other hand, you have several trillion E. coli in your gut; you would need to swallow a lot of computer chips to match this combined 'computing' power). Another way to look at the synthetic cell challenge is to regard the cellular environment as a highly complex synthetic medium, in which numerous multistep reactions take place simultaneously with an efficiency and specificity that scientists are not capable of duplicating at this scale. Researchers in The Netherlands have now succeeded in constructing nanoreactors that can be used to perform one-pot multistep reactions - another step towards the goal of artificial cell-like devices, but more promising in the short term for screening and diagnostic applications.
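The 9.2-megabit figure follows directly from the fact that each base pair is one of four letters and therefore carries two bits of information. A quick back-of-the-envelope check:

```python
# Each DNA base pair is one of four bases (A, T, C, G),
# so it carries log2(4) = 2 bits of information.
base_pairs = 4_600_000        # approximate size of the E. coli chromosome
bits_per_base_pair = 2        # 4 possible bases -> 2 bits each

genome_bits = base_pairs * bits_per_base_pair
print(genome_bits / 1e6, "megabits")  # -> 9.2 megabits
```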
What do humans have in common with the pinky-sized tropical zebrafish that zip around in many hobbyists' home aquariums? Well, surprising as it may be, quite a lot actually. Zebrafish share the same set of genes as humans and have similar drug target sites for treating human diseases. For this reason, scientists, when turning to a model organism to help answer genetic questions that cannot be easily addressed in humans, often choose the zebrafish (Danio rerio) - and save a few mice in the process. Zebrafish are small, easy to maintain, and well-suited for whole-animal studies. Furthermore, their early embryonic development is completed rapidly, within five days, with well-characterized developmental stages. The embryos are transparent and develop outside of their mothers, permitting direct visual detection of pathological embryonic death, maldevelopment phenotypes, and study of real-time transport and effects of nanoparticles in vivo. Therefore, zebrafish embryos offer a unique opportunity to investigate the effects of nanoparticles upon intact cellular systems that communicate with each other to orchestrate the events of early embryonic development. In a new study, researchers explore the potential of nanoparticles as in vivo imaging and therapeutic agents and develop an effective and inexpensive in vivo zebrafish model system to screen the biocompatibility and toxicity of nanomaterials. Such real-time studies of the transport and biocompatibility of single nanoparticles in the early development of embryos will provide new insights into molecular transport mechanisms and the structure of developing embryos at nanometer spatial resolution, while also assessing the biocompatibility of single-nanoparticle probes in vivo.
In old movies, saying "the rabbit died" was a popular way for a woman to reveal she was pregnant. The belief was that the doctor would inject the woman's urine into a rabbit; if the rabbit died, she was pregnant. The rabbit test actually originated with the discovery that the urine of a pregnant woman - which contains the hormone human chorionic gonadotropin (hCG) - would cause corpora hemorrhagica in the ovaries of the rabbit. These swollen masses on the ovaries could only be detected by killing the rabbit in order to examine its ovaries. So, in reality, every rabbit died whether the woman was pregnant or not. Fortunately (for rabbits in particular), immunoassays - which can detect hormones (such as hCG), antibodies and antigens in the blood - were developed in the 1950s. Radioimmunoassays were first used to detect insulin in blood, but were later used for a variety of diagnostic tests. The technique is extremely sensitive and specific, but the necessary radioactive substances make it risky and expensive. In the 1960s, immunoassay technology was greatly enhanced by replacing radioisotopes with enzymes for color generation, which eliminated the risk and a great deal of expense. Today, most immunoassays are Enzyme-Linked ImmunoSorbent Assays, or ELISA. Because it can evaluate the presence of antigen or antibody in a sample, ELISA is commonly used to test for HIV, hepatitis B, and West Nile virus. ELISA has also been used in the food industry to detect potential food allergens such as milk, nuts, and eggs. Although there are numerous variations of ELISA, the test basically involves an antigen attached to a solid surface. When the antibody is washed over the surface, it will bind to the antigen. The antibody is then linked to an enzyme - usually a peroxidase (an enzyme that catalyzes oxidation) - which reacts with certain substrates, resulting in a change in color that serves as a signal.
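In practice, that color signal is read as an optical absorbance value and compared against a cutoff derived from negative-control wells; a common lab convention is the mean of the negatives plus three standard deviations. The sketch below is a generic illustration of that readout step, not a protocol from any particular assay, and the example numbers are made up:

```python
import statistics

def elisa_classify(sample_od, negative_controls):
    """Classify one ELISA well against a cutoff derived from negative controls.

    Uses the common 'mean + 3 * SD' convention as an illustrative assumption;
    exact acceptance criteria vary from assay to assay.
    """
    cutoff = statistics.mean(negative_controls) + 3 * statistics.stdev(negative_controls)
    return "positive" if sample_od > cutoff else "negative"

# Hypothetical absorbance readings (optical density units)
negatives = [0.08, 0.10, 0.09, 0.11]     # blank/negative-control wells
print(elisa_classify(0.95, negatives))   # strong color change -> "positive"
print(elisa_classify(0.10, negatives))   # near background     -> "negative"
```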
The evolution of immunoassays has continued with developments such as the fluorimetric immunoassay (which has replaced the rabbits in pregnancy tests). Now, scientists at the Chinese Academy of Sciences have discovered a way to improve the process even more by eliminating one of the steps in certain immunoassays.
A quantum dot (QD), also called a nanocrystal, is a semiconductor nanostructure that typically measures just 2 to 10 nm. The usefulness of quantum dots comes from the extreme sensitivity of their peak emission frequency - quantum mechanical in nature - to both the dot's size and composition. QDs have been touted as possible replacements for organic dyes in the imaging of biological systems, due to their excellent fluorescent properties, good chemical stability, broad excitation ranges and high photobleaching thresholds. By contrast, conventional organic dyes cannot be easily synthesized to emit different colors and have narrow excitation spectra and broad emission spectra that often cross into the red wavelengths, making it difficult to use these dyes for multiplexing. QDs hold increasing potential for cellular imaging both in vitro and in vivo. Researchers have now used QDs for in vivo imaging of embryonic stem cells in mice. This opens up the possibility of using QDs for fast and accurate imaging applications in stem cell therapy.
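The size dependence of the emission can be illustrated with the widely used Brus (particle-in-a-sphere) approximation, in which the quantum confinement energy scales roughly as 1/R². The sketch below uses textbook-style CdSe parameters (band gap, effective masses, dielectric constant) that are illustrative assumptions, not values from the article:

```python
import math

# Physical constants (SI units)
HBAR = 1.0545718e-34   # reduced Planck constant, J*s
M0   = 9.10938e-31     # electron rest mass, kg
E    = 1.602177e-19    # elementary charge, C
EPS0 = 8.854188e-12    # vacuum permittivity, F/m

def brus_emission_ev(radius_nm, e_gap_ev=1.74, m_e=0.13, m_h=0.45, eps_r=10.6):
    """Approximate emission energy (eV) of a spherical quantum dot.

    Brus model: bulk band gap + 1/R^2 confinement term - Coulomb correction.
    Default parameters are illustrative CdSe values (assumptions).
    """
    r = radius_nm * 1e-9
    confinement = (HBAR**2 * math.pi**2 / (2 * r**2)) * (1 / (m_e * M0) + 1 / (m_h * M0))
    coulomb = 1.8 * E**2 / (4 * math.pi * EPS0 * eps_r * r)
    return e_gap_ev + (confinement - coulomb) / E

for r in (1.5, 2.0, 3.0):
    e_ev = brus_emission_ev(r)
    print(f"R = {r} nm: {e_ev:.2f} eV (~{1240 / e_ev:.0f} nm)")
```

Shrinking the dot raises the emission energy (bluer light), which is why a single material system can be tuned across many colors simply by controlling particle size.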
Medicine is big business. The big pharma companies have traditionally enjoyed enormous profits that would make the eyes of other companies' CEOs water (apart from big oil companies, of course). The combined annual net income for the top 10 pharma companies (ranked by market capitalization) is currently about $73 billion. Pfizer alone has a net income of approximately $19 billion. The recipe for success? Patent protection and intellectual property rights (IPRs). The core of Big Pharma's business model relies on patent protection for their blockbuster drugs, which allows them to sell these drugs at extraordinarily high profit margins that they wouldn't be able to generate in a competitive market. Case in point: Lipitor, the cholesterol-lowering drug that accounts for nearly $13 billion of Pfizer's revenues and over 40% of its profits. Another key part of the pharma business model is heavy spending on sales and marketing. Novartis, for instance, is spending around 33% of sales on promotion, compared with about 19% on R&D, although the cost of bringing a new drug to market could well exceed $1 billion (and that is also the argument pharma companies use to justify their profits). However, pharmaceutical companies are faced with the expiration of the patent protection on their main profit generators, they have relatively few new products in the pipeline, and they need to come to terms with the emerging nanomedicine landscape. While nanomedicine potentially offers promising new value propositions and revenue streams, for instance in diagnostics, it also could completely displace certain classes of drugs such as current chemotherapy agents with novel nanoparticle reformulations. In what looks like more of the same though, it seems that the future of the nanomedicine business will also depend on patents and IPRs, potentially even more so than today.
Nature is truly a brilliant nano engineer and has been so for billions of years. There is an abundance of 'smart' biological materials with hierarchical nanostructures - built from proteins - that are capable of adapting to new tasks, are self-healing, and can self-assemble autonomously simply out of a solution of building blocks. The performance and capability of these natural materials is something engineers can only dream of today. But by unlocking nature's secrets tiny step by tiny step, one day we will be able to not only duplicate but surpass the performance of natural materials. Only in recent years have scientists begun to understand the underlying principles and mechanisms of these materials - Why is spider silk stronger than steel? Why can cells be stretched reversibly to several times their original length? What kinds of molecular flaws lead to malfunctions in cells and tissues, as occurs in Alzheimer's disease, the rapid-ageing disease progeria, or muscular dystrophies - diseases in which the cell or tissue fails mechanically? Scientists at MIT have, for the first time, revealed the fundamental fracture and deformation mechanisms of biological protein materials, clarifying some long-standing issues about the deformation behavior of cells and Alzheimer's pathogens. The researchers report that two abundant nanoscopic building blocks of many proteins and protein materials exhibit two distinct fracture modes, depending on the speed of deformation. This is a surprising observation with far-reaching implications for the development of novel self-assembled protein materials and possibly the cure of certain genetic diseases.
There is a huge demand for medical implants for almost every body part you can think of. As we have reported here before, the market for medical implant devices in the U.S. alone is estimated to be $23 billion per year, and it is expected to grow by about 10% annually for the next few years. Implantable cardioverter defibrillators, cardiac resynchronization therapy devices, pacemakers, tissue and spinal orthopedic implants, hip replacements, phakic intraocular lenses and cosmetic implants will be among the top sellers. Current medical implants, such as orthopedic implants and heart valves, are made of titanium and stainless steel alloys, primarily because they are biocompatible. Unfortunately, in many cases these metal alloys, with a life span of 10-15 years, may wear out within the lifetime of the patient. With recent advances in the industrial synthesis of diamond and diamond-like carbon film bringing prices down significantly, researchers are increasingly experimenting with diamond coatings for medical implants. On the upside, the wear resistance of diamond is dramatically superior to that of titanium and stainless steel. On the downside, because it attracts coagulating proteins, its blood clotting response is slightly worse than that of these materials, and the possibility has been raised that nanostructured surface features of diamond might abrade tissue. That's not something you necessarily want to have in your artificial knee or hip joints (although some of the currently used implant materials cause problems as well). Researchers have now run simulations showing that thin layers of ice could persist on specially treated diamond coatings at temperatures well above body temperature. The soft and hydrophilic ice multilayers might enable diamond-coated medical devices that reduce abrasion and are highly resistant to protein adsorption.
You might have come across the acronym NBIC, which stands for Nanotechnology, Biotechnology, Information technology and new technologies based on Cognitive science. Initially introduced in the U.S. National Science Foundation's 'Converging Technologies for Improving Human Performance' report, this acronym is often used to describe the basic idea that scientific and technological innovation can be stimulated through the convergence of two, three, or all four fields. At its most radical (and most controversial), proponents of convergence suggest that nanotechnologies will promote the unification of most branches of science and technology, including the cognitive sciences, based on the unity of nature at the nanoscale. We'll keep you posted on this over the next few decades and see how it all works out. For the time being, though, it would be nice to be able to report on something more hands-on and - dare I write it - even practical. As it happens, scientists at the University of Toronto have done exactly that. They have demonstrated, for what appears to be the first time, the convergence of nanotechnology, microtechnology, microfluidics, photonics, signal processing, and proteomics to build a medical device that could lead to the development of fast, portable point-of-care diagnostics for infectious diseases (IDs) such as HIV, SARS and many others.