Posted: January 19, 2007

Billion-variable optimization marks new milestone in nanotechnology
(Nanowerk News) A paper published in the journal Complexity ("Toward routine billion-variable optimization using genetic algorithms") describes how a team of researchers in the Illinois Genetic Algorithms Laboratory (IlliGAL) at the University of Illinois at Urbana-Champaign (UIUC) has achieved efficient, scalable solutions to difficult optimization problems containing over a billion variables.

The team, led by noted researcher and author David E. Goldberg, used specially programmed genetic algorithms (GAs)—search procedures based on natural selection and genetics—to achieve the feat, together with theories of scalability and implementation techniques developed at Illinois. Optimization uses mathematics and computation to find efficient, effective solutions to problems in science, technology, and commerce, and it is widely used in scheduling, engineering design, and business management.

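The GA procedure described above, a population of candidate solutions evolved by selection, recombination, and mutation, can be illustrated in a few lines. The sketch below is a generic textbook GA applied to a toy bit-counting objective; it is not the solver from the paper, and all names and parameters are illustrative:

```python
import random

def onemax(bits):
    """Toy fitness function: the count of 1-bits (higher is better)."""
    return sum(bits)

def simple_ga(n_bits=40, pop_size=60, generations=100, p_mut=0.02, seed=1):
    """Minimal generational GA: tournament selection, one-point
    crossover, and per-bit mutation, in the textbook style."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        # Pick two individuals at random; the fitter one survives.
        a, b = rng.sample(pop, 2)
        return a if onemax(a) >= onemax(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=onemax)

best = simple_ga()
```

On this toy problem the population converges to (or very near) the all-ones string within the allotted generations; the same loop structure underlies far more sophisticated GA designs.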
Procedures in common use today are limited to thousands, or sometimes millions, of variables because the most powerful methods become prohibitively expensive as the size of the problem increases. The Illinois result demonstrates that billion-variable problems can be solved effectively and practically on existing computers with known procedures.

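The article does not name the specific GA variant behind the scaling claim. One memory-lean design described in the IlliGAL literature is the compact genetic algorithm (cGA), which replaces the explicit population with a single probability vector, so memory grows only linearly in the number of variables; whether it is the exact solver used here is an assumption. A minimal sketch of the textbook cGA update on the same toy bit-counting objective:

```python
import random

def compact_ga(n_bits=64, virtual_pop=50, max_evals=20000, seed=3):
    """Compact GA sketch: evolve one probability-of-1 per bit instead of
    a full population.  Each step samples two candidates, lets them
    compete, and nudges the probabilities toward the winner."""
    rng = random.Random(seed)
    p = [0.5] * n_bits                       # one probability per variable
    fitness = lambda x: sum(x)               # toy OneMax objective
    evals = 0
    while evals < max_evals and not all(q in (0.0, 1.0) for q in p):
        a = [1 if rng.random() < q else 0 for q in p]
        b = [1 if rng.random() < q else 0 for q in p]
        evals += 2
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        step = 1.0 / virtual_pop
        for i in range(n_bits):              # nudge p toward the winner
            if winner[i] != loser[i]:
                delta = step if winner[i] else -step
                p[i] = min(1.0, max(0.0, p[i] + delta))
    # Round the converged probabilities to a concrete solution.
    return [1 if q >= 0.5 else 0 for q in p]

sol = compact_ga()
```

Because the state is just one float per variable, this style of algorithm can in principle scale to very large bit strings where storing a full population would be infeasible.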
The calculations were performed on subsets of the 1536-processor Turing cluster housed in UIUC’s Computational Science and Engineering (CSE) program.

CSE director Michael Heath welcomed the accomplishment: “This is exactly the kind of paradigm-breaking computational result that we hoped to enable in creating the Turing cluster.”

UIUC materials scientist Duane Johnson suggested that the result “is a milestone in the developing world of nanotechnology, enabling the analysis and design of new molecules in ways that were not previously possible,” and John Deere emerging-technology guru Bill Fulkerson sees the result as heralding a new day of complex systems optimization more generally. “Gone are the days of using a toy GA to solve a toy problem. With petascale computing and solvers like this, complex systems optimization becomes possible.”

Other team members included Kumara Sastry, a PhD candidate in Industrial and Enterprise Systems Engineering, and Xavier Llora, a machine learning researcher at the National Center for Supercomputing Applications (NCSA).

Although the team is pleased with the billion-variable result, it is not resting on its laurels. Sastry put it this way: “One reason this result is so interesting is that it is so general. With most optimization procedures you are stuck solving a limited class of problems. This result is immediately useful to a broad array of problems, and existing theory and technique tell us how to speed results on larger, harder problems that would otherwise be prohibitively expensive or impossible.”

Goldberg is excited by the array of existing application areas that can benefit from the result. “Genetic algorithms have been used regularly for two decades across the spectrum of human endeavor. Science, engineering, commerce, and even the humanities and the arts have already benefited from myriad applications of genetic algorithms. The billion-variable result can be put to use immediately across the panoply of existing and yet-to-be-imagined application domains.”

Complexity editor-in-chief Alfred Hübler welcomed the research as “spectacular.” “Goldberg’s team has achieved something special. This result advances complexity science and technology immediately and noticeably.”