"With this paper, we show the ways in which we will approach our objective of decoding the brain in the next few years," says Prof. Katrin Amunts, director of the Institute of Neuroscience and Medicine (INM-1) at Forschungszentrum Jülich and, since June, Chair of the new Science and Infrastructure Board, which provides the scientific leadership of the Human Brain Project.
The scientists aim to comprehensively investigate the brain's various spatial and temporal levels of organization in a range of experiments. The resulting data will be used to develop models at all levels and to test them in simulations, which will in turn help to refine the experiments.
"Integrating the findings at all these levels of the brain is the key to an understanding of human brain organisation," says Amunts.
The research activities are divided into eleven subprojects: four are primarily dedicated to neuroscientific research, while six others provide hardware and software for experiments, analyses, and simulations.
In addition, one subproject is devoted to ethics and society. Information and communication technology (ICT) is expected to benefit from knowledge about the brain and the nervous system: supercomputers, for example, can draw on findings about brain function, and improved control systems for robots will be developed.
"Through the new neuroinformatics platform, HBP is also showing itself as a pioneer of modern collaborative research," explains Jülich scientist Prof. Thomas Lippert, head of the High Performance Analytics & Computing Platform of the HBP. "Our cloud technologies not only offer researchers access to analysis and simulation systems, they also provide them with platforms for cooperative software development as well as federated high-performance data systems throughout Europe," says Lippert, who is also director of the Jülich Supercomputing Centre (JSC).
JSC recently put into operation two pilot systems for the HBP that were developed specifically to meet the scientists' needs. They open up new perspectives for analysing enormous amounts of data using novel machine learning tools.