Posted: Jun 19, 2015
Masters of the universe
(Nanowerk News) Imagine having to design a completely automated system that could take the live video from the hundreds of thousands of cameras monitoring London and automatically dispatch an ambulance whenever anyone, anywhere in the city, falls and hurts themselves — without any human intervention whatsoever. That is the scale of the problem facing the team designing the software and computing behind the world’s largest radio telescope.
When it becomes operational in 2023, the Square Kilometre Array (SKA) will probe the origins, evolution and expansion of our universe; test one of the world’s most famous scientific theories; and perhaps even answer the greatest mystery of all — are we alone?
Construction on the massive international project, which involves and is funded by 11 different countries and 100 organisations, will start in 2018. When complete, it will be able to map the sky in unprecedented detail — 10,000 times faster and 50 times more sensitively than any existing radio telescope — and detect extremely weak extraterrestrial signals, greatly expanding our ability to search for planets capable of supporting life.
Artist's impression of the SKA, which will be made up of thousands of dishes that operate as one gigantic telescope. (Image: SKA Organisation)
The SKA will be co-located in South Africa and Australia, where radio interference is least and views of our galaxy are best. The instrument itself will be made up of thousands of dishes that can operate as one gigantic telescope or as multiple smaller telescopes — a technique known as astronomical interferometry, which was developed in Cambridge by Sir Martin Ryle almost 70 years ago.
“The SKA is one of the major big data challenges in science,” explains Professor Paul Alexander, who leads the Science Data Processor (SDP) consortium, which is responsible for designing all of the software and computing for the telescope. In 2013, the University’s High Performance Computing Service unveiled ‘Wilkes’ — one of the world’s greenest supercomputers with the computing power of 4,000 desktop machines running at once, and a key test-bed for the development of the SKA computing platform.
During its projected 50-year lifespan, the SKA will carry out several experiments to study the nature of the universe. Cambridge researchers will focus on two of these, the first of which will follow hydrogen through billions of years of cosmic time.
“Hydrogen is the raw material from which everything in the universe developed,” says Alexander. “Everything we can see in the universe and everything that we’re made from started out in the form of hydrogen and a small amount of helium. What we want to do is to figure out how that happened.”
The second of the two experiments will look at pulsars — spinning neutron stars that emit short pulses of radiation at extremely regular intervals. This regularity makes pulsars remarkably accurate natural clocks, which can be used to test our understanding of space, time and gravity, as proposed by Einstein in his general theory of relativity.
By tracking a pulsar as it orbits a black hole, the telescope will be able to examine general relativity to its absolute limits. As the pulsar moves around the black hole, the SKA will follow how the clock behaves in the very strong gravitational field.
“General relativity tells us that massive objects like black holes warp the space–time around them, and what we call gravity is the effect of that warp,” says Alexander. “This experiment will enable us to test our theory of gravity with much greater precision than ever before, and perhaps even show that our current theories need to be changed.”
Although the SKA experiments will tell us much more than we currently know about the nature of the universe, they also present a massive computing challenge. At any one time, the amount of data gathered from the telescope will be equivalent to five times the global internet traffic, and the SKA’s software must process that vast stream of data quickly enough to keep up with what the telescope is doing.
Moreover, the software also needs to grow and adapt along with the project. The first phase of the SKA will be just 10% of the telescope’s total area. The computing load grows roughly with the square of the number of dishes on the ground — each time the number of dishes doubles, the load more than quadruples — meaning that the computing power required for the completed telescope will be more than 100 times what is required for phase one.
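The square-law scaling has a simple origin: an interferometer must correlate the signals from every pair of dishes, so the workload tracks the number of pairs, not the number of dishes. A minimal sketch of that counting argument (illustrative only — not part of the SKA’s actual software):

```python
def baselines(n_dishes: int) -> int:
    """Number of dish pairs ("baselines") an interferometer must correlate.

    Every dish is paired with every other dish, giving n * (n - 1) / 2
    pairs -- which grows roughly with the square of the dish count.
    """
    return n_dishes * (n_dishes - 1) // 2


# Doubling the number of dishes roughly quadruples the correlation work:
for n in (100, 200, 400):
    print(f"{n} dishes -> {baselines(n)} baselines")
# 100 dishes ->  4950 baselines
# 200 dishes -> 19900 baselines
# 400 dishes -> 79800 baselines
```

This is why a telescope with ten times the dishes of phase one needs on the order of a hundred times the computing power.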
“You can always solve a problem by throwing more and more money and computing power at it,” says Alexander. “We have to make it work sensibly as a single system that is completely automated and capable of learning over time what the best way of getting rid of bad data is. At the moment, scientists tend to look at data but we can’t do that with the SKA, because the volumes are just too large.”
The challenges faced by the SKA team echo those faced in many different fields, and so Alexander’s group is working closely with industrial partners such as Intel and NVIDIA, as well as with academic and funding partners including the Universities of Manchester and Oxford, and the Science and Technology Facilities Council. The big data solutions developed by the SKA partners to solve the challenges faced by a massive radio telescope can then be applied across a range of industries.
One of these challenges is how to process data efficiently and affordably, and convert it into images of the sky. The target for the first phase of the project is a 300 ‘petaflop’ computer that uses no more than eight megawatts of power: more than 10 times the performance of the world’s current fastest supercomputer, for the same amount of energy. ‘Flops’ (floating point operations per second) are a standard measure of computing performance; one petaflop is a million billion calculations per second.
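A quick back-of-envelope calculation, using only the figures quoted above, shows what that power budget implies in terms of energy efficiency:

```python
# Illustrative arithmetic based on the targets quoted in the article
# (300 petaflops within an 8-megawatt power budget); these are design
# goals, not final SKA specifications.

PETAFLOP = 1e15              # 10^15 floating point operations per second
target_flops = 300 * PETAFLOP
power_watts = 8e6            # eight megawatts

efficiency = target_flops / power_watts   # flops delivered per watt
print(f"{efficiency / 1e9:.1f} gigaflops per watt")
# 37.5 gigaflops per watt
```

For comparison, reaching tens of gigaflops per watt is well beyond the efficiency of the supercomputers of the mid-2010s, which is why energy-efficient test-beds like Wilkes matter to the design.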
“The investment in the software behind the SKA is as much as €50 million,” adds Alexander. “And if our system isn’t able to grow and adapt, we’d be throwing that investment away — a problem anyone in this field faces. We want the solutions we’re developing for understanding the most massive objects in the universe to be applied to any number of the big data challenges that society will face in the years to come.”