It’s time for Europe to open up access to supercomputers

18 Sep 2013 | News
In the past decade, high performance computing has transformed the practice, and the productivity, of science. Now this analytical power must be opened up to industry, to improve decision-making, spur innovation and boost competitiveness.

Supercomputers may date back to the 1960s, but it is only recently that their vast processing power has begun to be harnessed by industry and commerce, to design safer cars, build quieter aeroplanes, speed up drug discovery, and subdue the volatility of the financial markets.

The need for powerful computers is growing, says Catherine Rivière, chair of PRACE, the Partnership for Advanced Computing in Europe. If we are going to tackle modern challenges such as personalised medicine, sustainable energy supply and climate change, and improve industrial processes, we will need “new tools to cope with big data; new working processes to take into account multi-disciplinary problems and teams; and new physical models,” she said.

Access to supercomputers is essential for industrial competitiveness and top-class science, and Europe’s international competitors are well aware of this. “For the US administration, to out-compute is to out-compete,” Rivière noted.

While some sectors have made forays into high performance computing (HPC), Rivière and other experts at a Science|Business debate, “Supercomputers for all: The next frontier for Europe's high performance computing”, held in Brussels earlier this month, said the full potential of the technology has yet to be explored, by either science or industry.

“Scientific research is a trip into the unknown,” said Modesto Orozco, Head of the Molecular Modelling and Bioinformatics Group, at the Institute for Research in Biomedicine, Barcelona. “I don’t have any idea what computational scientists and biologists are going to do in the future, but I’m completely sure they are going to do it together.”

League table

Much of the focus in high performance computing is on status-symbol hardware, with China’s Tianhe-2 currently top of the supercomputer league. Developed by the government-run National University of Defence Technology, Tianhe-2 is twice as fast as the next computer on the list. The US takes second and third place, followed by Japan, with Germany being Europe’s only representative in the top ten.

The EU has all the necessary know-how and skills to hold its own in the supercomputing race, believes Konstantinos Glinos, Head of the eInfrastructure Unit at DG Connect in the European Commission. “But [Europe] needs to increase its investments in HPC in order to be in a leading position. The challenge of ‘big data’ and the transition to ‘exascale’ computing provide an opportunity,” Glinos said.

However, it will require more than hardware alone to exploit the scientific and economic potential of supercomputers. We don’t just need hard cash; we need to address the full HPC ecosystem, Glinos said. “We need to develop next-generation technology and applications, provide the HPC services that EU industry and scientists need, and ensure availability of computational skills.” One objective of the Commission is to develop centres of excellence in HPC in different application areas, with the first centres due to be launched by 2015.

At present 80 per cent of HPC capacity in Europe is provided by PRACE, an international not-for-profit association backed by 25 countries. Six supercomputers operating in Spain, Italy, Germany and France are accessible to European scientists from academia and industry on the basis of scientific excellence.

“In three years, some real breakthroughs have been realised because of the possibility to have access to big computers,” said Rivière. But with the US, China, and Japan making continuous strides, “It is essential for European countries to keep access to such capacity and to make the best use of HPC in the upcoming years,” she said.

Helping to design and plan

The automotive and aviation sectors are among the most experienced users, deploying HPC to improve the design and safety of their products. As a case in point, the French manufacturer Renault recently took part in a PRACE project to evaluate new methods for understanding how vehicles respond to the impact of a crash, in order to improve the safety of its cars. “This was world-first research in the automotive domain,” said Rivière.

A less glamorous but equally practical application would be in the area of flight scheduling, said Peter Hotham, Deputy Director, Operations & Programme at the SESAR Joint Undertaking, a public-private partnership between the EU, Eurocontrol (the body that supervises European airspace) and industry.

Hotham said HPC could help in “eliminating or understanding uncertainty”, both through improved weather forecasting and the simulation of alternative scenarios.

For example, during the 2010 Icelandic ash cloud crisis, the authorities knew little about the density of the ash, the altitude at which it sat, or the effect it would have on aircraft engines. “Perhaps HPC could have played a very interesting role in processing this data and creating models to assess whether flying was safe at various altitudes, potentially allowing a more rapid recovery from this disruptive event,” Hotham said.

Facing a data deluge

Big data has become the buzzword of 2013, with reports worldwide extolling the benefits for industry, research and public policy. Making sense of mountains of data may be supercomputers’ biggest role yet – and may spur vital investment.

“You can now have your genotype for less than one hundred dollars,” said Orozco. “The courier used to send the sample can often be more expensive than the test itself.” A huge amount of data is generated through genome sequencing. “Now the problem is to analyse such data; to manage it; to get predictive models and to use them to predict the behaviour of living organisms. And this is why we use computers,” he said.

Elisabetta Vaudano, Principal Scientific Officer at the Innovative Medicines Initiative (IMI), said contributions from HPC experts to the Strategic Research Agenda for IMI 2, a public-private partnership in pharmaceutical research, could be very valuable. IMI 2 will see a push towards personalised medicine, driven by a shift from diagnosing diseases by symptoms to molecular-based medicine, in which the underlying molecular and genetic characteristics of diseases are identified by objective diagnostics. “This means handling big data and the challenge of their proper exploitation,” said Vaudano.

Similarly, Gilles Ollier, Head of the Earth Observation Team at DG Research, said the community is facing “a deluge of observational data from a variety of sources”. With the next generation of satellites set to generate ten or fifteen times more data than their predecessors, the challenge is not only to analyse this information, but to store and access it, he said.

Moving forward

The importance of HPC to the competitiveness of European industry was illustrated by Andrea Beccari, Manager of the Drug Discovery Platform at Italian biopharmaceutical company Dompé, which recently used HPC to uncover a potential new use for an existing drug.

“We virtually screen all market-ready drugs for new indications,” said Beccari. The screen indicated that a molecule which has been on the market since 1981 for treating one disease could be a potential new treatment for prostate cancer. “Nobody knew that it also acts on the target of interest,” Beccari said. “This gives us an advantage from a competitive point of view.”
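Conceptually, such a repurposing screen reduces to scoring every approved molecule against a target of interest and ranking the results; the scoring step is the computationally expensive part, which is why it benefits from HPC. The Python sketch below is a minimal illustration of that loop, not Dompé's actual pipeline: the compound names are invented and the scoring function is a trivial placeholder standing in for a real docking or binding-affinity calculation.

```python
# Minimal sketch of a drug-repurposing virtual screen (illustrative only).
# In a real HPC setting, score_against_target would be an expensive docking
# or molecular-simulation calculation, distributed across many compute nodes.

from dataclasses import dataclass


@dataclass
class Compound:
    name: str            # marketed drug name (hypothetical entries below)
    approved_year: int   # year of first market approval


def score_against_target(compound: Compound) -> float:
    """Placeholder for a predicted-affinity score against the target of
    interest; higher means a stronger predicted interaction. A real
    implementation would evaluate 3D binding poses of the molecule,
    which is the heavy, parallelisable step."""
    return (sum(map(ord, compound.name)) % 100) / 100.0  # stand-in value


# Hypothetical library of market-ready drugs to re-screen.
library = [
    Compound("drug_a", 1981),
    Compound("drug_b", 1995),
    Compound("drug_c", 2004),
]

# Score every compound and rank by predicted affinity: the essence of
# "virtually screening all market-ready drugs for new indications".
ranked = sorted(library, key=score_against_target, reverse=True)
for c in ranked:
    print(f"{c.name} (approved {c.approved_year}): "
          f"score={score_against_target(c):.2f}")
```

The appeal of screening already-approved molecules is that any hit has a known safety profile, so a high-ranking candidate can move to clinical evaluation for the new indication far faster than a novel compound.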
