
The next generation of scientific computing

Image courtesy CERN, © 2013 CERN.

“IT today is woven into the fabric of science and business; it’s an integral part of research, engineering and enterprise,” says Herbert Cornelius of Intel. He was speaking at the ‘IT requirements for the next generation of research infrastructures workshop’, held at CERN on Friday 1 February 2013. The event brought together leading figures in the IT sector from both the industrial and academic research communities. Together, participants worked on outlining the IT requirements of the next generation of scientific research infrastructures in Europe, as well as discussing procurement plans and building a joint technology roadmap.

CERN was an ideal location to host the event, given the world-leading IT infrastructure needed to support the research efforts of the scientists working at the organisation. Sergio Bertolucci, director for research and scientific computing at CERN, heaped praise on the Worldwide LHC Computing Grid for the key role it played in enabling the discovery of the new Higgs-like boson last year. “It has worked well beyond specification... it was fundamental to the success,” he says.

Laurence Field, IT and data management topic leader for the CRISP project, elaborated on the challenges faced by IT infrastructure in facilitating scientific research at large physics research organisations like CERN. “With each decade, data rates increase,” says Field. “As we go towards 2020, we’re looking at data rates of terabytes per second.”

“The complete data set for the previous accelerator here at CERN, the LEP collider, was just a few terabytes, whereas just the first year of data for the Large Hadron Collider was in the order of petabytes,” says Field. “We really have to understand how we can deal with these increasing data rates.”

Of course, increasing experimental data isn’t just a challenge faced by CERN. Nor is it a challenge unique to particle accelerator facilities. During his presentation, Field also cited the Square Kilometre Array as a big-science project that will have to deal with enormous data rates once it comes online at the start of the next decade. The life sciences and other research fields are seeing similar explosions in data, says James Hughes of Huawei. “We are really standing at a crossroads when it comes to storage,” he concludes.

Michael Krisch, coordinator of the CRISP project, also spoke at the event about the project and its goal of “bringing together communities from different research backgrounds to try to create long-term synergies.” Industry has an important role to play in this, says Krisch: it may be a supplier of hardware and software, a beneficiary of expertise and technology developed by research infrastructures, and a user of those research infrastructures itself.

Bob Jones, the head of CERN openlab, also highlighted the importance of organisations from the public and private sectors working together to tackle the IT challenges the next generation of research infrastructures is likely to create. “We see this as a win-win public-private partnership between research and industry,” says Jones.

“We [at CERN openlab] investigate state-of-the-art technologies which are often not yet available on the market. Essentially, what we’re trying to do is take those future technologies which are not yet commercialised and test them in this environment at CERN. We basically break them — we’re quite good at doing this,” says Jones. “But we tell the companies why the technologies are broken and how they can improve them.

“It’s not only testing, though,” he adds. “By doing this we’re helping the companies to improve their products and services.” Jones goes on to say that CERN essentially acts as an example use case for this state-of-the-art technology: “By using CERN openlab as a showcase, companies can then promote their products and their services to other labs and different business sectors.”

Monica Marinucci, Oracle’s director of research for Europe, Middle East, and Africa, expanded on the benefits of such partnerships from the perspective of industry. “We do this because organisations like the ones we’ve been seeing today are ahead of the game in terms of the challenges and the requirements they have,” she says. “It’s really about pushing the limits of the technology.”

Finally, Konstantinos Glinos, head of the European Commission’s eInfrastructure unit, highlighted the importance of adopting a co-design approach, whereby the research organisations that use industry’s products and services work closely with the companies producing them during development and give detailed feedback at regular intervals. He believes this approach is key to putting in place the IT systems needed to support the next generation of research infrastructures. “Both parties benefit from co-design,” says Glinos. “Industry has a product that is much more likely to fit the needs of their users, and users have a product that fits their needs.” Glinos also suggested that pre-commercial procurement, a system popular in the US whereby public research users drive innovation from the demand side by acting as technologically demanding first buyers, could play an important role in future.

However, Glinos also echoed the arguments of other participants in saying that the biggest future challenge to overcome is probably the need for significant improvements in the energy efficiency of IT infrastructures. This was one of several main challenges for the community highlighted by Jones during a brief roundup of the day’s workshop. While he acknowledged the efforts of the European Commission through its Smart Cities initiative and described the IT industry as being “active” in this research area, he warned that big data could pose significant challenges, both in terms of storage and processing.

Echoing a point raised by Hervé Mouren, managing director of Teratec, Jones says: “In research, we’re moving from a knowledge-driven model of analysis to a data-driven approach and it will require massive IT resources in order to do this.” Consequently, he asks: “Where’s the electricity going to come from?”
