Scientists building computer simulations of real-world processes often face the problem that nature operates at huge scales and minute scales at the same time. Even with today’s increased computing power, simulating large-scale phenomena at the finest level of detail can be costly and slow. One solution is to treat these phenomena as ‘multi-scale’ – simulating them at a number of scales separately – and then ‘gluing’ the parts together.
“Nature is multi-scale. We’re living in a multi-scale world, which means physical processes are different at different scales,” said Krzysztof Kurowski, a researcher at the Poznan Supercomputing and Networking Center in Poland.
Researchers in scientific computation have been attempting multi-scale analysis for almost 15 years. Today, the Multiscale Applications on European E-infrastructures (MAPPER) project, funded under the European Commission’s Seventh Framework Programme (FP7), enables the automatic and seamless integration of simulations of natural phenomena for researchers.
“There were really good chemical physics papers that explained how to do multi-scale simulations; separately, there were other really good biology papers, or engineering papers, that did likewise,” said Alfons Hoekstra, project coordinator for MAPPER from the University of Amsterdam.
Hoekstra said that stepping back to see the big picture holds the key. “The challenge is basically always the same – having to couple the different processes, each happening on different size and time scales.”
“Think about crack propagation in materials science: on one scale it’s a structural mechanics problem. Near the crack itself, you want to look at the molecular dynamics. But, the actual breaking that makes the crack goes down to chemical bonds – that’s quantum mechanics,” Hoekstra said.
Solving quantum mechanics calculations is computationally intensive – typically requiring high-performance computers.
“The difficulty is that high-performance computer time is limited. Not every part of your simulation will require or even scale to high-performance computers, although there will always be some elements of your model that require them,” Hoekstra said. “Other elements will happily run on a smaller cluster, or even a desktop machine. So this basically cries out for a new paradigm, which is what we at MAPPER call ‘distributed multi-scale computing’.”
MAPPER is a software application that breaks up the task of simulating multi-scale processes into several single-scale problems and decides which type of system is best for the job. It then farms out the processing to the appropriate system, whether that’s a high-performance computer or a network of desktop machines, resulting in a more efficient approach to multi-scale computing.
“Computationally, you need a way to glue the scales together, which is what we’re doing at MAPPER,” Hoekstra said.
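The paradigm Hoekstra describes can be sketched in a few lines of code. The sketch below is purely illustrative and assumes nothing about MAPPER’s actual software: the submodel names, resource classes, and coupling formulas are all hypothetical. It shows the two ideas in the quotes above – farming each single-scale submodel out to the most modest machine that suffices, and ‘gluing’ a coarse and a fine scale together by exchanging data each iteration.

```python
# Illustrative sketch of distributed multi-scale computing (hypothetical,
# not MAPPER's actual API): split a multi-scale simulation into
# single-scale submodels, place each on a suitable resource, and couple
# the scales by exchanging data between steps.

from dataclasses import dataclass


@dataclass
class Submodel:
    name: str
    needs_hpc: bool  # does this scale require a high-performance machine?


def place(model: Submodel) -> str:
    """Assign the submodel to the cheapest resource that can run it."""
    return "hpc" if model.needs_hpc else "desktop"


# The crack-propagation decomposition Hoekstra describes:
submodels = [
    Submodel("structural_mechanics", needs_hpc=False),
    Submodel("molecular_dynamics", needs_hpc=True),
    Submodel("quantum_mechanics", needs_hpc=True),
]
placement = {m.name: place(m) for m in submodels}

# A toy coupling loop: the coarse model passes a boundary condition down,
# the fine model returns an effective parameter back up. The formulas are
# invented solely so the loop converges to a fixed point.
def coarse_step(effective_stiffness: float) -> float:
    return 1.0 / effective_stiffness      # hypothetical boundary load


def fine_step(boundary_load: float) -> float:
    return 2.0 + 0.1 * boundary_load      # hypothetical stiffness update


stiffness = 2.0
for _ in range(5):
    load = coarse_step(stiffness)
    stiffness = fine_step(load)

print(placement)
print(round(stiffness, 3))
```

In a real coupled simulation, each submodel would of course be a full solver running on its assigned resource, with the scheduler and the data exchange handled by middleware rather than a Python loop; the sketch only captures the shape of the idea.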
An early success of MAPPER has been accurately simulating blood flow in the stented arteries of atherosclerosis patients. A stent is an artificial tube inserted to reopen arteries narrowed by the disease, which is caused by a build-up of cholesterol. All stented arteries undergo some narrowing post-surgery, but some patients develop excess scar tissue. Their arteries become completely blocked again, requiring further surgery.
By looking at the biology of the individual patient, the structural mechanics of their artery walls, and the fluid dynamics of their blood flow, MAPPER is helping researchers better understand these complications and predict which patients are likely to develop scar tissue. This helps doctors decide whether to prescribe drugs that reduce scar development, or to explore stents made from newer materials in just those patients who experience these difficulties.
However, the project hasn’t been without its challenges as it tries to give researchers simultaneous access to incompatible e-infrastructures.
“It’s important to emphasize the fact that existing e-infrastructures in Europe, such as the European Grid Infrastructure (EGI) and the Partnership for Advanced Computing in Europe (PRACE), have slightly different services and tools available for developers and end users,” Kurowski said.
Since its start almost two years ago, the project has successfully tested the system in five scientific domains, ranging from hydrology to nuclear fusion.
“Trying to seamlessly provide simultaneous access is difficult both technologically and due to policy differences,” Kurowski said, “[…] but we’re also delivering new capabilities so that simulations can be run in a transparent way.”
MAPPER is currently working with both EGI and PRACE – the two largest e-infrastructure providers – to deliver seamless access.
“MAPPER really is driven by the applications; the infrastructure comes after,” Hoekstra said. “We’re providing not only the tools, but also the services needed to enable European scientists to model and understand these complex phenomena.”