Feature - BiG Grid’s big idea
With the EGI Technical Forum coming up next week, iSGTW thought this would be a good time to learn more about one of the Forum’s sponsors, BiG Grid.
Modern detectors, medical imaging instruments and micro-arrays produce huge volumes of data, far beyond the storage capacity of conventional computing, and so they call for ever-larger infrastructure.
To help solve this problem, the Netherlands-based BiG Grid project is turning to the grid as a place to combine and analyze data, allowing scientists to conduct research in a wide range of disciplines. BiG Grid is a collaborative effort between Nikhef (the National Institute for Sub-atomic Physics), NBIC (the Netherlands Bioinformatics Center) and NCF (the National Computing Facilities foundation). “Our goal is to build and roll out a nationwide, grid-based, e-science infrastructure,” said Arjen van Rijn, chairman of the BiG Grid executive team.
However, while areas like particle physics make clear demands of the infrastructure, scientists from other research communities are often less familiar with large central facilities.
Therefore, understanding researchers’ shared requirements is crucial.
BiG Grid does so by talking with the users themselves, asking “what they need for their projects to make maximum use of the infrastructure,” said Maurice Bouwhuis, leader of e-science support at SARA, the Netherlands national High Performance Computing and e-Science Support Center — BiG Grid’s principal operational partner.
Such cross-collaborations can offer many real-world benefits. For example, BiG Grid is working with a project tackling the problem of bird strikes on aircraft (see 13 January iSGTW). Data from a range of different sensors, from large military radar to a small GPS tracker banded to a bird’s leg, are combined on BiG Grid’s infrastructure. The result allows researchers to predict the size and timing of major bird migrations near airports, ultimately helping air traffic controllers prevent plane crashes.
Despite the name, BiG Grid is now moving beyond just grid computing. The project seeks to be flexible and tailored to users’ needs; consequently, part of the hardware infrastructure will be cloud accessible.
A distributed infrastructure
The project’s four central facilities each provide large-scale computing and storage, while also allowing for more specialized work. This arrangement places significant demands on the infrastructure. Therefore, in addition to the central facilities, there are 12 small-scale computing and storage clusters, located within centers for life science research throughout the Netherlands.
To use this infrastructure effectively, it is necessary to balance the demands of networking, computing power and storage. “Building bridges between scientific communities is one of the main things we do,” said Bouwhuis.
“Programmers from BiG Grid work together with scientists from the domains themselves to make sure that the scientists are able to run their application on the grid without needing great technical expertise. So it’s a multi-tier approach — there’s the BiG Grid scientific programmer, there’s the domain scientific programmer, and at the top there’s the scientist.”
This work is designed to complement rather than replace existing research infrastructures. “There’s a whole ecosystem of facilities based on ICT (information and communication technology),” explained van Rijn. “We want the BiG Grid infrastructure to be part of the wider ICT infrastructure for scientific research.”
He said: “In specific application domains, like accelerator experiments at CERN, the BiG Grid project acts as an extension of scientific research tools, and enables scientists to cross between research infrastructures.”
BiG Grid’s ultimate goal?
“We want to make it so that anyone who has a central set of data that he wants to distribute can use the grid.”
—Emilie Tanke for iSGTW. Excerpted from an article in Projects Magazine by Arjen van Rijn and Maurice Bouwhuis, BiG Grid.