iSGTW Feature - A Decade of Globus in Science

TeraShake 2 simulation of magnitude 7.7 earthquake, created by scientists at the Southern California Earthquake Center and the San Diego Supercomputer Center.
Simulation: SCEC scientists Kim Olsen, Steven Day, et al., SDSU; Yifeng Cui et al., SDSC/UCSD. Visualization: Amit Chourasia, SDSC/UCSD.

Simulating tens of thousands of possible earthquakes shaking Los Angeles, blood flow through realistic human arteries, or the effect of radiation treatment on cancerous tumors. Searching for new subatomic particles, predicting severe storms and hurricanes, or studying supernovae observed from many different telescopes.

Over the past ten years, grid computing and the Globus Toolkit have made these scientific research projects – and hundreds more like them – easier, faster, and in some cases possible for the very first time.

The first funding for work on Globus was granted in August 1996 by the U.S. Defense Advanced Research Projects Agency. The project, envisioned to create software to bridge the gap between applications and a distributed resource environment enabled by new high-speed networks, evolved into the Globus Toolkit. This open-source software toolkit has since been used by thousands of engineers and developers to simplify the creation of grid systems and applications.

“One thing that has made Globus fun is that we’ve always had a wide range of applications,” says Ian Foster from the University of Chicago and Argonne National Laboratory, who, with Carl Kesselman from the Information Sciences Institute at the University of Southern California, pioneered the Globus project. “One of the first that I can remember was developed by the Aerospace Corporation, and concerned real-time analysis of data from a weather satellite.”

By 1998, grid computing and the Globus Toolkit had advanced enough for scientists to showcase several advanced applications at that year’s Supercomputing conference, including real-time reconstruction and visualization of experimental data from the Advanced Photon Source and high-throughput computational chemistry. The turn of the millennium saw the start of several grid infrastructure projects – such as the TeraGrid, Grid Physics Network, International Virtual Data Grid Laboratory, Particle Physics Data Grid and European Data Grid – that brought Globus technologies to an ever-expanding pool of scientists.

“Some of the most scientifically significant applications are the least visible,” adds Foster, “such as the data replication system that moves terabytes of data from the LIGO gravitational wave observatory. But the applications that I probably like best are those that yield results of societal benefit, like the Earth System Grid’s delivery of climate simulation data.”

Today, the Globus Toolkit is used by thousands of scientists around the world. Some, such as those from the Southern California Earthquake Center, partner with Globus researchers to test new grid tools and techniques and provide feedback on the Toolkit’s use in scientific research. This work with scientific applications has been a hallmark of the Globus project throughout its 10 years.

“In the past, we haven’t had the computational capability to do full-scale calculations of very large earthquakes,” explains Thomas Jordan, SCEC director. “Our computing needs span all the different types: from capability computing to perform hundreds of thousands of large simulations to data-intensive computing where we manipulate very large data volumes. The Globus software stack provides the cyberinfrastructure that allows us to use machines across the TeraGrid and to manage our complex workflows.”

View a list of Globus contributors and learn more about Globus at the Globus Alliance Web site.

-Katie Yurkewicz, iSGTW

