Physicists in search of gravitational waves face a tough question: Will they know a gravitational wave when they see it, given that they’ve never seen one before?
One way that they prepare for the day when they detect the real thing is via blind injection tests, in which a fake signal is secretly added to the data in order to test the detector and analysis. The Laser Interferometer Gravitational Wave Observatory (LIGO) recently passed just such a test, with the help of their extensive computing infrastructure.
Gravitational waves are a prediction of Einstein's General Theory of Relativity. These waves have never been directly observed, although there is strong evidence for their existence based on observations of binary pulsars — rapidly spinning neutron stars which orbit each other. The LIGO Scientific Collaboration (LSC) and the Virgo Collaboration are jointly searching for the first direct observations of gravitational waves using kilometer-scale interferometric detectors. Once these waves are observed, they will open a new window on the universe which is complementary to traditional electromagnetic astronomy.
The LSC and the Virgo Collaboration conducted their latest joint observation run (using the LIGO Hanford, LIGO Livingston, Virgo and GEO 600 detectors) from July 2009 through October 2010. Unbeknownst to the LSC and Virgo teams, a fake gravitational-wave signal was injected into the real data stream. (Such signals are meant to enable an end-to-end test of the LSC and Virgo’s detection capabilities.)
The analysis of this latest science run generated a great deal of excitement within the gravitational-wave astronomy community when it revealed what appeared to be a gravitational-wave signal.
Although it was disappointing to learn that the signal was a blind injection rather than a real gravitational wave, correctly identifying it is an important milestone. The success of the test speaks not only to the quality of the scientific methodology used by the gravitational-wave physicists, the algorithms they have developed, and the very complex workflows that process streams of data, but also to the computing infrastructure and tools used.
The LSC maintains its own computing infrastructure, the LIGO Data Grid, and also uses the Open Science Grid to perform computations. The grids provide fundamental middleware such as Condor and Globus. However, in order to support complex computations, additional services are needed.
One example of an LSC application is the workflow developed by the Compact Binary Coalescence (CBC) group, which searches for compact binary inspiral signals. These workflows are complex in the number of tasks they include (over 1.5 million jobs), in their dependencies, and in the size of the datasets being analyzed (approximately 10 TB). Such workflows require automation, reliability, data management, and portability to make use of the available computing and to provide a reasonable turnaround time.
To complete this task, the LSC and Virgo joint CBC search group are using the Pegasus Workflow Management System, developed at the University of Southern California’s Information Sciences Institute and University of Wisconsin-Madison, to manage the workflows running on LIGO clusters at Caltech, Syracuse University, University of Wisconsin-Milwaukee, the Albert Einstein Institute, and other sites.
LIGO distributes a copy of its instrument data to each partner site so that workflows executing on the resources at those sites have easy access to the input data. Then, Pegasus discovers the necessary data products and feeds them to the computations described in the workflows.
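The data-discovery step can be pictured as a lookup from the logical file names a workflow refers to, to the physical paths where copies live at each partner site. The sketch below is a minimal illustration of that idea; the file names, site names, and paths are made up, and this is not Pegasus's actual replica-catalog interface.

```python
# Minimal sketch of logical-to-physical file resolution, in the spirit of a
# replica catalog. All names and paths here are hypothetical examples.

replica_catalog = {
    "frame_0001.gwf": {
        "Caltech":  "/data/ligo/frames/frame_0001.gwf",
        "Syracuse": "/archive/frames/frame_0001.gwf",
    },
}

def resolve(logical_name, site):
    """Return the physical path of a logical file at a given execution site."""
    return replica_catalog[logical_name][site]

# A workflow running at Syracuse asks where its input frame file lives:
print(resolve("frame_0001.gwf", "Syracuse"))
# /archive/frames/frame_0001.gwf
```

Because each site holds its own copy of the instrument data, the same logical workflow can run anywhere the catalog knows about, without hard-coding paths.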
Another issue faced by LIGO-Virgo workflows is the computational granularity of the tasks in the workflow. When task runtimes are short, the overheads incurred in sending each task to a computing resource are comparatively high; these overheads include managing the task in the workflow execution system, transferring it to the resource, and waiting in the resource's scheduling queue.
The solution? Pegasus automatically clusters tasks into larger entities when generating the executable workflow. The resulting workflow is smaller, with fewer tasks and dependencies, and each task has more computation to do.
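The clustering idea can be sketched in a few lines: group short tasks until each group's combined runtime is large enough to amortize the per-task overhead. This is an illustration of the general technique, not Pegasus's implementation; the task names, runtimes, and threshold are hypothetical.

```python
# Sketch of horizontal task clustering: merge many short tasks into fewer,
# longer-running units so scheduling overhead is paid once per cluster.

def cluster_tasks(tasks, target_runtime):
    """Group tasks into clusters whose combined estimated runtime
    reaches target_runtime (in seconds)."""
    clusters, current, total = [], [], 0
    for task in tasks:
        current.append(task)
        total += task["runtime"]
        if total >= target_runtime:
            clusters.append(current)
            current, total = [], 0
    if current:  # flush any remaining partial cluster
        clusters.append(current)
    return clusters

# 1,000 ten-second tasks, clustered into units of at least 1,000 seconds:
tasks = [{"name": f"inspiral_{i}", "runtime": 10} for i in range(1000)]
clusters = cluster_tasks(tasks, target_runtime=1000)
print(len(clusters))  # 10 clusters of 100 tasks each
```

With queue waits often measured in minutes, submitting 10 thousand-second clusters instead of 1,000 ten-second jobs removes the overhead from all but a handful of submissions.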
In LIGO, as in other projects that deal with data collection, the data may need to be re-calibrated or cleaned more than once if some undesirable artifacts make it into the community data sets. Some parts of the data get “vetoed” and need to be eliminated from the analysis; when that happens, large amounts of redundant work may need to be redone.
Since the corresponding workflows can be very time-consuming, the ability to redo only the affected portions of the workflow is critical. Pegasus provides this capability by registering data as it is being produced by the workflow and then only redoing the computations that were affected by the vetoed data. This capability also supports workflow-level checkpointing.
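This partial re-execution can be pictured as a reachability computation over the workflow graph: any task that consumes vetoed data, plus everything downstream of it, must be redone, while all other tasks can reuse their registered outputs. The DAG, task names, and file names below are hypothetical; this is a sketch of the idea, not Pegasus's actual data-registration mechanism.

```python
# Sketch of selective re-execution after a data veto: find the tasks that
# read vetoed files, then propagate "must re-run" to all their descendants.

def tasks_to_rerun(dag, vetoed_inputs):
    """dag maps each task name to (input_files, child_tasks). Return the
    set of tasks that consume vetoed data, plus all of their descendants."""
    affected = {task for task, (inputs, _) in dag.items()
                if inputs & vetoed_inputs}
    stack = list(affected)
    while stack:  # depth-first walk over downstream tasks
        task = stack.pop()
        for child in dag[task][1]:
            if child not in affected:
                affected.add(child)
                stack.append(child)
    return affected

# Hypothetical pipeline: two inspiral searches feeding a coincidence step.
dag = {
    "inspiral_A": ({"frame1.gwf"}, ["coinc"]),
    "inspiral_B": ({"frame2.gwf"}, ["coinc"]),
    "coinc":      (set(),          []),
}
print(sorted(tasks_to_rerun(dag, {"frame2.gwf"})))
# ['coinc', 'inspiral_B'] — inspiral_A's registered output is reused
```

Only the branch that touched the vetoed frame file (and the coincidence step downstream of it) is re-executed; the rest of the workflow's results stand.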
The overall effect of implementing Pegasus has been a more efficient workflow. This achievement is likely to continue to have value when the next generation of gravitational-wave detectors — Advanced LIGO and Advanced Virgo — come online in the next three to five years. These detectors will be ten times more sensitive than previous generations, and they will come with an increased need for computing efficiency and power.
Based on scientists' current understanding of the abundance of gravitational-wave sources, Advanced LIGO and Virgo will certainly find gravitational waves. The LIGO and Virgo teams are now more prepared for that moment than ever, thanks to the blind injection challenge. We are looking forward to that day, not only for the achievement, but also because these detections will allow scientists and astronomers to probe the nature of gravity and explore the universe in a completely new way.