Sky map from the AMANDA neutrino telescope showing the 1,112 atmospheric neutrino candidates observed, plotted in coordinates of right ascension and declination. The locally produced atmospheric neutrino background detected to date is quite uniform, without strong sources. The scale on the right reflects the excess or deficit from the mean number of background events.
Image courtesy of Andrea Silvestri, AMANDA, UCI
As scientists explore the most violent phenomena in the universe, the tiny, almost massless particles called neutrinos are valuable messengers, traveling unimpeded across the vast distances of space.
"By analyzing neutrino data on the TeraGrid we've been able to validate a new kind of 'telescope' that gives a dramatic new view of the universe through the window of high-energy neutrinos," says University of California at Irvine astrophysicist Andrea Silvestri. "This can help scientists unravel a number of longstanding mysteries about the origin and apparent surplus of the highest-energy cosmic rays."
AMANDA, the Antarctic Muon and Neutrino Detector Array, uses light detectors embedded in Antarctic ice to see cascades of light-emitting particles occasionally generated as hard-to-detect neutrinos travel through the ice.
But in searching for high-energy neutrinos, the AMANDA telescope is producing a flood of data: 15 terabytes in 2003 and more than 30 terabytes per year since. To analyze all this data, Steven Barwick and Silvestri of UCI have turned to the large-scale data and computing capabilities of the TeraGrid.
The researchers stored the raw AMANDA data in an archive at SDSC managed by the Storage Resource Broker. Then, on TeraGrid clusters at SDSC and NCSA, they carried out a two-part analysis: about one-third of their allocation went to reconstructing the neutrino events from the observed data, and the remaining two-thirds to refining the analysis by generating massive simulated datasets with the statistical Monte Carlo method. The demanding computations have required more than 360,000 CPU hours.
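Neither the AMANDA reconstruction software nor its production Monte Carlo is shown here; the short Python sketch below only illustrates the general idea behind Monte Carlo background generation: draw isotropic arrival directions for simulated atmospheric-neutrino events and count how many fall inside a search window, the kind of expectation against which an observed excess would be compared. All function names and numbers are illustrative assumptions, not the actual AMANDA code.

    import math
    import random

    def simulate_atmospheric_background(n_events, seed=0):
        """Toy Monte Carlo: draw isotropic arrival directions for simulated
        atmospheric-neutrino background events (illustration only)."""
        rng = random.Random(seed)
        events = []
        for _ in range(n_events):
            right_ascension = rng.uniform(0.0, 360.0)                      # degrees
            declination = math.degrees(math.asin(rng.uniform(-1.0, 1.0)))  # uniform on the sphere
            events.append((right_ascension, declination))
        return events

    def angular_separation(ra1, dec1, ra2, dec2):
        """Great-circle separation in degrees (spherical law of cosines)."""
        ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
        cos_sep = (math.sin(dec1) * math.sin(dec2)
                   + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

    def count_in_window(events, ra_center, dec_center, radius_deg):
        """Count simulated events inside a circular search window; comparing this
        expectation with the observed count is how a point-source excess is judged."""
        return sum(1 for ra, dec in events
                   if angular_separation(ra, dec, ra_center, dec_center) <= radius_deg)

    # Example: expected background in a 10-degree window, for 1,112 simulated events.
    background = simulate_atmospheric_background(1112)
    print(count_in_window(background, ra_center=180.0, dec_center=45.0, radius_deg=10.0))

In the real analysis, the simulated sample must be far larger than the observed one so that statistical fluctuations of the background estimate are small, which is why the simulation side consumed the larger share of the allocation.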
Due to Antarctica's remoteness, the AMANDA data set collected each year is not delivered until the following year. From two billion events in the data from one year, the researchers were able to tease out the faint signal of some 1,100 neutrinos, validating the telescope and analysis steps. After developing and testing new codes, the physicists are now undertaking the demanding analysis of the combined 80 terabytes of data from the three most recent years available (through 2005), which will require more than 600,000 CPU hours on 512 processors of the TeraGrid cluster at SDSC. Having so far found no extra-terrestrial neutrinos, they have been able to rule out some theoretical models that predict larger neutrino fluxes.
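The selection criteria themselves are not described in this article; as a rough, hypothetical sketch of how billions of triggered events are whittled down to roughly a thousand neutrino candidates, the Python fragment below applies a short chain of quality cuts to toy event records. The field names and thresholds are invented for illustration and are not the actual AMANDA criteria.

    def passes_quality_cuts(event):
        """Hypothetical selection cuts; the real analysis applies far more
        sophisticated reconstruction-quality criteria."""
        return (event["n_hit_modules"] >= 20       # enough optical modules fired
                and event["fit_quality"] < 7.5     # well-reconstructed muon track
                and event["zenith_deg"] > 90.0)    # up-going: the track crossed the Earth

    def select_neutrino_candidates(events):
        """Reduce a large stream of triggered events to a small set of neutrino
        candidates, illustrating the billion-to-thousand reduction described above."""
        return [e for e in events if passes_quality_cuts(e)]

    # Example with a single toy event record.
    example = {"n_hit_modules": 35, "fit_quality": 6.2, "zenith_deg": 120.0}
    print(select_neutrino_candidates([example]))  # kept only if it passes every cut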
In the future, the research will be extended in the National Science Foundation IceCube project, a much larger one-kilometer cube telescope array that will give a more realistic chance of detecting the elusive extra-terrestrial neutrinos. When it reaches full size in 2011, IceCube will produce 10 times the data, 150 terabytes annually, driving the need for even larger TeraGrid data and computational resources.
This article appeared as a 2006 Science Highlight on the TeraGrid Web site.
- Paul Tooby, San Diego Supercomputer Center