Natural disasters such as volcanic eruptions or asteroid impacts may be rare, but the first half of 2011 has shown how catastrophic they can be. So what is being done to monitor these threats? With the help of powerful computing infrastructures, researchers are developing better tools for just that. ISGTW has identified six promising applications.
There are over 7,000 known near-Earth asteroids. If a one-kilometer-diameter meteorite were to hit the Earth, it would instantly kill not only people at the impact site, but also those hundreds of miles away, said Tim Spahr of the Minor Planet Center in Boston, USA. His team tracks asteroids in the sky 24 hours a day, seven days a week in order to provide an early warning system.
The Minor Planet Center uses data taken from telescopes here on Earth, such as the Steward Observatory in Arizona, USA. They use custom-designed software and an interlinked computing cluster that replicates the parallel computation of a larger grid to process and analyze this information.
To track the millions of asteroids above the Earth, they have eight ‘hyperthreaded’ 2.24 GHz quad-core machines. This means that their software applications, written in FORTRAN, can run in parallel and track the majority of near-Earth objects more accurately.
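The shape of such a workload is naturally parallel: the catalogue of tracked objects can be split across worker processes, each advancing its share of orbits independently. The sketch below is purely illustrative Python, not the Center’s actual FORTRAN software; the `propagate` routine and its crude linear step are stand-ins for real orbit propagation.

```python
from multiprocessing import Pool

def propagate(obj):
    """Advance one object's state by a single 60 s time step (a crude
    linear stand-in for a real orbit-propagation routine)."""
    name, pos, vel = obj
    dt = 60.0
    new_pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return name, new_pos, vel

# a toy catalogue; real runs would hold many thousands of objects
objects = [("example object", (1.0e7, 0.0, 0.0), (0.0, 7.5e3, 0.0))]

if __name__ == "__main__":
    # eight workers, mirroring the idea of splitting the catalogue
    # across the cores of the tracking machines
    with Pool(8) as pool:
        updated = pool.map(propagate, objects)
```

Each worker gets an independent slice of the catalogue, so throughput scales with the number of cores as long as the objects do not interact.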
“We are now in the process of converting everything to more modern software, but that is a very, very time consuming job. Nearly all of our software was custom-written over the years,” said Spahr.
The big test of their system came on 7 October 2008, when they tracked an impending asteroid impact in a remote area of the Nubian Desert in Africa, alerting the US Government to it. The meteor exploded above the ground in an airburst with a force of about one kiloton of TNT – luckily, no one was in the vicinity. “I thought we would only get such an event every 20 years ... it may be more frequent than I expected,” said Spahr.
Even though asteroid impacts are rare, Spahr believes that constant vigilance is necessary. “Governments of the world spend money making sure planes are safe and reliable. We keep the Earth safe as well, simply because we are funded to do so. It is a very small budget compared to airline safety budgets, but the amount of money is comparable to the risk involved. Such an event can be predicted if we collect enough data,” he said.
Domenico Vicinanza listens out for volcanic eruptions, literally. He’s a network engineer at the Delivery of Advanced Network Technology to Europe (DANTE) organisation in Cambridge, UK, who works on volcano sonification – a process that takes recorded seismic data from volcanic eruptions around the world and converts it into sound. He hopes to identify patterns within the audio that indicate impending eruptions.
Digital seismographs sampling at a rate of 100 Hz are placed near a volcano’s surface to provide raw seismic recordings, which Vicinanza converts into audible sounds using sonification algorithms that run on grid computing infrastructure.
“The complex sonification and waveform analysis algorithms require powerful computation to generate the audible sounds, to compute the spectra and analyze and compare them,” said Vicinanza.
“Graphs show us how the spectrum of generated sound changes when approaching an eruption. The speed with which the high energy spectral lines appear and their shape can provide precious information about the energy distribution of the vibration, the kind of eruption and its power.”
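The core idea can be sketched minimally, assuming the simplest form of sonification (replaying the 100 Hz seismic samples fast enough to shift low-frequency tremor into the audible band) and a plain FFT for the spectrum; the actual algorithms Vicinanza describes are far more sophisticated.

```python
import numpy as np

def sonify(seismic, sample_rate=100, speedup=441):
    """Audify a 100 Hz seismic trace by replaying it `speedup` times
    faster: a 2 Hz tremor becomes an audible 882 Hz tone."""
    peak = np.max(np.abs(seismic))
    scale = peak if peak > 0 else 1.0
    return seismic / scale, sample_rate * speedup  # samples, audio rate

def spectrum(signal, rate):
    """Frequency bins and magnitude spectrum of a real signal."""
    mags = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
    return freqs, mags

# synthetic tremor: a 2 Hz oscillation sampled at 100 Hz for 10 s
t = np.arange(0, 10, 0.01)
trace = np.sin(2 * np.pi * 2 * t)
audio, audio_rate = sonify(trace)
freqs, mags = spectrum(audio, audio_rate)
dominant = freqs[np.argmax(mags)]  # the tremor, shifted into the audible band
```

Watching how the high-energy lines of such a spectrum appear and sharpen over successive windows is, in essence, what the eruption-precursor analysis looks for.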
He then performs analysis to correlate seismic activity with spectral changes in the graphs. Eventually this will provide Vicinanza with alternative ways to predict upcoming eruptions, though as yet there is no systematic real-time monitoring of volcanic activity using this technique.
Scientists at NASA’s Jet Propulsion Laboratory in California use Japan’s GEONET, the world’s most densely packed network of GPS receiving stations, to monitor how ocean tsunamis affect the upper atmosphere and ionosphere, said David Galvan, a NASA researcher.
The GPS stations are capable of measuring the number of free electrons in the ionosphere along the path between a GPS satellite and the ground station. Changes in this electron content are caused by upper atmospheric winds, major storms, acoustic waves (produced by earthquakes), atmospheric gravity waves (produced by tsunamis), and even nuclear explosions.
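The measurement works because the ionosphere is dispersive: it delays the two GPS carrier frequencies by different amounts, and that difference is proportional to the electron content along the path. The sketch below uses the standard dual-frequency relation for slant total electron content (a textbook formula, not anything specific to NASA’s software):

```python
# Dual-frequency pseudorange difference -> slant total electron content.
F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1227.60e6  # GPS L2 carrier frequency, Hz
K = 40.3        # ionospheric refraction constant, m^3/s^2

def slant_tec(p1_m, p2_m):
    """Estimate slant TEC (electrons/m^2) from L1/L2 pseudoranges in
    meters: the ionosphere delays L2 more than L1, so the range
    difference encodes the electron content along the signal path."""
    return (F1**2 * F2**2) / (K * (F1**2 - F2**2)) * (p2_m - p1_m)

# one TEC unit (1e16 electrons/m^2) corresponds to roughly 0.105 m of
# extra differential delay between L2 and L1
tecu = slant_tec(p1_m=0.0, p2_m=0.105) / 1e16
```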
Galvan and his team use software called Global Ionosphere Modeling (GIM), which runs on a cluster of Linux-based computers. GIM combines electron-content observations from ground detectors and satellites so that accurate values can be estimated.
Galvan used this system to track the recent Japanese tsunami. “The Japan earthquake caused an acoustic wave in the atmosphere that sped away from the epicenter in all directions, travelling very fast, at about 1,000 meters per second. The tsunami moved away from the epicenter at about 200 meters per second and created a gravity wave in the atmosphere travelling at the same speed,” said Galvan.
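Those two speeds imply a useful lead time: the acoustic signature outruns the tsunami-driven gravity wave five-fold, so a distant GPS station sees the atmospheric disturbance well before the wave itself arrives. A back-of-the-envelope sketch using the figures Galvan quotes:

```python
def arrival_lag(distance_km, v_acoustic=1000.0, v_gravity=200.0):
    """Minutes between the fast acoustic signature and the slower
    tsunami-driven gravity wave at a station `distance_km` from the
    epicenter, using the quoted speeds in meters per second."""
    d = distance_km * 1000.0
    return (d / v_gravity - d / v_acoustic) / 60.0

lead = arrival_lag(600)  # 40 minutes of warning at 600 km
```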
For now, tracking is not done in real time. There was about a three-hour delay between when the data was captured by sensors and when it was archived by the GEONET network for download. However, “our ability to observe the ionospheric signature of an ocean tsunami using GPS receivers may one day become useful in real-time monitoring of tsunamis,” said Galvan.
For a real-time network, they would need to use a dense array of GPS receivers that output data in real time such as NASA’s Global Differential GPS network. “The more islands or coasts that have these receivers, the more likely we will be able to observe ionospheric disturbances. GPS receivers can potentially observe the tsunami signature from shore even when the tsunami is hundreds of kilometers from the coast,” Galvan said.
During 2009 and 2010, Joshua Wurman led a study into the Verification of the Origins of Rotation in Tornadoes Experiment (Vortex2) that involved hundreds of researchers.
Wurman, president of the Center for Severe Weather Research (CSWR) in Wyoming, USA, has invented a variety of radars for tracking tornadoes, such as Doppler-On-Wheels (DOW), trucks outfitted with weather radars. The trucks surround a supercell before the tornado forms, to capture and analyze information in real time.
“It was by far the largest and most diverse network of instrumentation ever deployed on tornadic storms, with about 50 vehicles crewed by 100 scientists, students and engineers,” said Wurman.
While the Vortex2 project finished last year, around 50 scientists and students still analyze dozens of terabytes of data collected from 50 storms – both tornado and non-tornado. These data are also being used in computerized tornado simulations. These applications are being run on desktops connected to grids, computer clusters and high-performance computers.
This data is being incorporated into research studies to help make forecasting more accurate: “Our goal is to learn how tornadoes form so future forecasters can issue warnings 20, 30, or 40 minutes ahead, with false alarm rates below 50%,” said Wurman.
Before 2003, human surveillance was the first line of defence against forest fires in Croatia. The method was often unreliable, and by the time a fire was identified, much of the forest was already in flames.
“In summer seasons, seven coastal counties in Croatia, and in particular the Adriatic islands, are permanently exposed to high to very high fire risk. In 2003, wildfires occurred 130 times. The direct and indirect damage in Split-Dalmatia County was assessed at €16 million and €60 million respectively,” a Croatian engineering team from the University of Split wrote in their research paper.
As there is no smoke without fire, their solution was the iForestFire system, an automated network with software designed to capture the initial signs of fire. Remote-controlled video cameras at key locations record in the visible-light and infrared spectrum.
iForestFire combines this footage with meteorological and historical-geographical data. The system comprises a step-by-step process of searching for, identifying, and escalating forest-fire risks. Users can view observed forest areas through a Web information system in a browser, remotely monitoring risks and alerting the fire brigade quickly if needed. Or the system can run completely automated, with intelligent software agents responsible for image collection, storage, sensor testing and alarm generation.
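A step-by-step escalation of this kind could be sketched as below. This is a hypothetical simplification: the level names, thresholds, and the consecutive-frame confirmation rule are illustrative, not taken from the actual iForestFire software.

```python
RISK_LEVELS = ["clear", "suspect", "confirmed", "alarm"]

def escalate(scores, suspect_at=0.3, confirm_at=0.6, needed=3):
    """Walk a stream of per-frame smoke scores (0..1) and return the
    final risk level, requiring `needed` consecutive high-scoring
    frames before alarming, to suppress false positives."""
    level, streak = 0, 0
    for s in scores:
        if s >= confirm_at:
            streak += 1
            level = max(level, 2)
            if streak >= needed:
                return "alarm"
        elif s >= suspect_at:
            level = max(level, 1)
            streak = 0
        else:
            streak = 0
    return RISK_LEVELS[level]

state = escalate([0.1, 0.4, 0.7, 0.8, 0.9])  # -> "alarm"
```

The point of the consecutive-frame rule is the same trade-off any automated fire watch faces: a single smoky-looking frame (dust, fog, glare) should raise suspicion, but only a persistent detection should wake the fire brigade.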
They successfully field tested their network in 2005 and 2006 with the Croatian Fire Brigade, setting up three prototype monitoring stations and two operation centers with choreographed fires set in predefined areas. Now, iForestFire is used as a forest fire monitoring system in various Croatian regions and national parks.
Elizabeth Cochran of the University of California, Berkeley, and Carl Christensen of Stanford University manage the Quake Catcher Network, a globally distributed network of hundreds of micro-electromechanical systems (MEMS) sensors, which can detect the first signs of an earthquake.
“We can relatively cheaply and easily get some sort of network out there monitoring places that may not have such extensive and expensive seismometers in their locale,” said Christensen.
It’s a volunteer system: each sensor on the network is a MEMS accelerometer, found in USB devices, the latest smartphones, modern laptops and Nintendo Wii controllers. The Quake Catcher Network uses them as basic sensors to identify ground tremors; they are essential for early detection of an earthquake’s magnitude and the propagation of its seismic waves. With enough sensors in an area, an earthquake early-warning system can be set up.
“It is a low-CPU distributed sensor network,” said Christensen, meaning that powerful computer processing is unnecessary. Although each MEMS sensor on the Quake Catcher Network is not of the highest quality (10 or 12 bits), what they lack in sensitivity they make up for in simplicity. With enough sensors in a given area – for example, 300 – they can function as one large high-resolution seismic detector. At $50 a pop, they are also low-cost, allowing the network to grow. Typical research-grade sensors can cost between $10,000 and $100,000.
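The claim that hundreds of cheap sensors can stand in for one expensive instrument follows from noise averaging: if the sensors’ errors are independent, averaging N of them shrinks the noise floor by roughly the square root of N, so 300 sensors are about 17 times quieter than one. A sketch with illustrative noise figures (not QCN’s actual processing):

```python
import random

random.seed(0)  # make the sketch reproducible

def averaged_noise(n_sensors, noise=0.05, trials=2000):
    """Standard deviation of the averaged reading from `n_sensors`
    noisy MEMS accelerometers, each with a `noise` error floor (in g).
    The true signal is common to all sensors, so only the noise is
    simulated here; its spread is what averaging reduces."""
    estimates = []
    for _ in range(trials):
        readings = [random.gauss(0.0, noise) for _ in range(n_sensors)]
        estimates.append(sum(readings) / n_sensors)
    mean = sum(estimates) / trials
    var = sum((e - mean) ** 2 for e in estimates) / trials
    return var ** 0.5

one = averaged_noise(1)      # a single cheap sensor's noise floor
many = averaged_noise(300)   # 300 combined: roughly sqrt(300) ~ 17x quieter
```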
The Quake Catcher Network has already proved its worth. On 4 September 2010, a 7.1-magnitude earthquake hit Christchurch, New Zealand. The Quake Catcher Network had placed sensors in the area before the earthquake struck. The first quake was detected within three to five seconds of the trigger data packet arriving at their location at Stanford.
The Quake Catcher Network team are now ‘ramping up’. Cochran said they are ordering higher-quality MEMS sensors – 16 to 24 bit – which will improve the network’s sensitivity to ground tremors and will be ready in a few months. They are also setting up the Rapid Aftershock Mobilization Program (RAMP) for post-earthquake analysis.