
Five ways to anticipate natural disasters

In 2011, 302 natural disasters were recorded, which claimed 29,782 lives and caused $366 billion (€295 billion) in economic damage. Today, five promising research projects are using the latest computational methods to predict or forecast – depending on your point of view – disastrous events or their effects.

1) Predicting flash flood uncertainty

Image of 3D model predicting the precipitation that generated the Genoa 4 November 2011 flash flood.

A very high-resolution (1 km) 3D numerical model predicting the precipitation (all liquid or solid aqueous particles that originate in the atmosphere and fall to the Earth's surface) that generated the Genoa flash flood of 4 November 2011. Orange marks the isovolume for rainwater, gray the isovolume for graupel (soft hail or snow pellets), green the isovolume for snow, and purple the isovolume for water vapor. Top image courtesy Nicola Rebora. Main image courtesy Wikimedia Commons.

Intense rain can cause flash floods within minutes, and the powerful flows of water can trigger landslides. According to the US National Weather Service, more people die from floods than from lightning, tornadoes, or hurricanes. EC-funded researchers of the Distributed Research Infrastructure for Hydro-Meteorology (DRIHM) use distributed computing and citizen scientists to predict flash floods.

DRIHM researchers build models from observations to work out the average rainfall in a given area: 100 square kilometers (39 square miles), for example. Then they work out the probability of how much rain will fall within a ‘catchment zone’ of a few kilometers, to identify likely flash floods.
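As a rough illustration of the idea, and not DRIHM’s actual model, the Python sketch below turns a coarse area-average rainfall forecast into an exceedance probability for a small catchment by generating stochastic downscaling realizations. The rainfall values, flood threshold, and variability parameter are all assumed for the example.

import numpy as np

rng = np.random.default_rng(seed=0)

area_mean_rain_mm = 80.0    # coarse-model average over ~100 km^2 (assumed)
flood_threshold_mm = 150.0  # catchment rainfall that would trigger a flash flood (assumed)
n_realizations = 10_000     # size of the stochastic downscaling ensemble

# Represent sub-grid variability with a lognormal multiplier, normalized so
# that the ensemble mean over the catchment matches the coarse area average.
sigma = 0.6
multiplier = rng.lognormal(mean=0.0, sigma=sigma, size=n_realizations)
multiplier /= np.exp(sigma**2 / 2)
catchment_rain_mm = area_mean_rain_mm * multiplier

prob_flood = float(np.mean(catchment_rain_mm > flood_threshold_mm))
print(f"Estimated flash-flood probability: {prob_flood:.1%}")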

“Our goal is to develop a user-driven research infrastructure for hydro-meteorologists, using grids and high-performance computers,” said Nicola Rebora of the CIMA research foundation, Italy, and deputy coordinator of DRIHM. 

A problem when trying to predict flash floods is combining various models. “In order to combine meteorological prediction with hydro-meteorological modeling, we need to downscale or increase the resolutions. If we don’t use the right resolution, the flood prediction will be underestimated, which could cost lives,” Rebora said.

Flash floods can be predicted, but with uncertainty. Rebora aims to quantify this uncertainty. Citizen scientists play a crucial role in this process.

“Citizen scientists can help us design requirements of our infrastructure as they will also be its users,” Rebora said. DRIHM researchers work with citizen scientist networks in Italy, and the US, and collaborate through an online user forum. “It’s a mutual benefit; citizens give their data to us, and they get back computational power and knowledge.”

The DRIHM project helped in the Cinque Terre (25 October 2011) and Genoa (4 November 2011) flash floods. Rebora said, “In both cases, a timely alert was issued more than 12 to 24 hours in advance.”

2) Spotting tornadoes through supercells

2011 ranks as the fourth-deadliest tornado year in US history: 1,691 tornadoes killed hundreds of people. This year, researchers from the US National Oceanic and Atmospheric Administration (NOAA) are using the latest computer models and high-performance computers to set up an early warning system.

Image of a tornado formation simulation.

Image showing near-surface vorticity (instantaneous rotation), pressure, and wind in the same simulation at the same time. The large drop in pressure (indicated by black lines) and rise in vorticity (color) confirm a tornado forming in the lower-left region of the simulation. Image courtesy Amy McGovern.

The storm-scale ensemble forecast system aims to predict the probability of severe storms every hour over the subsequent one-to-two-day period. The ensemble is a set of numerical models with horizontal grid spacings of one to four kilometers (0.6 to 2.5 miles); run together, the members yield probabilistic estimates of severe-weather outcomes.
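The step from ensemble members to a probability map can be pictured with the short sketch below: at each grid point it counts the fraction of members whose simulated updraft helicity exceeds a threshold. The field, threshold, ensemble size, and grid dimensions are illustrative assumptions, not the actual NOAA configuration.

import numpy as np

rng = np.random.default_rng(1)
n_members, ny, nx = 20, 120, 160  # ensemble size and grid dimensions (assumed)

# Synthetic stand-in for each member's simulated updraft-helicity field.
helicity = rng.gamma(shape=2.0, scale=20.0, size=(n_members, ny, nx))

threshold = 75.0  # m^2/s^2, indicative of rotating storms (assumed)
probability = (helicity > threshold).mean(axis=0)  # fraction of members, per grid point

print("Maximum severe-storm probability on the grid:", probability.max())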

For accurate tornado forecasts, researchers need model resolutions of 50 to 75 meters (164 to 246 feet). Instead of drilling down to this detail, they can understand tornado formation by studying supercell thunderstorms at scales of around 10 kilometers (six miles). These supercells are used as a proxy to predict the probability of tornado formation.

“At these scales, we are not explicitly forecasting tornadoes but are in the range of supercells and other thunderstorm structures that produce severe weather, e.g. large hail and high winds,” said Greg Carbin, a researcher at NOAA.

To get accurate predictions, researchers such as Amy McGovern from the University of Oklahoma simulate 100 to 150 storms. She then extracts metadata about winds blowing upward and downward, and other variables, to identify tornado-forming signatures. McGovern runs these high-resolution simulations on software that uses the XSEDE high-performance computing network.

“We look for storm characteristics that generate tornadoes, which don't appear in the storms that don't generate tornadoes, and vice versa. We build a forest of decision trees to make predictions,” McGovern said.
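That “forest of decision trees” is the idea behind the random-forest sketch below, which trains on per-storm features to separate tornadic from non-tornadic storms. The features and labels here are synthetic stand-ins rather than McGovern’s data, so the code is only a minimal illustration of the approach.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_storms = 150                                # roughly the number of simulated storms
updraft = rng.normal(40, 10, n_storms)        # updraft speed (m/s, synthetic)
vorticity = rng.normal(0.02, 0.01, n_storms)  # near-surface vorticity (1/s, synthetic)
pressure_drop = rng.normal(5, 2, n_storms)    # surface pressure drop (hPa, synthetic)
X = np.column_stack([updraft, vorticity, pressure_drop])

# Synthetic labels: storms with strong rotation and a deep pressure drop
# are more likely to be tornadic in this toy example.
y = (vorticity + 0.002 * pressure_drop + rng.normal(0, 0.005, n_storms)) > 0.03

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_train, y_train)
print("Held-out accuracy:", forest.score(X_test, y_test))
print("Feature importances:", forest.feature_importances_)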

Now, the main challenge is developing methods that help forecasters digest and interpret the large amounts of incoming data. “We need advances in expert systems that help meteorologists separate signal from noise,” Carbin said.

Human forecasters excel at pattern recognition and can be better than computers at this task. “Pattern recognition and ‘gut instinct’ can be critical to making an accurate short-term forecast,” Carbin said. “Until this very human capability can be incorporated into autonomous systems, humans will probably continue to play a significant role in ‘making the call’ on tornado warnings.” 

3) Replicating the behavior of fire

Image of 3D visualization of automobile fire tests.

SmokeView tool visualization of Fire Dynamics Simulator computational results. This image shows automobile fire tests done by Hluchý's institute, which included engine compartment fires and passenger compartment fires. Image courtesy Ladislav Hluchý.

An outbreak of fire in a family home, power plant, or forest can lead to loss of life, and costly environmental and economic damage. For example, in Europe forest fires burn on average 5,000 square kilometers (1,931 square miles) every year.

The Slovakian National Grid Infrastructure has run a six-month virtual team fire simulation project, as part of the European Grid Infrastructure (EGI), to share computational resources and to develop more accurate fire and smoke simulations that predict how fires behave.

The virtual project brought together European fire-research expertise, including Slovakian, Spanish, and Portuguese researchers.

When simulating fires in tunnels, for example, the researchers used a distributed-computing version of the Fire Dynamics Simulator (FDS), a numerical software tool developed by the US National Institute of Standards and Technology.

“We use a tool called SmokeView to visualize the FDS computation results, such as the fire scenario geometry, spread of flames and smoke, temperature curves, and other physical quantities describing fire development,” said Ladislav Hluchý, director of the Institute of Informatics at the Slovak Academy of Sciences.

With access to parallel computing resources, Hluchý and his team ran more simulation experiments, cut the time needed to simulate a fire, and produced more realistic models.

“The time it took to do a calculation for a 180-meter-long road tunnel sequentially was 377 hours,” said Hluchý. “With a cluster at our institute, it took 33 hours.” These fire-behavior results increase the realism of the fire-reconstruction scenarios used to train firefighters, and help policy makers create legislation to reduce both natural and man-made risks.
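That is roughly an eleven-fold speedup, and the pattern behind it is simply running independent fire scenarios side by side rather than one after another. The sketch below shows the general idea with a hypothetical run_scenario placeholder standing in for launching a real solver such as FDS; it is not the project’s actual workflow.

from concurrent.futures import ProcessPoolExecutor

def run_scenario(ventilation_m3_per_s: float) -> float:
    """Placeholder for one fire simulation; returns a peak temperature (deg C)."""
    # In practice this would launch the solver for one scenario and
    # parse the temperature curves from its output files.
    return 400.0 + 15.0 * ventilation_m3_per_s

if __name__ == "__main__":
    scenarios = [10.0, 20.0, 40.0, 80.0]  # assumed tunnel ventilation rates
    with ProcessPoolExecutor(max_workers=4) as pool:
        peak_temps = list(pool.map(run_scenario, scenarios))
    for rate, temp in zip(scenarios, peak_temps):
        print(f"ventilation {rate:5.1f} m^3/s -> peak {temp:.0f} deg C")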

Hluchý said, “On a national level there is a high level of acceptance of our research results. We were asked by the highest fire authorities to prepare a computer simulation of family house fires for court-of-law purposes.” The project finished on June 30; a questionnaire has been designed to help define the simulation services that fire simulation experts from 12 countries need on EGI. Input has been received from the three National Grid Infrastructures in Spain, Portugal, and Slovakia.

4) Forecasting undersea eruptions

Current estimates are that 80% of the planet’s volcanic eruptions happen underwater. Even in the depths of the ocean, a volcanic vent or fissure can spew large amounts of planet-warming CO2, create tsunamis, or generate deafening noise that can harm marine life.

In 2011, Bob Dziak, a marine geologist at NOAA, US, and a team of researchers forecast the eruption of Axial Seamount, off the Oregon coast. It was the first time an accurate eruption forecast had been made for an underwater volcano.

Spider crab inspects an ocean-bottom hydrophone as it sits on the seafloor at Axial Seamount.

A spider crab inspects an ocean-bottom hydrophone (OBH) as it sits on the seafloor at Axial Seamount before the 2011 eruption. The OBH is a monitoring instrument designed to detect undersea earthquakes. The chain is connected to flotation above the view of the photo. Image courtesy Bill Chadwick, Oregon State University.

“Our long-term forecast methods are accurate to within months to years. Our short-term seismic methods are accurate to within a few days to hours once we see a volcanic event begin,” Dziak said. These techniques are the most accurate currently available to the ocean science research community. “Another amazing aspect of the story was that even though one of my seafloor hydrophones was buried under 30 cm (12 inches) of lava, its electronics kept functioning. It was still able to communicate with the research vessel on the sea surface.”

To make long-term forecasts, the researchers analyzed data collected from autonomous hydrophones located near Axial Seamount, using an open-source analysis tool written in Interactive Data Language, which turns complex numerical data into understandable visualizations. This software helped the researchers identify seafloor pressure measurements indicating that the volcano was inflating.
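The long-term approach can be pictured with the sketch below, written in Python rather than IDL and not the team’s actual tool: it fits a linear trend to seafloor depth derived from bottom-pressure records to estimate how fast the volcano is inflating. The data and uplift rate are synthetic, chosen only to illustrate the calculation.

import numpy as np

rng = np.random.default_rng(3)
days = np.arange(0, 3 * 365)     # roughly three years of daily samples (assumed)
true_uplift_m_per_yr = 0.15      # assumed inflation rate for the synthetic data
depth_m = (1500.0
           - true_uplift_m_per_yr * days / 365.0
           + rng.normal(0, 0.02, days.size))  # noisy seafloor depth from pressure records

# Least-squares linear fit: the slope is the change in depth per day,
# so uplift (inflation) is its negative, converted to meters per year.
slope_per_day, intercept = np.polyfit(days, depth_m, 1)
uplift_m_per_yr = -slope_per_day * 365.0
print(f"Estimated inflation rate: {uplift_m_per_yr:.2f} m/yr")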

Short-term forecasts require around 40 gigabytes of hydrophone data per year; long-term forecasts require around 400 gigabytes, covering 10 years of hydrophone and volcanic pressure data.

Their latest studies show that they can identify key signatures about two hours before an eruption. “It is not clear if our observations are unique to Axial Seamount or can be replicated at other undersea volcanoes worldwide,” Dziak said.

Now, planning is underway to install fiber-optic cables that will increase the sensitivity of the forecasts. The installation will be complete in a few years and will provide real-time video as well as chemical, temperature, and biological data. Dziak said, “My biggest challenge is securing additional funds to put more seismic and hydrophone sensors on the seamount, and to see how the volcano responds after an eruption for more accurate future forecasts.”

5) Foreseeing earthquake damage

Skarlatoudis performs seismic wave propagation simulations for the broader area of Thessaloniki in Greece. Image courtesy Andreas Skarlatoudis.

According to the Global Earthquake Model foundation, over half a million people died in the last decade due to earthquakes. It’s very hard, if not impossible, to make an accurate short-term earthquake prediction, said Andreas Skarlatoudis, a seismologist at the Geophysical Laboratory, University of Thessaloniki, in Greece. Skarlatoudis works with other seismologists and Hellasgrid IT experts to predict which areas will receive high levels of ‘ground tremors’ from earthquakes.

“Deeper knowledge of ground response to an earthquake and its impact will help us build safer buildings,” Skarlatoudis said.

He requires accurate earthquake data, such as location, magnitude, and fault type, plus a geophysical model of the area. Skarlatoudis then uses simulation software, implemented in the Fortran 90 programming language, to produce synthetic seismograms. The code runs on the grid coordinated by the European Grid Infrastructure. The result is a 3D simulation of earthquake wave propagation, which is used to study the effects on ground motion.
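The general technique behind such simulations can be illustrated with the minimal finite-difference sketch below, written in Python rather than the Fortran 90 production code: it propagates a wave through a uniform 1D medium and records a synthetic ‘seismogram’ at one receiver. A real run solves the full 3D elastic equations over a geophysical model of the basin; every number here is an assumption made for the example.

import numpy as np

nx, nt = 400, 900
dx, dt = 50.0, 0.005     # grid spacing (m) and time step (s), assumed
c = 3000.0               # uniform wave speed (m/s), assumed
C2 = (c * dt / dx) ** 2  # squared Courant number; must stay <= 1 for stability

u_prev = np.zeros(nx)  # wavefield at the previous time step
u = np.zeros(nx)       # wavefield at the current time step
u[nx // 4] = 1.0       # impulsive "source" near one end of the model
receiver_index = 3 * nx // 4
seismogram = np.zeros(nt)

for n in range(nt):
    u_next = np.zeros(nx)
    # Standard second-order update for the 1D wave equation u_tt = c^2 u_xx.
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + C2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next
    seismogram[n] = u[receiver_index]

print("Peak displacement recorded at the receiver:", np.abs(seismogram).max())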

To evaluate his results, Skarlatoudis compared them with data from the 4 July 1978 earthquake, the largest quake to hit the area in decades. The comparison showed that the method can accurately predict the basic characteristics of ground motion in a metropolitan area and identify the areas that will be hit by the strongest tremors. These models could be used to predict urban vulnerabilities in future earthquakes.

“Fortunately since 1978, my city hasn’t suffered a strong earthquake, so I haven’t had the chance to apply my method for a recent real-life scenario,” Skarlatoudis said. “In Greece, this work is the most advanced. I’m always interested in new research collaborations and cooperating with multidisciplinary scientific fields. Among my future plans is the study of deep earthquakes occurring at depths of 60 to 120 km [37 miles to 75 miles] in the Southern Aegean Sea.”
