

April 4, 2007

Feature - NYS Grid: Spreading the Word

Images of a cholestosome model, generated with NYS Grid resources. Image courtesy of Mary McCourt, Niagara University.

Last summer a collection of universities and labs in New York State gathered with the goal of creating a cyberinfrastructure initiative to make it easy for regional scientists to manage data and use imaging software. This seven-month-old cyberinfrastructure, with New York State Grid as its foundation, may be in its infancy, but it is growing, with lofty goals. “We are trying hard to get users in New York State to take the plunge and use the Grid,” says Russ Miller, professor at SUNY-Buffalo and executive director of the cyberinfrastructure initiative. NYS Grid uses the Open Science Grid software stack; Miller and his team are developing middleware to run on top of OSG software to support applications in biomedical computing, molecular structure studies and structural biology. Mary McCourt, Niagara University’s chemistry department chairperson, uses NYS Grid in her

April 4, 2007

Feature - Open Science Grid Workshop in Argentina

Participants in a recent Open Science Grid workshop held in Argentina. Image courtesy of Carolina Leon Carri, University of Buenos Aires.

For some Latin American students, a recent workshop held in Santa Fe, Argentina, may have been an important step towards a lifetime of working with grids. The goal of the “Hands-on Workshop in Grid Computing,” presented by educators from the Open Science Grid and the University of Buenos Aires, March 12–14, was to give young scientists with almost no background in distributed computing the ability to use grids in their research. The workshop, part of a two-week-long information technology school organized by the National University of the Littoral in Santa Fe, hosted 20 students from universities all over Argentina. “The students were interested and very capable,” said Ben Clifford, OSG science education specialist, about his trainees. “The group included a nice spectrum, from people interested in applications to t

March 28, 2007

Feature - OSG’s Grid Operations Center: Ready to Help

In the world of grids, people in grid operations are the equivalent of firefighters. Stock image from www.sxc.hu.

Let’s say you are an Open Science Grid user. At 3:20 in the morning, struck by a brilliant thought, you decide to submit a job to the OSG grid, but don't know which site to run your calculations on. A 24-hour monitoring service run by OSG’s Grid Operations Center solves the problem, helping users decide where to send their jobs by showing which sites are available. At 3:26 a.m. you come across another problem: you are having trouble getting a proxy. The Grid Operations Center is ready to help you again. “Users need a central place to go to for simple troubleshooting,” says Rob Quick, Senior Analyst and Programmer at the Grid Operations Center, who has worked there for four years. “At the GOC, users have 24/7 phone and e-mail support.” The GOC can often fix problems quickly. For complicated problems, GOC staffers will

February 28, 2007

Feature: Worldwide Grids, Worldwide Science

San Francisco hosted the 2007 meeting of the American Association for the Advancement of Science.

Scientists at last week’s meeting of the American Association for the Advancement of Science in San Francisco discussed the use of grid technologies and volunteer computing to fight disease, predict earthquake effects and hazardous weather conditions, understand the origins of the universe, and decode our own behavior. “Science is distributed because life is distributed,” said TeraGrid Director Charlie Catlett in one of three sessions devoted to grid computing. “The Internet is worldwide, people are putting computers and storage facilities on the Net. The question is not whether you want a distributed infrastructure, but whether you want to use that infrastructure to do science.” During the grid-focused sessions, attendees learned about the global growth of distributed computing infrastructures and their use by scientists around the world.

February 28, 2007

Link of the Week - nanoHUB.org

Structure of a lipid bilayer computed by nanoGromacs, of nanoHUB.org. Courtesy of nanoHUB.org.

nanoHUB.org offers free online simulation tools and educational material for teaching and research in nanotechnology. Users can choose from almost 50 simulation tools, allowing them to simulate solar cells, networks of carbon nanotubes and nano-transistors, among many other structures. The tools appear to run as applets in a browser window, but they are powered by a remote HUB infrastructure. No software is installed on or uploaded to a user's computer. In order to run simulations, you must log in with a nanoHUB username and password. Initial testing is currently under way to enable end users to execute the most processing-intensive tools on both Open Science Grid and TeraGrid resources. In addition to online simulation tools, nanoHUB also hosts online classes and seminars on subjects including quantum transport, nanoelectronics and nanophotonics. nanoHUB is supported by the National Science Fo

February 21, 2007

Feature - Making the Earth Move

The warm colors leading from the fault region into the Los Angeles area illustrate how a chain of sedimentary basins can guide earthquake waves away from the fault where they originate into heavily populated regions in Southern California. Image courtesy of Amit Chourasia, San Diego Supercomputer Center.

An ambitious group of more than 40 institutions, together called the Southern California Earthquake Center, is building earthquake modeling capabilities to transform seismology into a predictive science similar to weather forecasting. To bring that vision to life, SCEC has built a set of grid-based scientific workflow tools. A series of simulations based on these tools—TeraShake 1, TeraShake 2, and the most recent CyberShake—began in 2004. They've run on TeraGrid resources across the country and are already yielding significant results. TeraShake 2, for example, simulated a series of earthquakes along the San Andreas Fault. Run in concert at NCSA and SDSC, it revealed a striking

February 14, 2007

Feature: LEADing Weather Research

This spring LEAD will launch fine-scale forecasts automatically in response to tornado-watch conditions. This experiment, the first of its kind, will cover the southern Great Plains during tornado season.

Severe storms in the United States take the lives of hundreds of people and cause more than 13 billion dollars in damage every year. Researchers who seek to improve storm forecasting, and to lessen some of the social cost inflicted annually, are hampered by the complexity of their task. The atmospheric models and data collection tools used today run essentially independently of weather conditions; they do not respond to rapid changes as they occur. “Could we do a better job forecasting and understanding the weather if we adapted to the weather as it evolves?” asks Kelvin Droegemeier, the project director of Linked Environments for Atmospheric Discovery, which explores that very question. Currently, weather models run in a static mode—independent of changes in temperature, wind

February 14, 2007

Feature: Unlocking Secrets of the Heart

A volumetric heart model created from a surface model received from New York University. Image courtesy of the Computational Visualization Center, University of Texas at Austin.

Mathematical models of hearts may not be the most romantic of things, but to patients at risk for vascular disease they hold much more promise than a box of chocolates. Scientists from the University of Texas at Austin combine images of people’s hearts and arteries with mathematical modeling techniques to simulate the interaction between blood flow and the walls of the heart and arteries. The goal of the project is to help physicians better predict the onset of vascular disease and evaluate treatment plans for affected patients. “Our work is focused on the development of patient-specific models. We use material- and flow-related data from the literature, but by imaging a person, we capture his or her geometry,” says Victor Calo, a postdoctoral fellow in the team from the Institute for Computational Engin

February 7, 2007

Feature: Geneticists’ Gateway to the Grid

Circular representation of the Shewanella oneidensis genome. Adapted by permission from Macmillan Publishers Ltd: Nature Biotechnology 2002 Nov; 20(11):1118-23, copyright 2002.

In many biological disciplines, and particularly in the field of genetics, the answers to scientists’ questions are buried under mountains of complex data. In genome sequencing—ordering the billions of chemical building blocks that make up the genetic code of a cell—a newly discovered genome is compared to vast databases of well-known and publicly available genomes. Comparing new genomes to huge, ever-growing databases has become a task that is larger than one computer, even a supercomputer, can handle. Grid computing has stepped up to meet the challenge, with the help of the Genome Analysis Database Update tool. GADU, which creates workflows, runs them on Open Science Grid and TeraGrid, and stores the output, is a backend for applications used by geneticists for tasks ranging from biomedica
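Genome comparison suits grids so well because each query sequence can be scored against the reference database independently, so the work splits cleanly into separate jobs. The sketch below is a hypothetical illustration of that partitioning, using a toy k-mer overlap score; it is not GADU's actual workflow code, and every name and function in it is invented for illustration.

```python
# Hypothetical sketch: why genome comparison parallelizes on a grid.
# Each chunk of query sequences is an independent job against the
# shared reference database; results merge trivially at the end.

def chunk(queries, n_jobs):
    """Partition the query list into n_jobs roughly equal, independent chunks."""
    return [queries[i::n_jobs] for i in range(n_jobs)]

def toy_score(query, reference, k=3):
    """Toy similarity: count length-k substrings of query found in reference."""
    kmers = {reference[i:i + k] for i in range(len(reference) - k + 1)}
    return sum(query[i:i + k] in kmers for i in range(len(query) - k + 1))

def run_job(job_queries, reference):
    """One 'grid job': score its chunk of queries against the database."""
    return {q: toy_score(q, reference) for q in job_queries}

reference = "ATGGCGTACGTTAGC"               # stand-in for a genome database
queries = ["ATGGCG", "TACGTT", "CCCCCC", "GTTAGC"]

# On a real grid each chunk would be submitted as a separate job;
# here the two "jobs" run in sequence and their outputs are merged.
results = {}
for job_queries in chunk(queries, 2):
    results.update(run_job(job_queries, reference))
```

Because no job depends on another's output, adding more grid sites simply means more chunks running at once, which is why ever-growing databases remain tractable.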

February 7, 2007

Feature: San Diego Supercomputer Experts Help Navajos Build “An Internet to the Hogan”

Leonard Tsosie (center), a Navajo senator and a leader of the Internet to the Hogan Project, uses a laptop to explain the project's benefits to friends at their traditional dwelling, or Hogan. Supercomputer experts from UC San Diego will help end the "digital divide" for many in the Navajo Nation in the Southwest. Image courtesy of SDSC.

Navajos in the American Southwest, many of whom have never had access to a personal telephone, will soon make a significant leap into the Internet Age, thanks in part to resources and expertise provided by the San Diego Supercomputer Center at the University of California, San Diego. The Navajos, who refer to themselves as the “Dine” (dee-nay), celebrated “An Internet to the Hogan and Dine Grid Event” on Monday, January 29, at Navajo Technical College in Crownpoint, New Mexico. Highlights of the event included their official acceptance of a “Little Fe”

February 7, 2007

Image of the Week: The First Sources of Light

Image courtesy of the Texas Advanced Computing Center.

Understanding the nature of the first stars and galaxies, which formed a few hundred million years after the Big Bang, is at the frontier of modern cosmology. They lie just beyond the horizon of what is currently observable. NASA is preparing to launch the James Webb Space Telescope to replace the Hubble; this new telescope will be able to observe these early stars and galaxies. In preparing for this key upcoming mission, it is important to predict the properties of the first sources of light. Volker Bromm’s astronomy research group at The University of Texas at Austin simulates these first stars using TeraGrid computers at the Texas Advanced Computing Center. This picture, produced by Paul Navratil of the Visualization and Data Analysis group at TACC using additional TeraGrid resources, shows how one of the first stars creates a bubble of high-energy, ionizing photons, thereby beginning the process of transforming the primordial universe in

January 31, 2007

Feature: Caltech and TeraGrid See the Big Picture

The Samuel Oschin telescope at the Palomar Observatory in California. Image courtesy of Palomar Observatory.

The Griffith Observatory in Los Angeles recently unveiled a spectacular new display during its grand re-opening. The “Big Picture,” an image of the Virgo Galaxy Cluster 150 feet wide and 18 feet high, recorded for posterity on porcelain-enameled steel tiles, was created using data from the Palomar-QUEST sky surveys that were processed using TeraGrid resources. “The observatory came up with this idea five or six years ago, that a single image of the sky would cover the main wall of their new exhibit hall,” says Caltech astronomer George Djorgovski. “Palomar-QUEST surveys 1.25% of the entire sky each night, and covers the same areas multiple times. By the end of the survey we’ll cover 15,000 square degrees, or 40% of the entire sky.” Even the Big Picture, huge by

January 24, 2007

Feature: The Fuel Cell Cometh

Possible reaction pathway for the oxygen reduction reaction on a catalytic surface. First the oxygen molecule dissociates, then there are two successive proton additions. The first forms hydroxyl, and the second forms water. Image courtesy of Manos Mavrikakis, University of Wisconsin-Madison.

Hydrogen often wears the black hat when we talk about the prospect of everyday fuel cell use. It is difficult to transport and store. Alternative means of getting it generate their own problems. And the hydrogen they produce is less than pure, and thus less efficient. But oxygen, which is the other necessary reactant in many of today's fuel cell designs, is something of a villain in its own right. Splitting oxygen molecules into oxygen atoms and the subsequent formation of water is currently the rate-limiting step, the reaction that restricts overall p
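The pathway described in the caption can be written as elementary surface steps. This is the standard textbook sketch of the dissociative oxygen reduction mechanism (the asterisk marks a site on the catalyst surface), not notation taken from the article itself:

```latex
\begin{align*}
\mathrm{O_2} + 2{*} &\longrightarrow 2\,\mathrm{O}^{*} && \text{dissociation of the oxygen molecule}\\
\mathrm{O}^{*} + \mathrm{H}^{+} + e^{-} &\longrightarrow \mathrm{OH}^{*} && \text{first proton addition: hydroxyl}\\
\mathrm{OH}^{*} + \mathrm{H}^{+} + e^{-} &\longrightarrow \mathrm{H_2O} + {*} && \text{second proton addition: water}
\end{align*}
```

Taking the first step once and each proton-addition step twice recovers the overall reaction O2 + 4H+ + 4e- → 2H2O, with the surface sites returned at the end.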

December 20, 2006

Feature: In Search of the Subtle and Rare

The CDF detector rolling out of the collision hall after discovering the top quark. Image courtesy of Fermilab.

Three trillion times per second — that's how fast quarks in the B sub s (Bs) particle “oscillate,” or switch between their matter and antimatter states, according to scientists from the Collider Detector at Fermilab collaboration. The CDF physicists measured this rapid oscillation with the help of the world's most powerful particle accelerator, Fermilab’s Tevatron, unprecedented computing power made available through the Open Science Grid and the LHC Computing Grid, and a healthy dose of ingenuity. “Bs oscillation is a very subtle and rapid effect,” says Jacobo Konigsberg from the University of Florida, co-spokesperson for the CDF collaboration. “It's astonishing that we can measure it at all.” Astonishing as it may be, CDF routinely measures phenomena both subtle and rare. Recently, CDF researchers caught the rare and elusive Sigma s

December 20, 2006

Link of the Week: Genome Comparison on the World Community Grid

Image credit: World Community Grid

The World Community Grid, which uses public computing resources to tackle projects that benefit humanity, launched its first cooperative project with South American scientists in November. The Genome Comparison project, a collaboration between World Community Grid and the Oswaldo Cruz Institute, Fiocruz, Brazil, will compare genomic information to improve the quality and interpretation of biological data and our understanding of biological systems, host-pathogen and environmental interactions. This information can play a critical role in the development of better drugs and vaccines and improved diagnostic procedures.

December 13, 2006

Feature: A Fair Shake for Seismologists

Instantaneous surface velocity 75 seconds after the earthquake origin time. The warm colors leading from the fault region into the Los Angeles area illustrate how a chain of sedimentary basins can guide earthquake waves away from the fault where they originate into heavily populated regions in Southern California.

In the Midwestern United States, in the spring, there are weeks when you can't get through an episode of your favorite television show without an alert telling you that your county is under a tornado watch or, more invasive still, the local meteorologist interrupting programming to tell you to head to your basement. It saves lives, and it represents an incredible amount of simulation and data collection — even if you do have to scurry to the Web later to find out how “Lost” ended that week.
There may never be an equivalent for temblors, a local “earthquake man” breaki

December 13, 2006

Feature: Bringing Arts and Humanities into the Grid Image from a virtual gallery created with the CITRIS Collaborative Gallery Builder, a project associated with HASTAC. Image courtesy CITRIS and the University of California Since 2003, the HASTAC consortium has worked toward the novel objective of developing software and hardware solutions for the worlds of the arts and humanities. HASTAC — which stands for Humanities, Arts, Science and Technology Advanced Collaboratory — also advocates the inclusion of thought on the social, ethical and access issues of technology in parallel with its creation. “The idea was for humanities professors to look at ways to incorporate technologies like grid computing into their research,” says HASTAC Project Leader Jonathan Tarr. “They needed to save humanities from becoming a group of scholars who only work on physical text and weren’t going a

December 13, 2006

Image of the Week: Using WestGrid for Earth Sciences

Composite seismic cross-section of an area near southern Vancouver Island. Image courtesy of Andrew Calvert, Simon Fraser University.

Andrew Calvert, a professor of earth sciences at Simon Fraser University in Vancouver, uses the computing power of WestGrid for his seismogram modeling and simulations, and to process large volumes of imaging data. WestGrid, the Western Canada Research Grid, is a $50 million project to operate a high-performance computing, collaboration and visualization infrastructure across western Canada. WestGrid encompasses 14 partner institutions across four provinces and was the first provider in Canada to adopt a grid-enabled system for its resources. This image shows a composite seismic cross-section across the Cascadia forearc near southern Vancouver Island, superimposed on a display of P-wave velocities and relocated earthquakes.

December 6, 2006

Feature: An Immense Database of Indispensable Materials

Example of a zeolite crystal.

Without zeolites your car wouldn't run; every molecule of gasoline burned in your car was refined using these crystalline microporous materials. Gas prices being what they are, you might have a love-hate relationship with your car, but consider this: Without zeolites, the big-brains of this world would have to come up with new ways to produce everything from medical-grade oxygen to laundry detergent to asphalt.
Because zeolites are so important to industry, and because new ones with novel properties are in constant demand, researchers would greatly benefit from a database of hypothetical zeolite structures. This database would show designers of industrial applications and chemicals possible zeolites that are thermodynamically accessible, and that might hold promising structural and functional properties.
Michael Deem and David Ea

November 29, 2006

Feature: The Portal Provider
Eric Roberts

Eric Roberts knows portals. As a staff member at the Texas Advanced Computing Center, he has spent the last five years developing, testing, redeveloping and refining them – first as a member and project manager of the GridPort toolkit team, and now as lead developer for the TeraGrid User Portal.
“We launched the TeraGrid User Portal on May 15, and since then things have gotten very busy for us,” says Roberts.
The aim of a grid portal is to simplify the use of grids for everyday users. Today’s TeraGrid User Portal provides a repository of information for users, including a system monitor for all TeraGrid resources, information on data collections and documentation. It also provides account management functions, allowing users to see their accounts on different resources, and to manage users and accounts for their projects. The current functionality, however, is only the foundation for what

November 22, 2006

Feature: A Decade of Globus in Science
TeraShake 2 simulation of a magnitude 7.7 earthquake, created by scientists at the Southern California Earthquake Center and the San Diego Supercomputer Center. Simulation: SCEC scientists Kim Olsen, Steven Day, SDSU, et al.; Yifeng Cui et al., SDSC/UCSD. Visualization: Amit Chourasia, SDSC/UCSD.
Simulating tens of thousands of possible earthquakes shaking Los Angeles, blood flow through realistic human arteries, or the effect of radiation treatment on cancerous tumors. Searching for new subatomic particles, predicting severe storms and hurricanes, or studying supernovae observed from many different telescopes.
Over the past ten years, grid computing and the Globus Toolkit have made these scientific research projects – and hundreds more like them – easier, faster, and in some cases possible for the very first time.
The first funding for work on Globus was granted in August 1996 by the U.S. Defens

November 16, 2006

Image of the Week: Spallation Neutron Source

The Spallation Neutron Source accumulator ring. Image courtesy of Oak Ridge National Laboratory, managed for the U.S. Dept. of Energy by UT-Battelle, LLC.

Neutron scattering is used by many scientific disciplines to determine the structure and dynamics of matter. The $1.4 billion Spallation Neutron Source was completed at Oak Ridge National Laboratory in May of this year. When it reaches full power, the SNS will provide the most intense pulsed neutron beams in the world for scientific research and industrial development using neutron scattering. With more intense beams and new experimental technologies comes an explosion of neutron scattering data. With the first experiments set to be carried out at the SNS by the end of 2006, researchers are developing a Neutron Science Instrument Gateway to the TeraGrid. This gateway will include tools for analysis, visualization, and instrument simulation, as well as data hosting,