

October 21, 2009

 

Image of the week - Molecules CHARMM their way through

Simulation performed on the NGS of a drug permeating through a membrane. Image courtesy Brian Cheney, University of Southampton

You are looking at a computer simulation of how a drug permeates through a membrane.
The permeability of a molecule depends on numerous physical and chemical properties. Brian Cheney and Jonathan Essex, researchers at the University of Southampton, UK, are investigating these properties using a modified version of the molecular dynamics software package CHARMM (Chemistry at HARvard Macromolecular Mechanics), in an attempt to estimate the permeability of molecules.
This is valuable for drug development, because for a drug to be successful, its active molecules must be easily absorbed through the epithelial membrane of the gastrointestinal tract.
The simulations would take several years on a desktop computer for each drug studied, so the researchers rely on the processing power of the NGS, the UK's National Grid Service.
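The article doesn't say how the simulation output is converted into a permeability estimate, but a common approach in membrane permeability studies is the inhomogeneous solubility-diffusion model, which integrates the molecule's free-energy and local-diffusion profiles across the membrane. Here is a minimal sketch, with made-up profiles standing in for real CHARMM output:

```python
# Minimal sketch of the inhomogeneous solubility-diffusion model:
# 1/P = integral over z of exp(dG(z)/kT) / D(z) dz.
# The profiles below are toy placeholders, not CHARMM output.
import math

kT = 0.593  # kcal/mol at ~298 K

def permeability(z, dG, D):
    """Trapezoid-rule integration of the permeation resistance."""
    resistance = 0.0
    for i in range(len(z) - 1):
        f0 = math.exp(dG[i] / kT) / D[i]
        f1 = math.exp(dG[i + 1] / kT) / D[i + 1]
        resistance += 0.5 * (f0 + f1) * (z[i + 1] - z[i])
    return 1.0 / resistance

# Toy profiles across a 40-unit-wide bilayer: a free-energy barrier in
# the hydrophobic core, and a constant local diffusion coefficient.
z = [float(i) for i in range(41)]
dG = [3.0 * math.exp(-((zi - 20.0) / 8.0) ** 2) for zi in z]  # kcal/mol
D = [1.0] * len(z)  # arbitrary units

print(f"estimated permeability: {permeability(z, dG, D):.4f} (toy units)")
```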

October 21, 2009

 

Link of the week - Bid for your grid

GridEcon has developed a marketplace for the grid. Image courtesy Neil Gould, stock.xchng

Ever since grid computing was conceived, there have been visions of one colossal standard grid which is open and simple for anyone to use, just as you can plug into an electrical power grid to get as much electricity as you need on demand.
The dream of making grid computing a universal on-demand utility has yet to be realized. But what if instead of being like a power grid, grid computing resources were available through an open market?
The GridEcon project has developed a platform which turns grid computing resources into commodities which can be bought and sold like stock.
The founding ideology of grids was that computing resources should be shared for free, so a market for those resources may seem to go against that ethic. However, the rising popularity of clouds — once described as…
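The article doesn't detail GridEcon's actual market mechanics, but the idea of trading compute time like a commodity can be sketched as a toy double auction; all prices and quantities below are invented:

```python
# Toy double auction for CPU-hours: match the highest bid against the
# lowest ask and trade at the midpoint price. Purely illustrative; not
# GridEcon's actual matching algorithm.
def match_orders(bids, asks):
    """bids/asks: lists of (price_per_cpu_hour, cpu_hours)."""
    bids = sorted(bids, reverse=True)  # highest bid first
    asks = sorted(asks)                # lowest ask first
    trades = []
    while bids and asks and bids[0][0] >= asks[0][0]:
        (bid_price, bid_qty), (ask_price, ask_qty) = bids[0], asks[0]
        qty = min(bid_qty, ask_qty)
        trades.append(((bid_price + ask_price) / 2, qty))
        bids[0] = (bid_price, bid_qty - qty)
        asks[0] = (ask_price, ask_qty - qty)
        if bids[0][1] == 0:
            bids.pop(0)
        if asks[0][1] == 0:
            asks.pop(0)
    return trades  # list of (clearing_price, cpu_hours)

print(match_orders(bids=[(0.12, 500), (0.10, 200)],
                   asks=[(0.09, 400), (0.11, 300)]))
```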

October 14, 2009

Feature - A new test bed for future cyberinfrastructure

Image courtesy of jaylopez at stock.xchng.

Grid, cluster, and cloud developers will have somewhere new to test their software before letting it loose on the world, thanks to a new initiative called FutureGrid.
“I think people found that it was pretty hard to test early grid software on the machines that were available, because the machines that were available didn’t like being experimented on,” said Geoffrey Fox, principal investigator for FutureGrid. “FutureGrid is trying to support the development of new applications and new system software, which are both rapidly changing.”
The FutureGrid collaboration, which will be headquartered at Indiana University, held its first all-hands meeting on 2-3 October.
“We will have early users throughout the first year,” said Fox. A small number of users are already signed up, but there remains room for more on the FutureGrid roster.
“We would like…

October 14, 2009

Feature - Putting Linux on the grid

Popular middleware flavors are now included as part of the standard selection box for Debian and Fedora users. Image courtesy Karen Andrews, stock.xchng

In the field of grid computing, Globus has long been a major brand. One of the earliest grid middleware solutions, the Globus Toolkit is not only a popular middleware flavor, but also offers important building blocks for many other grid solutions, including the ARC middleware produced by the KnowARC project.
Now, KnowARC has brought Globus and VOMS (the Virtual Organization Membership Service) to the Debian and Fedora Linux distributions. These packages are also available in Ubuntu, which imports packages from Debian automatically. They are likewise in EPEL (Extra Packages for Enterprise Linux), a Fedora-maintained add-on repository for Red Hat Enterprise Linux and derivatives such as CentOS and Scientific Linux.
The ARC middleware relies on a number of Globus libraries…

October 14, 2009

Feature - Supercomputing code helps develop new solar cells

Image courtesy of Patrick Moore.

If scientists could use simulations to zoom in on the atomic level of solar cells, the insight they gain could launch solar power into the next energy orbital.
Unfortunately, those simulations would require an exorbitant amount of computational power.
“Typically we need to simulate tens of thousands of atoms,” said Lin-Wang Wang, a scientist at Lawrence Berkeley National Laboratory. “For the conventional code, if the number of atoms increases by a factor of ten, the computational load increases by a factor of a thousand.”
In fact, the same problem arises with nano-scale simulations of a wide variety of materials. That’s why Wang and his research team came up with the LS3DF code.
“We were thinking about how to improve the algorithm and have linear scaling,” said Wang. When an algorithm scales linearly, the computational cost increases at the same rate as the size of the problem…
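A quick back-of-envelope calculation makes the difference concrete; the 1,000-atom baseline is chosen purely for illustration:

```python
# Conventional electronic-structure codes scale roughly as N^3 in the
# number of atoms; a linear-scaling code like LS3DF grows as N.
def relative_cost(n, base, exponent):
    return (n / base) ** exponent

base = 1_000  # atoms
for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} atoms: O(N^3) cost x{relative_cost(n, base, 3):>11,.0f}, "
          f"O(N) cost x{relative_cost(n, base, 1):>5,.0f}")
```

Ten times the atoms costs a thousand times more under cubic scaling, exactly as Wang describes, but only ten times more under linear scaling.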

October 7, 2009

Feature - An unexpected bounty of Near Earth Objects

Image of a near-Earth object detected by the Sloan Digital Sky Survey. The blue, red and green streaks show the object as it moves through three of the five SDSS filters over a period of five minutes. The two white objects are distant stars. Image courtesy Stephen Kent.

While scanning through images from the Sloan Digital Sky Survey, Fermi National Accelerator Laboratory researcher Stephen Kent noticed something unusual — a few extended streaks scattered among the millions of point-like stars and galaxies.
Kent realized the streaks were produced by Near Earth Objects (NEOs), asteroids or extinct comets whose orbits bring them close to Earth — close enough that they could collide. They appear as streaks because the closer an object is to Earth, the more quickly it moves across our sky. That’s why the patterns of distant stars appear unchanged over the course of our lifetimes, whereas our closest neighboring…
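That distance-speed relationship is simple geometry: for a fixed transverse velocity, apparent angular motion falls off as one over distance. A rough illustration, with the 20 km/s speed picked arbitrarily rather than taken from the survey:

```python
# Apparent angular speed of an object moving at transverse speed v at
# distance d: omega = v / d (small-angle approximation). Illustrative
# numbers only.
import math

AU_KM = 1.496e8  # kilometers per astronomical unit

def arcsec_per_minute(v_km_s, d_au):
    omega_rad_s = v_km_s / (d_au * AU_KM)
    return math.degrees(omega_rad_s) * 3600 * 60

for d_au in (0.05, 0.5, 5.0):  # near-Earth out to main-belt-like distances
    print(f"d = {d_au:>4} AU -> {arcsec_per_minute(20, d_au):8.2f} arcsec/min")
```

At 0.05 AU the object sweeps out a visible streak during a five-minute exposure, while at 5 AU it is essentially a point.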

September 30, 2009

Announcement - Eighth International Conference on Creating, Connecting and Collaborating through Computing now accepting papers

Image courtesy of C5

The paper submission deadline for the Eighth International Conference on Creating, Connecting and Collaborating through Computing (C5 2010) is 23 October 2009.
This year the conference will take place 25-28 January 2010 in La Jolla, California, a suburb of San Diego. According to the conference website, “C5 is an international forum for presenting ongoing work as well as new work currently under development and for discussing future needs and directions in creative computing and multimedia authoring environments. We welcome equally the submission of theoretical and technical papers, practitioner/experience reports and papers that bridge the gap between theory and practice.”
Topics of interest include, but are not limited to:

Collaboration & Communication
Technology-Human Interaction
Visualization
Virtual Worlds
Social Networks…

September 30, 2009

Feature - Sharing a drink from the data firehose

(Clockwise from top): Nural Akchurin, Sung-Won Lee, Alan Sill and Vanalet Rusuriye examine data transfer and local cluster performance for the Tier-3 center at Texas Tech University while remotely monitoring parameters of the CMS experiment. The mini-Remote Operations Center at TTU keeps the group in close contact with the CMS operations at CERN. Image courtesy Alan Sill, TTU

The Large Hadron Collider will generate a torrential flood of nearly half a gigabyte of data each second.
It’s too much data to simply record for later contemplation. It would fill your 160 GB iPod in about five minutes, and your 500 GB laptop in about 15 minutes. Instead, physicists will have to filter it, monitor it and analyze it, day in and day out.
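A quick check of the arithmetic behind those figures (the laptop estimate actually comes out closer to 17 minutes):

```python
# Time to fill familiar storage at the LHC's ~0.5 GB/s output rate.
rate_gb_per_s = 0.5
for name, capacity_gb in (("160 GB iPod", 160), ("500 GB laptop", 500)):
    minutes = capacity_gb / rate_gb_per_s / 60
    print(f"{name}: full in about {minutes:.1f} minutes")
```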
That will take the efforts of more than 7,500 scientists scattered around the world. Researchers found a way for the physicists who are not at CERN to assist in filtering, monitoring and analyzing the data remotely…

September 30, 2009

Q & A - Smart data handling: An interview with Tevfik Kosar

Image courtesy Tevfik Kosar

In e-science, we are constantly striving to improve performance and speed so that we can complete a larger number of more complex computations faster. Tevfik Kosar, a researcher at Louisiana State University, is working on two intertwined projects that could together lead to the sorts of improvements we hope for. Read on to find out what he had to say.
iSGTW: You just received a National Science Foundation grant to work on the Stork Data Scheduler. Can you tell us a little about that project?
Kosar: The funding for the development of Stork Data Scheduler comes from NSF's Strategic Technologies for CyberInfrastructure program. The STCI program funds innovative cyberinfrastructure services which have the potential to significantly advance research capabilities in multiple areas of science and engineering. The grant will provide three years of funding for the enhancement of the Stork Data Scheduler…

September 30, 2009

Virtualizing Rome in a day

A screenshot of the 3-D model of Rome's Colosseum, generated using several thousand Flickr photos. The black points surrounding the Colosseum indicate the vantage points from which photographs were taken. Image courtesy Building Rome in a Day project

Researchers at the University of Washington generated a 3-D model of Rome’s greatest landmarks in a day — and they did it by standing on the shoulders of Flickr users.
The researchers applied their algorithms to 150,000 Flickr images of Rome. It took a computer cluster containing 496 cores 13 hours to match common points in the images, sorting them into groups based on the landmark depicted. Another eight hours of computation time, and they had 3-D models of landmarks such as the Colosseum, St. Peter's Basilica, Trevi Fountain and the Pantheon.
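Some rough numbers hint at why the matching step dominated the computation: comparing every photo with every other one scales quadratically, so the system has to be clever about which pairs it actually examines:

```python
# Naive all-pairs image matching blows up quadratically.
n_images = 150_000
candidate_pairs = n_images * (n_images - 1) // 2
print(f"{candidate_pairs:,} candidate image pairs")  # ~11.2 billion

core_hours = 496 * 13  # the matching run described above
print(f"{core_hours:,} core-hours spent on matching")
```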
The Building Rome in a Day project, which is still in progress, aims to build a parallel distributed system that can use Flickr photographs to reconstruct…

September 23, 2009

Feature – New organization shakes up earthquake consortium

Earthquake engineers at University of Nevada, Reno test a 110-ft bridge model to failure. Image courtesy of Joan Dixon/University of Nevada, Reno.

We cannot stop earthquakes and tsunamis from happening. But with well-engineered buildings, we can prevent some of the death and damage these natural disasters leave in their wake.
First, however, engineers must understand how buildings react when shaken by earthquakes or pummeled by tsunami waves. To accomplish that goal, researchers use a combination of specialized equipment: giant tables that shake, wave tables filled with water, and high-end computing resources that can simulate just about anything.
To find out how sound a building will be during an earthquake, researchers can build a model on top of a large shake table. But most of the shake tables in the United States are not large enough to accommodate an entire building. Instead, they accommodate individual building components…

September 23, 2009

Project develops new standards for sharing between grids

Authorization Interoperability Project members, left to right, Oscar Koeroo (NIKHEF), Gabriele Garzoglio (Fermi National Accelerator Lab), and Frank Siebenlist (Argonne National Laboratory). Photo courtesy Open Science Grid.

Although the Grid is all about resource sharing, the software that governs individual grids has not always been capable of interacting well. The Grid Authorization Interoperability Project has created a new standard that could change that.
Grids make their computational and storage resources available online for use by others through software known as gateway middleware. To access a grid, a user presents her credentials—certification that she has rights to access that grid’s resources—to a resource gateway. The gateway in turn talks to an authorization system, local to the grid the user is accessing, in order to assign the appropriate privileges to the user.
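Here is a minimal sketch of that flow; all class and method names are invented for illustration, since the article doesn't detail the project's actual interfaces or the standard itself:

```python
# Sketch of the credential -> gateway -> authorization-system flow.
# Names and structures are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Credential:
    subject: str  # who the user is
    vo: str       # virtual organization vouching for her
    role: str     # e.g. "analysis" or "production"

class AuthorizationService:
    """Stands in for a grid-local authorization system."""
    def __init__(self, policy):
        self.policy = policy  # (vo, role) -> local privileges

    def decide(self, cred):
        return self.policy.get((cred.vo, cred.role))  # None means deny

class ResourceGateway:
    """Stands in for the gateway middleware the user contacts."""
    def __init__(self, authz):
        self.authz = authz

    def handle(self, cred):
        privileges = self.authz.decide(cred)
        if privileges is None:
            raise PermissionError(f"{cred.subject} denied")
        return f"{cred.subject} mapped to: {privileges}"

gateway = ResourceGateway(AuthorizationService(
    policy={("cms", "analysis"): "local account 'cmsusr', queue 'short'"}))
print(gateway.handle(Credential("alice", "cms", "analysis")))
```

The point of a common standard is that any gateway can talk to any grid's authorization service through the same interface, instead of each pairing needing custom glue.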
Most grids have independently developed…

September 23, 2009

Nanoporous materials for green technology

Pictured here, a metal-organic framework with pores approximately 1.4 nanometers in diameter. Image courtesy of David Dubbeldam.

A new class of materials with nano-scale pores could help to improve hydrogen fuel cells or reduce auto emissions.
Using a variety of TeraGrid resources, researchers at Northwestern University and Kansas State University were able to model and evaluate metal-organic frameworks to see how these materials will perform under specific circumstances. They found that metal-organic frameworks act much like sponges, soaking up hydrogen gas and storing it in their nano-sized pores. In fact, these metal-organic frameworks are capable of soaking up far more hydrogen gas than could normally occupy the same amount of space.
To learn more about this research and its further applications, visit the research group’s website.

September 23, 2009

NCSA signs onto Facebook

Image courtesy of NCSA.

All the cool kids are on Facebook, and the National Center for Supercomputing Applications, based at the University of Illinois, is no exception.
NCSA's Facebook fan page came online in mid-July. The page, which so far has 124 fans, is host to regular updates, job posts, videos, a photo gallery and some links.
 

Visit the NCSA Facebook fan page

Visit NCSA at its homepage


September 16, 2009

Feature - Visualizations go big in planetarium show

An image from Toomre and Brown's visualization of the magnetic field of the solar convection zone. Here, the sunspot is magnified for better visibility, and is not to scale relative to the sun. © 2009, American Museum of Natural History

"Journey to the Stars" is currently showing at the American Museum of Natural History’s Hayden Planetarium in New York City, US
The stars are writ large in all their majesty in “Journey to the Stars,” a planetarium show that uses grid-generated simulations to take audiences deep under the surface of the sun.
With Whoopi Goldberg as a guide, viewers embark on a journey through the lifespan of stars and the origin of life. Visualizations of the universe, projected onto the 87-foot, seven-million-pixel dome of the Hayden Planetarium in New York City, explain how stars first formed and then exploded to produce the chemical elements that make life possible.
The 25-minute journey culminates…

September 16, 2009

Newsflash - Fall conference line-up

At the EGEE conference in Barcelona, attendees can take sessions on ‘Grids, new media and video’ and ‘From abstract to international news story.’ Image courtesy stock.xchng

The fall conference season will take iSGTW readers to the shores of the Mediterranean in Barcelona, the heights of the Canadian Rocky Mountains in Banff, and the banks of the Columbia River in Portland. Read on to find out more about some of the booths and workshops you’ll find at each conference!
Enabling Grids for E-sciencE ’09, 21-25 September, Barcelona, Spain
Next week (21-25 September), the bulk of the European grid community will gather in Barcelona for the final conference of Europe’s flagship computing grid project, EGEE.
“With the transition from EGEE to the new European Grid Initiative at the forefront of everyone's minds, this final EGEE conference will be the perfect time for members of the grid community to promote their work…

September 9, 2009

Feature - Calming the wakefield

A snapshot of a simulation of the wakefield generated by a particle bunch moving through a series of ILC cavities, from three different perspectives. The colors represent the magnitude of the fields, with warmer colors representing the strongest fields.

For the International Linear Collider to run at maximum performance, each of its 27,000 cavities must be designed as precisely as possible.
It is very time-consuming and costly, however, to produce physical prototypes, so researchers at SLAC National Accelerator Laboratory decided to use a supercomputer to create and test virtual prototypes of the cavities.
The ILC, which is in its design phase, will use superconducting cavities to accelerate electrons and their antimatter partners, positrons, to nearly the speed of light before colliding them. By studying these collisions, researchers will be able to probe more deeply into the subatomic world.
As particle bunches travel through the accelerator cavities…

September 2, 2009

Announcement – Fifth International Symposium on Computational Wind Engineering now accepting abstracts

The Fifth International Symposium on Computational Wind Engineering (CWE2010) is now accepting abstracts for posters and oral presentations.
CWE2010, which will take place 23-27 May in Chapel Hill, NC, “will provide a platform for discussing and exchanging the latest information associated with the application of computational fluid dynamics (CFD) simulations to wind engineering problems and the tremendous advances in CFD technology in the past several years,” according to the event’s website.
The theme for 2010 is computational wind engineering applications for homeland and societal security. Four related plenary sessions are planned on the following topics:

Applying Computational Wind Engineering to Practice: Perspectives from the Political, Academic, Corporate, and Public Sector Community
Trends in High Performance Computing for Wind Engineering
Development…

September 2, 2009

Announcement – New program offers funding for up to 11 postdoctoral fellows

In a bid to stimulate research in computational science and engineering, Argonne National Laboratory has announced the launch of the Computational Postdoctoral Fellowship program.
According to the announcement, fellows will develop and implement advanced computational approaches aimed at scaling applications for high-end computing systems, and conduct large-scale simulations in their scientific discipline. Towards that end, they will have access to the resources of the Argonne Leadership Computing Facility, including a 557-teraflop IBM Blue Gene/P system, and Jazz, a teraflop-class computing cluster.
The program welcomes proposals for research in a wide variety of science and engineering disciplines, including biology, chemistry, earth science, engineering, materials science, nuclear energy, physics, and energy science.
To apply, candidates who have completed their doctoral requirements should send their curriculum vitae…

September 2, 2009

Feature - New batch of science gateways hits the spot

A visualization of a gauge configuration generated at NERSC which is now freely available via Gauge Connection, a NERSC science gateway that serves as an experimental gateway for the lattice quantum chromodynamics community. Image courtesy of NERSC.

Accessing high performance computing resources via the web can be as easy as everyday tasks such as paying bills, shopping and chatting with friends, thanks to a new Science Gateways project. But it wasn’t always that way.
The traditional method of accessing computing center resources is to log in, write and submit a small program called a batch script, and then wait for the results. Science gateways allow researchers to accomplish the same tasks through a web-based graphical user interface (if you can drive it with a mouse, it's a graphical interface). This means that scientists can dive right into the science without worrying about learning how to write a batch script.
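For readers who have never seen one, here is roughly what the traditional workflow involves; the PBS-style script is a generic example, and the resource limits, program name and file names are all placeholders rather than details from the article:

```python
# The traditional route: write a batch script, submit it, wait.
# Generic PBS-style example; every center has its own conventions.
import subprocess

batch_script = """\
#!/bin/bash
#PBS -N my_simulation
#PBS -l nodes=4:ppn=8,walltime=12:00:00
cd $PBS_O_WORKDIR
./run_simulation input.dat > output.log
"""

with open("job.pbs", "w") as f:
    f.write(batch_script)

# Requires a PBS-style scheduler on the machine; a gateway hides this.
subprocess.run(["qsub", "job.pbs"], check=True)
```

A science gateway wraps all of this behind web forms and buttons, so the scientist never touches the script.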

September 2, 2009

Link of the week – Flashback to 1967: computers are just hype

A screen grab of the story's opening spread.  Image courtesy of Modern Mechanix blog.

1967. Charlie Chaplin is out and Jimi Hendrix is in. It may be the summer of love, but it is also a year of riots and violence. While war rages in Vietnam, the space race is picking up speed.
In that atmosphere, it’s hardly surprising to see well-written, well-researched literary journalism appearing in Playboy. Max Gunther’s “Computers: Their built-in limitations” mixes piquant wit with an easy writing style that draws the reader in, creating an extremely well-crafted piece.
Too bad his criticisms of computing were so off the mark that today, his commentary has become unintentional satire.
There are many parts that may bring a smile to your lips. Here are a few gems:

Computers are just a status symbol. “To have a computer is ‘in.’ Even if you’re a scruffy little company that no…

September 2, 2009

Video of the Week - Earthquake simulation wins SciDAC award

The many visualizations that come out of high performance computing centers can be fantastically beautiful. This simulation of a 7.8 magnitude earthquake in Southern California is no exception.
That's probably why it was recognized recently at the SciDAC (Scientific Discovery through Advanced Computing) Vis Night awards as one of the top ten scientific visualizations of 2009.
The simulation, which has been affectionately dubbed 'The Big One,' used resources from the San Diego Supercomputer Center, the Texas Advanced Computing Center, the Southern California Earthquake Center, and TeraGrid.
This video of The Big One was recently posted on the WIRED Science Blog, along with the other nine award winners. Check out the rest, which are just as fantastic, at www.wired.com/wiredscience.
—Miriam Boon, iSGTW

August 26, 2009

Feature - Improving Alzheimer’s research, a million scans at a time

Alzheimer’s is one of the most feared diseases associated with aging. Fortunately, early detection can slow its development. Image courtesy stock.xchng

As you read this, your brain is busily working. In a complex but unconscious process, it scans the pixels on your screen, analyzes the images and turns them into meaningful information.
This week, a similar kind of work began with neuGRID, a project that might help keep our minds whirring wonderfully as we (alas) age.
This massive scanning project will feed 6,300 magnetic resonance (MR) scans from more than 700 patients — about 200 images per scan, making for an impressive total of 1,260,000 images — through an automated series of calculations. This “pipeline” will analyze the cortical thickness of the brain (a measurement aligned with brain health) and its deterioration over time. The images are from the Alzheimer’s Disease Neuroimaging Initiative…
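A quick check shows how the quoted figures fit together, with the 200 reading naturally as images per scan:

```python
# Checking the neuGRID numbers quoted above.
scans, patients, total_images = 6_300, 700, 1_260_000
print(total_images / scans)     # 200.0 images per scan
print(scans / patients)         # 9.0 scans per patient
print(total_images / patients)  # 1800.0 images per patient
```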

August 26, 2009

Image of the Week - Watch a tornado

Large-Eddy simulation of a tornado's interaction with the surface. Image courtesy Pittsburgh Supercomputing Center

Just what does the interior of a tornado look like as it swirls over the land?
To find out, researchers W.S. Lewellen, D.C. Lewellen and Aytekin Gel of West Virginia University made high-resolution, fully 3-D simulations in an attempt to answer questions about the character of the turbulent eddies in this unique flow.
The animated clip, made by Gel in close cooperation with the Pittsburgh Supercomputing Center’s (PSC) Scientific Visualization Group, uses particle advection to represent wind direction and isosurfaces to show pressure inside a 400 m × 400 m × 400 m domain, from a simulation that consumed approximately 100 hours on a Cray C90 supercomputer at PSC.
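Particle advection itself is easy to sketch: seed particles in the flow and integrate their positions through the velocity field. A toy 2-D version, with an idealized vortex standing in for the simulated tornado winds:

```python
# Toy particle advection: step particles through a velocity field.
# The vortex field here is an idealized stand-in, not simulation data.
def velocity(x, y):
    """Simple 2-D vortex centered at the origin."""
    return -y, x

def advect(particles, dt=0.01, steps=100):
    paths = []
    for x, y in particles:
        path = [(x, y)]
        for _ in range(steps):
            vx, vy = velocity(x, y)
            x, y = x + vx * dt, y + vy * dt  # forward-Euler step
            path.append((x, y))
        paths.append(path)
    return paths

paths = advect([(1.0, 0.0), (2.0, 0.0)])
print(paths[0][-1])  # the particle has swept partway around the vortex
```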
As a resource provider in TeraGrid, a National Science Foundation program of coordinated cyberinfrastructure for education and research, PSC works with its TeraGrid…

August 19, 2009

Feature - Recovery Act funds speed up high-speed Ethernet

Photo courtesy of Phil Edon, stock.xchng.

ESnet will build the world’s fastest supercomputing network, along with a test subnetwork for future technologies, using $62 million in American Recovery and Reinvestment Act funds.
ESnet, which is based at Lawrence Berkeley National Laboratory, described its plans for the network in a 10 August announcement. Dubbed the Advanced Networking Initiative, the network will serve as a pilot for 100 gigabit per second Ethernet technology.
“We’re moving to 100 gigabits because the standard today is 10 gigabits, and we already have individual streams of data that are bumping against that limit,” said Steve Cotter, ESnet department head, in a recent interview. “We’d like to have a system out there that can handle more.”
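Some illustrative arithmetic shows what the jump buys; the 100-terabyte dataset is an assumed size for the sake of the example, not a figure from ESnet:

```python
# Time to move a dataset at 10 Gb/s versus 100 Gb/s (ideal link, no overhead).
def transfer_hours(dataset_tb, link_gbps):
    bits = dataset_tb * 8e12          # terabytes -> bits
    return bits / (link_gbps * 1e9) / 3600

for gbps in (10, 100):
    print(f"{gbps:>3} Gb/s: {transfer_hours(100, gbps):5.1f} hours")
```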
The Initiative will build 100 gigabit connections between the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory…