
Content about Americas

January 20, 2010

Announcement: Internet2 meeting: proposals due

Internet2 Spring Member Meeting Call for Participation Deadlines Approaching
Emerging trends in cyberinfrastructure development and new federal stimulus funding opportunities will share center stage when the Internet2 community gathers in Arlington, Virginia, for its annual Spring Member Meeting, 26-28 April 2010. The conference serves as a forum for sharing information and best practices that address issues critical to meeting the advanced networking needs of researchers, educators, industry, and students.
Proposals for track sessions and side meetings are due 29 January 2010; poster session submissions are not due until 31 March 2010. Based on trends within the research and education community, the meeting’s program committee hopes to focus this year’s conference on topic areas including: federal broadband policy, healthcare networks, identity management federations, cloud computing, real time collaboration tools, green techn

January 20, 2010

PEGrid gets down to business

This image depicts a simulation of the water saturation changes in a quarter of a homogeneous oil reservoir over time, as water is injected. The water increases the pressure in the reservoir, pushing the oil to the surface. Because the reservoir is symmetrical, researchers were able to save time by simulating only one quarter of the reservoir.
The colors indicate the water saturation, with purple being highly saturated and red being least saturated. Each of the six slices represents a snapshot in time. Image courtesy of Shameem Siddiqui.
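As a rough illustration of how such symmetry saves work, a quarter-domain result can be mirrored back into a full field after the simulation finishes. The sketch below is purely illustrative, using synthetic numpy arrays rather than PEGrid output.

import numpy as np

# Illustrative only: a quarter-domain water-saturation field on a regular grid
# (synthetic values, not PEGrid output).
quarter = np.random.rand(50, 50)

# Mirror the quarter across both symmetry planes to reconstruct the full field.
top = np.hstack([quarter, np.fliplr(quarter)])   # reflect left-right
full = np.vstack([top, np.flipud(top)])          # reflect top-bottom

print(quarter.shape, "->", full.shape)           # (50, 50) -> (100, 100)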

Although Petroleum Engineering Grid officially started up in late 2009, it is already a classic example of how research projects can have unexpected benefits.
PEGrid came into existence using techniques and tools created by TIGRE (Texas Internet Grid for Research and Education), a state-funded project based on the use of federally funded middleware such as Open Science Grid’s Virtual Data Toolkit.
When TIGRE ende

January 20, 2010

Image of the week - Seeing with lasers

Top: LiDAR point cloud data for the Old Faithful area of Yellowstone National Park. Data source is EarthScope LiDAR hosted by the OpenTopography Facility. Image shows approximately 8.3 million individual LiDAR returns. The historic Old Faithful Inn is the structure at left. The Old Faithful Geyser is in the middle of the image.
Bottom: LiDAR digital elevation model (DEM)-derived image of Fish Springs cinder cone and the Owens Valley fault in eastern California produced from OpenTopography-hosted EarthScope LiDAR data.
Credit: Christopher Crosby, SDSC. Source: San Diego Supercomputer Center, UC San Diego.

The data used to create these images were gathered using LiDAR (Light Detection and Ranging), a technology that uses lasers to record precise, extremely high-resolution topographical information.
The San Diego Supercomputer Center’s OpenTopography portal provides free access to LiDAR data sets, including the EarthScope data
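For readers curious how a point cloud becomes a digital elevation model, the sketch below shows the general idea: bin the returns into grid cells and average their elevations. It uses synthetic points and says nothing about the actual OpenTopography processing pipeline.

import numpy as np

# Synthetic stand-in for LiDAR returns: columns are x, y, z (meters).
points = np.random.rand(100_000, 3) * [1000.0, 1000.0, 50.0]

cell = 10.0                      # DEM resolution in meters
nx = int(1000.0 / cell)
ny = int(1000.0 / cell)

ix = np.clip((points[:, 0] / cell).astype(int), 0, nx - 1)
iy = np.clip((points[:, 1] / cell).astype(int), 0, ny - 1)

# Average the elevation of all returns falling in each cell.
z_sum = np.zeros((ny, nx))
count = np.zeros((ny, nx))
np.add.at(z_sum, (iy, ix), points[:, 2])
np.add.at(count, (iy, ix), 1)
dem = np.where(count > 0, z_sum / np.maximum(count, 1), np.nan)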

January 13, 2010

Announcement - Last chance to apply for computational time on TeraGrid (midnight January 15, submitter’s local time)

Scientists, engineers, and other U.S. researchers may apply until January 15, 2010 (12:00 midnight local time) for the next quarterly review of requests for free allocations of high-performance computer time, advanced user support, and storage resources that are available through the National Science Foundation (NSF) Office of Cyberinfrastructure’s (OCI) TeraGrid. To apply for an allocation of any size, please visit TeraGrid’s online submission system.
Each quarter, a panel of computational experts known as the TeraGrid Resource Allocations Committee (TRAC) evaluates requests primarily on the appropriateness and technical aspects of using TeraGrid resources. Applications received by the January 15 deadline will be considered at the March 2010 TRAC meeting, and awards will be available for the one-year period from April 1, 2010 through March 31, 2011. TeraGri

January 13, 2010

Feature - Transferring FTP to the cloud: Off of desktop, out of mind

Kettimuthu’s team chose the 10 terabyte data set shown in this image for the Bandwidth Challenge. The data, which comes from the World Climate Research Program Coupled Model Intercomparison Project, simulates temperature change at the Earth’s surface and zonally-averaged throughout the atmosphere from 1900-2100. Image courtesy of Rajkumar Kettimuthu.

Don’t shut down. Don’t reboot. Don’t disconnect. And don’t even think about closing the window. Securely and rapidly transferring large amounts of data with GridFTP comes with a lot of “don’ts.”
That’s why the Globus Alliance team led by Steve Tuecke, a researcher at Argonne National Laboratory and the University of Chicago, decided to create a hosted data movement service dubbed Globus.org.
Take a 10 terabyte transfer using GridFTP as an example. “On a typical network it takes about two days,” said
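By way of context, a scripted GridFTP transfer typically means keeping a client such as globus-url-copy running on your own machine until the last byte arrives. The sketch below is a minimal illustration of that model, with hypothetical hostnames and paths; it is not how Globus.org works, which is precisely the point.

import subprocess

# Hypothetical endpoints; globus-url-copy is the standard GridFTP command-line client.
src = "gsiftp://source.example.org/data/climate/run01.tar"
dst = "gsiftp://dest.example.org/archive/run01.tar"

# -p 8: eight parallel TCP streams; -vb: report the transfer rate as it runs.
# The script (and the machine it runs on) must stay up until this call returns,
# which for multi-terabyte transfers can mean days.
subprocess.run(["globus-url-copy", "-p", "8", "-vb", src, dst], check=True)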

January 13, 2010

Image - Separating the real from the fake

“Return of the Hunters,” an image known to be made by Pieter Bruegel the Elder, a Dutch master active during the region’s ‘golden era’ in the 1500s. Oxford University art historian Martin Kemp described the painting in Nature as “testimony to the scientific observation of light and geographical features.” Original image in Kunsthistorisches Museum, Vienna, Austria.

 
Pieter Bruegel the Elder was a very popular painter with a huge body of work that was closely imitated; as a result, there are also many outright forgeries.
Now, however, researchers at Dartmouth College, New Hampshire, have found a new way to separate the real from the fake: mathematically analyzing images that are known to be genuine, such as “Return of the Hunters,” and comparing them to images whose authenticity is questionable.
Writing in the Proceedings of the National Academy of Sciences (PNAS), Daniel Rockmore describe
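The sketch below is not the Dartmouth team’s published method; it only conveys the general flavor of this kind of analysis, comparing simple statistics of wavelet subbands between a reference image and a questioned one. It assumes grayscale arrays and the PyWavelets package, both of which are stand-ins chosen for illustration.

import numpy as np
import pywt

def subband_stats(image, wavelet="db4", levels=3):
    """Mean and standard deviation of each detail subband of a grayscale image."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    stats = []
    for detail in coeffs[1:]:        # skip the coarse approximation band
        for band in detail:          # horizontal, vertical, diagonal details
            stats.extend([band.mean(), band.std()])
    return np.array(stats)

# Synthetic stand-ins for a known-genuine work and a questioned one.
genuine = np.random.rand(512, 512)
questioned = np.random.rand(512, 512)
distance = np.linalg.norm(subband_stats(genuine) - subband_stats(questioned))
print("feature distance:", distance)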

January 6, 2010

Feature - EELA-2 Conference

Fernando Liello gives a keynote at the 2nd EELA-2 Conference in Choroní, Venezuela. Image courtesy of EELA-2.

On 25-27 November, researchers from Latin America and Europe descended on the town of Choroní, Venezuela for the second EELA-2 Conference, hosted by the University of Los Andes and CeCalCULA.
Financed by the European Commission, the EELA-2 Project (E-science grid facility for Europe and Latin America) aims to promote grid technology in Latin America and Europe. EELA, which began four years ago, is a high-capacity, production-quality grid. Using the RedCLARA and GÉANT2 networks, EELA also created a digital bridge between Europe and Latin America.
Over this cooperation network the EELA-2 grid is supporting the development and testing of several advanced applications that are serving to improve not only scientific and technological development, but also everyday lives. The science supported by EELA-2 ranges from analyzi

December 16, 2009

Announcement - Call for papers: First ACM Symposium on Cloud Computing

The ACM Symposium on Cloud Computing 2010 is the first in a new series of symposia with the aim of bringing together researchers, developers, users, and practitioners interested in cloud computing. This series is co-sponsored by the ACM Special Interest Groups on Management of Data and on Operating Systems. ACM SOCC will be held in conjunction with ACM SIGMOD and ACM SOSP Conferences in alternate years, starting with ACM SIGMOD in 2010.
The scope of SOCC Symposia will be broad and will encompass diverse systems topics such as software as a service, virtualization, and scalable cloud data services. Many facets of systems and data management issues will need to be revisited in the context of cloud computing. Suggested topics for paper submissions include but are not limited to:

Administration and Manageability
Data Privacy
Data Services Architectures
Distributed and Parallel Query Processing
Energy Management
Geogra

December 16, 2009

Announcement - ParaPLoP 2010 accepting papers

Parallel programming enthusiasts are invited to submit papers for ParaPLoP 2010, a PLoP-style workshop on parallel programming patterns to be held 30 March-1 April in Carefree, Arizona. ParaPLoP 2010 also offers an opportunity for authors to participate in developing OPL – Our Pattern Language.
The Universal Parallel Computing Research Centers at Illinois and Berkeley have been collaborating to develop OPL, a pattern language for parallel programming. OPL covers the entire process of developing a parallel program: it begins with patterns for software architecture and for problem domains that often require parallelization, continues with patterns for parallel algorithm design and for common structures for implementing parallel algorithms, and ends with low-level patterns for controlling concurrent execution. The project is more ambitious than Design Patterns and has the potential to make an even bigger impact. The collaborators are looking for peopl
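As a concrete taste of the lowest level of such a pattern language, the sketch below shows one of the most common patterns, a data-parallel map, using Python’s multiprocessing module. The worker function is purely illustrative and is not drawn from OPL itself.

from multiprocessing import Pool

def simulate(seed):
    """Illustrative stand-in for an independent unit of work."""
    x = seed
    for _ in range(1000):
        x = (x * 1103515245 + 12345) % (2**31)
    return x

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # The "map" pattern: apply the same function to independent inputs in parallel.
        results = pool.map(simulate, range(100))
    print(len(results), "tasks completed")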

December 16, 2009

Feature - GRAPPAling with evolutionary history

This figure illustrates how gene order changes among the eight species. Each thin line represents a single gene and its position in the different species. Most genes are conserved on the same chromosomal arm, or Muller element, but gene order is shuffled between species. This figure appeared in the July 2008 issue of Genetics. Image courtesy of Arjun Bhutkar, Stephen Schaeffer et al., with permission from The Genetics Society of America.

We’ve known for several years now that chimpanzees share 96 percent of our DNA. Our technology tells us how closely humans and chimps are related. But it doesn’t tell us how we’re related. We need new technology for that.
Enter GRAPPA – or Genome Rearrangements Analysis under Parsimony and other Phylogenetic Algorithms if you want a mouthful. GRAPPA has already been used to analyze the evolution of organelles such as chloroplasts and mitochondria, running on cluster computers with
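GRAPPA itself is a high-performance code, but the kind of quantity it reasons about can be shown in a few lines. The toy sketch below computes a simple breakpoint distance between two signed gene orders; it is illustrative only and is not GRAPPA’s algorithm.

def breakpoint_distance(order_a, order_b):
    """Count adjacencies present in order_a that are absent in order_b.

    Gene orders are signed permutations; an adjacency (x, y) is preserved
    if order_b contains (x, y) or its reversal (-y, -x).
    """
    def adjacencies(order):
        return {(order[i], order[i + 1]) for i in range(len(order) - 1)}

    adj_b = adjacencies(order_b)
    adj_b |= {(-y, -x) for (x, y) in adj_b}
    return sum(1 for adj in adjacencies(order_a) if adj not in adj_b)

# Toy example: two species with the same five genes in different orders.
species_1 = [1, 2, 3, 4, 5]
species_2 = [1, -3, -2, 4, 5]                      # genes 2 and 3 inverted as a block
print(breakpoint_distance(species_1, species_2))   # 2 breakpoints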

December 16, 2009

Feature - An interview with the “Particle-Zoo Keeper”

Julie Peasley presents her particle zoo during a visit to the CERN library. She has created a plush toy version of the decay of a top quark for Fermilab. One of the artist’s plush toys can be turned inside-out with a zipper, opening up to reveal a big bottom quark with a mini anti-muon and a mini muon-neutrino. Image courtesy CERN.

There are many different ways to depict the workings of the Standard Model — typically involving things such as blackboard drawings or 3D computer animations. However, a Los Angeles artist has come up with a new, unorthodox approach, using hand-sewn, fuzzy plush toys, which she collectively describes as her “particle zoo.” A behind-the-scenes look into her work seemed appropriate for the holiday season.
Physicists consider that they have “seen” a particle when their detectors send an electronic signal and a spot appears on their compu

December 9, 2009

Announcement - National eHealth Collaborative seeks board members

Image courtesy NeHC.

National eHealth Collaborative is now accepting nominations for leaders in the health and healthcare fields to serve on the NeHC Board of Directors.
National eHealth Collaborative is a public-private partnership developed through an open, multi-year, multi-stakeholder process and operates under a cooperative agreement with the Office of the National Coordinator for Health Information Technology (ONC). As NeHC turns the corner into its second year, NeHC leaders are looking to their many stakeholders for recommendations on who should be asked to fill vacancies on the NeHC Board of Directors.
“By encouraging the broad adoption and use of health information technologies and electronic health information exchange, we help to create a true patient-centered health system. Public and private sector efforts to expand funding and develop policies and standards are having a significant impact in driving this mo

December 9, 2009

Case study: The GeoChronos web portal

Surface reflectance and ocean temperature, an example of Earth observation science. Image courtesy of Jacques Descloitres, MODIS Land Rapid Response Team, NASA/GSFC.

When GeoChronos launches, it will serve up a buffet of scientific and social networking ingredients that together empower Earth observation scientists to collaborate and make new discoveries.
The GeoChronos recipe didn't come out right the first time, however. The path the GeoChronos team has followed provides valuable insight into the process of creating a scientific web portal.
“The idea is that scientists can come to a portal where they process and share their data without having to worry about the overall technical details of how that’s being done,” said Cameron Kiddle, a research fellow for the Grid Research Centre at the University of Calgary in Alberta, Canada.
Social networking features and collaborative tools are a must for the project, and so the first GeoC

December 9, 2009

Feature - Observing oceans online

Overview map of the NEPTUNE Canada observatory off the west coast of Vancouver Island, British Columbia. The network, which extends across the Juan de Fuca plate, will gather live data from a rich constellation of instruments deployed in a broad spectrum of undersea environments. This system will provide free Internet access to an immense wealth of data, both live and archived, throughout the life of this planned 25-year project. Image courtesy NEPTUNE Canada.

Although the Earth is mostly water, scientists know relatively little about the ocean floor. But with the creation of ocean observatories such as NEPTUNE Canada, all that could change.
Until recently, scientists had to use cruise ships, satellites, and temporary probes to study the world’s oceans. This allowed them to take occasional snapshots of the ocean for later study. Ocean observatories are made up of more permanent installations of instruments directly on the ocean floor, along the co

December 9, 2009

Image of the week - Supercomputing 2009 exhibition floor

The exhibition floor at the annual Supercomputing conference is always a sight to see, and SC09 was no exception. To make a record of it, iSGTW endeavored to snap photos of as many research-related booths as possible. Whether you were at SC09, or you missed it, we hope you will enjoy this chance to virtually explore the exhibition hall.

December 2, 2009

I see crime scenes

IC-CRIME’s laser scanner technology will allow investigators to accurately record room and object dimensions, as well as the placement of every piece of evidence in a crime scene. Image courtesy NCSU.

Fighting crime with science isn’t as simple as popular TV shows like CSI would have you believe. But those shows get one thing right: science and technology have a tremendous potential for changing the way crime is investigated.
Today, investigators record crime scenes using sketches and photographs. Bullet trajectories are determined using lasers or even lengths of string. “They take a microsnapshot,” said Mitzi Montoya, a North Carolina State University researcher who specializes in knowledge and virtual team management. “You don’t really know what’s relevant in a crime scene, and you can’t go back and create it, because once it’s cleaned it’s gone forever.”
That could change if the IC-CRIME (Interdiscip

December 2, 2009

Feature - Predicting burglary with the grid

Photo courtesy Andy Fox, stock.xchng

Superheroes like Batman are not the only ones who can make use of sophisticated technology to fight crime. Nick Malleson, a researcher at Leeds University, has designed an intricate computer model to forecast burglary rates, which relies on the UK’s National Grid Service (NGS) to provide the necessary computing power.
Predicting crime is a tricky business, because the likelihood of a burglary can depend upon numerous human and environmental factors, all of which affect one another. In an attempt to forecast general trends, Malleson’s model simplifies the complexities surrounding crime prediction by using an “agent-based” model — one in which largely autonomous individuals, or “agents,” make decisions and perform actions which are influenced by the multiple individual factors within their environment.
In his model, potential burglars make decisions a
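The toy sketch below is not Malleson’s model; it only illustrates what “agent-based” means here, with burglar agents choosing targets from the limited set of houses they are aware of, weighing attractiveness against guardianship. All names and numbers are invented.

import random

# Toy environment: each house has an attractiveness and a guardianship score.
houses = [{"attractiveness": random.random(), "guardianship": random.random()}
          for _ in range(200)]

def choose_target(houses, awareness=20):
    """An agent samples the houses it is aware of and picks the best payoff-to-risk."""
    candidates = random.sample(houses, awareness)
    return max(candidates, key=lambda h: h["attractiveness"] - h["guardianship"])

burglaries = {}
for day in range(365):
    for agent in range(10):                      # ten burglar agents
        target = choose_target(houses)
        key = id(target)
        burglaries[key] = burglaries.get(key, 0) + 1

print("houses hit at least once:", len(burglaries))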

November 18, 2009

Feature - HSVO connects the dots

A screen capture of HSVO's patient simulator user interface. This mock-up of the patient simulator used videos from a training scenario in which students had to save the life of a teenager severely injured during a basketball game. An advanced mannequin stands in for the teenager. During this particular scenario, the students and mannequin were located in Montreal, the mannequin operator and a tutor were in Ottawa, and another tutor was located in Sudbury, Ontario. Image courtesy of McGill University and HSVO.

Don’t let the name of the Health Services Virtual Organization fool you. If HSVO is a success, it will be proof of concept for generic middleware that enables cloud-based workflows to access any number of services. And that could have implications for any scientific field.
Web portals that give researchers access to data, services, applications and computational resources are becoming increasingly common. Researchers can access a variety of servi

November 18, 2009

Image of the week - What do these pictures have in common?

Photo courtesy SOHO

In answer to our question, all of these images are of our sun, and were taken by SOHO — the Solar and Heliospheric Observatory, an international collaboration between the European Space Agency and NASA to study the sun from its deep core to its outer corona and its solar wind. Many of the images were modeled using a synthetic modeling program that allows astronomers to determine the temperature of surface layers, the sun’s chemical composition, and the relative abundance of the various elements. Known as SYNTSPEC, the modeling program runs on Baltic Grid and Lit Grid.

November 11, 2009

Feature - Big science facilities meet the cloud

Dylan Maxwell explains the Science Studio system to a bystander at Summit 2009 in Banff, Alberta. Photo by Miriam Boon.

Lab notebooks are so passé. In the brave new world of cloud computing, the entire experimental process will take place in your web browser.
And if a team of Canadian researchers at the University of Western Ontario and the Canadian Light Source in Saskatchewan has anything to say about it, researchers around the world will be using a web platform called Science Studio.
“One of the aims of Science Studio is to be able to access big science facilities such as the Canadian Light Source,” said Marina Fuller, a chemistry researcher with the project. “It’s a complete experiment management system.”
The test case for Science Studio is the VESPERS beamline at the Canadian Light Source synchrotron. When Science Studio is complete in 2011, researchers will be able to use the platform to apply

November 4, 2009

Feature - Getting GPUs on the grid

Russ Miller, principal investigator at CI Lab, stands in front of the server rack that holds Magic, a synchronous supercomputer that can achieve up to 50 Teraflops. Image courtesy of CI Lab.

Enhancing the performance of computer clusters and supercomputers using graphical processing units is all the rage. But what happens when you put these chips on a full-fledged grid?
Meet “Magic,” a supercomputing cluster based at the University at Buffalo’s CyberInfrastructure Laboratory (CI Lab). On the surface, Magic is like any other cluster of Dell nodes. “But then attached to each Dell node is an nVidia node, and each of these nVidia nodes have roughly 1000 graphical processing units,” said Russ Miller, the principal investigator for CI Lab. “Those GPUs are the same as the graphical processing unit in many laptops and desktops.”
That’s the charm of these chips: because they are mass-manufactured for use in your
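The article does not say how CI Lab programs Magic, but the general GPU programming model looks something like the hedged PyCUDA sketch below: copy data to the device, run the arithmetic across many GPU cores, and copy the result back. PyCUDA here is simply one illustrative choice, not a claim about CI Lab's software stack.

import numpy as np
import pycuda.autoinit            # initializes the first available CUDA device
import pycuda.gpuarray as gpuarray

# Two large vectors on the host.
a = np.random.randn(1_000_000).astype(np.float32)
b = np.random.randn(1_000_000).astype(np.float32)

# Copy to GPU memory, do the elementwise arithmetic on the device, copy back.
a_gpu = gpuarray.to_gpu(a)
b_gpu = gpuarray.to_gpu(b)
result = (a_gpu * b_gpu + 2.0).get()

print(result[:5])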

October 28, 2009

Announcement - IPDPS 2010, 19-23 April, Atlanta, USA

For five days in April, Atlanta, Georgia, will host the 24th IEEE International Parallel and Distributed Processing Symposium.
According to the website, “IPDPS is an international forum for engineers and scientists from around the world to present their latest research findings in all aspects of parallel computation. In addition to technical sessions of submitted paper presentations, the meeting offers workshops, tutorials, and commercial presentations and exhibits.”
The event will take place 19-23 April at the Downtown Sheraton Atlanta. Although the call for papers has already closed, a number of workshops will take place at the conference, most of which are accepting abstracts or papers through the end of November. These include:

Heterogeneity in Computing Workshop
Reconfigurable Architectures Workshop
Workshop on High-Level Parallel Programming Models & Supportive Environments
Workshop on Nature Inspired D

October 28, 2009

Feature - Dash heralds new form of supercomputing

Dash, pictured here, is an element of the Triton Resource, an integrated data-intensive resource primarily designed to support UC San Diego and UC researchers. Image courtesy of San Diego Supercomputer Center, UC San Diego.

The first of a new breed of supercomputers was born this fall when computer experts combined flash memory with supercomputer architecture to create Dash.
Normally supercomputers are measured by how many floating point operations, also known as “flops,” they can complete per second. And at a peak speed of 5.2 teraflops, Dash wouldn’t even make the top 500 list, where the slowest speed is about 17 teraflops.
“But if you look at other metrics, such as the ability to do input/output operations, it would potentially be one of the fastest machines,” said Allan Snavely, the project leader for Dash. “Dash is going after what we call data-intensive computing, which is quite different from
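For readers unfamiliar with the metric, a rough (and deliberately unscientific) way to estimate an achieved floating-point rate on any machine is to time a dense matrix multiply, as in the sketch below. The numbers it produces say nothing about Dash itself.

import time
import numpy as np

n = 2000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.time()
c = a @ b
elapsed = time.time() - start

# A dense n x n matrix multiply costs roughly 2 * n**3 floating point operations.
flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.1f} gigaflops achieved")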

October 28, 2009

Feature - In case of emergency, call SPRUCE

Part of the complete synthetic social contact network of Chicago, obtained by integrating diverse data sources and methods based on social theories. This sort of simulation can bring insight into how a virus will transmit through a population. Check out this SciDAC Review article for more information about this research. Image courtesy of Madhav Marathe and SDSC.

When disaster strikes, simulations could give authorities the information they need to save lives. But simulations are computationally intensive, and during a crisis, there’s no time to wait in line for access to computer resources. That’s where urgent computing comes in.
“What you really want is to be able to hook together or have access to all the supercomputers that you need, wherever they are,” said Pete Beckman, project lead for TeraGrid’s Special PRiority and Urgent Computing Environment, or SPRUCE. “The purpose of this sort of urgent com
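SPRUCE’s actual mechanism involves elevated-priority tokens honored by participating TeraGrid sites; the toy sketch below only illustrates the basic idea that urgent jobs jump ahead of routine work in a queue, and none of it reflects SPRUCE’s implementation.

import heapq
import itertools

queue = []                      # entries: (priority, submit_order, job_name)
order = itertools.count()       # tie-breaker so equal priorities run in submit order

def submit(name, urgent=False):
    priority = 0 if urgent else 1   # lower number runs sooner
    heapq.heappush(queue, (priority, next(order), name))

submit("routine climate run")
submit("routine genome assembly")
submit("hurricane storm-surge forecast", urgent=True)   # jumps ahead of routine work

while queue:
    _, _, name = heapq.heappop(queue)
    print("running:", name)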

October 21, 2009

Feature - Here to help: embedded cyberinfrastructure experts

It isn’t easy designing software that can run on a cluster like Fermilab's Grid Computing Center. That’s why advanced technical support is so essential. Photo by Reidar Hahn, Fermilab Visual Media Services.

Although much of today’s scientific research relies on advanced computing, for many researchers learning how to adapt and optimize applications to run on supercomputers, grids, clouds, or clusters can be daunting.
To help newcomers, many cyberinfrastructure providers offer in-depth support tailored to fit each user’s needs. This is much more than the typical technical support that helps users write scripts to enable their jobs to run. Instead, cyberinfrastructure experts are embedded directly into a user team to provide longer-term assistance.
One example is TeraGrid User Support and Services, led by director Sergiu Sanielevici.
“The designation of a supercomputer is that it’s basicall