
Content about Americas

August 11, 2010

 

Link of the Week - A new twist on summer camp: computing classes in the wild

Image courtesy Carlos Jaime-Barrios Hernandez

We’ve all heard of summer camp. But SuperComputing Camp (or SSCAMP, as it is known by its Spanish acronym) is a little different.
Starting on 15 August, 46 undergraduate and master’s students will learn about high performance computing, grid computing, volunteer computing and cloud computing — while staying in a hacienda near Panachi National Wildlife Park, just outside the small town of Piedecuesta, Colombia. The organizer, Carlos Jaime-Barrios Hernandez, says the idea is for students to learn in a natural environment, where they can explore and enjoy the great outdoors while having access to fully up-to-date facilities, including digital resources, projectors and live video feeds of keynote speeches and online lectures. They will connect remotely to the grid infrastructure via the web. Hernandez — a research scientist…

August 11, 2010

Video of the Week - Learning with multi-touch

Multi-touch technology has been around longer than you might think; experimental implementations have been surfacing since the early 1980s. This technology really hit the big time, however, when Apple released the first iPod Touch.
In 2008, the Renaissance Computing Institute at the University of North Carolina-Chapel Hill unveiled a multi-touch table that has since become an invaluable tool in the scientific visualization toolbox. At the same time, learning scientists, computer scientists, and psychologists from Virginia Tech and the University of Chicago formed itemL – interactive technologies for embodied mathematics Learning – and began investigating how young children (three to eight years old) interact with a multi-touch play table.
“We are collecting extensive data on the commercially available SMART Table while developing our own technology, TanTab,” explained Michael Evans, assistant professor of learning science and technologies at Virginia Tech…

August 4, 2010

Announcement - Campus Bridging Technologies Workshop calls for papers, registration

Registration for the Workshop on Software and Services for Campus Bridging, which will take place 26-27 August 2010 in Denver, Colorado, is now open. The organizers of the NSF-sponsored event also invite members of the community to share their experiences by submitting abstracts, which are due 23 August 2010.
This workshop is held under the auspices of the NSF ACCI Campus Bridging Taskforce (CBTF) and focuses on the role of cyberinfrastructure software and services in campus bridging. As laid out in the National Science Foundation’s “Cyberinfrastructure Vision for 21st Century Discovery,” cyberinfrastructure is a key and necessary component of support for science and engineering; campus bridging is the integrated use of user-local cyberinfrastructure with other cyberinfrastructure on the user’s campus, at other campuses, and at the regional, national, and international levels, as if they were proximate to the scientist…

August 4, 2010

Announcement - International Parallel and Distributed Processing Symposium call for papers

The IEEE International Parallel and Distributed Processing Symposium is now accepting abstracts for its 25th annual conference, which will take place 16-20 May 2011 in Anchorage, Alaska.
Anchorage, home to moose, bears, birds and whales, is strategically located at almost equal flying distance from Europe, Asia and the Eastern USA. Embraced by six mountain ranges, with views of Mount McKinley in Denali National Park, and warmed by a maritime climate, Anchorage offers year-round adventure, recreation, and sporting events. It is a fitting destination for IPDPS to mark a quarter century of tracking developments in computer science. To celebrate the 25th year of IPDPS, plan to come early and stay late and also enjoy a modern city surrounded by spectacular wilderness.
IPDPS is an international forum for engineers and scientists from around the world to present their latest research findings in all aspects of parallel and distributed processing…

August 4, 2010

Announcement - WORKS10 calls for papers

The 5th Workshop on Workflows in Support of Large-Scale Science, which will take place 14 November 2010 at SC'10 in New Orleans, Louisiana, USA, is now accepting papers.
Scientific workflows are a key technology enabling large-scale computations and service management on distributed resources. Workflows let scientists design complex analyses composed of individual application components or services; often, such components and services are designed, developed, and tested collaboratively.
The size of the data and the complexity of the analysis often mean that large amounts of shared resources, such as clusters and storage systems, are used to store the data sets and execute the workflows. The process of workflow design and execution in a distributed environment can be very complex, involving multiple stages including textual or graphical specification of the workflow and the mapping of the high-level workflow description onto the available resources…

August 4, 2010

Feature - The sun never sets on the GreenStar Network

The GSN project is led by Quebec’s École de technologie supérieure in Montreal. In this picture, the team behind the GreenStar Network poses next to the Communications Research Centre Canada’s GSN node. From left to right: Martin Brooks, Mathieu Lemay, Michel Savoie, John Spence, Bobby Ho. Image courtesy of John Spence.

When the sun sets on the Communications Research Centre in Ottawa, Canada, its solar-powered computational jobs might be sent across a high-speed connection to the Cybera data center in Calgary, where it’s still bright and sunny. And when the sun stops shining in Calgary, if the wind is blowing at the wind-powered BastionHost facility in Truro, Nova Scotia, then the jobs could be sent back east.
Most forms of renewable energy are not reliable at any given location. But Canada’s GreenStar Network aims to demonstrate that, by allowing computations to follow the renewable energy across a large…
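How might jobs chase green power around a network? Below is a minimal Python sketch of the “follow the renewables” scheduling idea described above. The site list, energy sources, and power checks are invented for illustration; a real scheduler would act on live telemetry and handle the actual migration of running jobs.

# A minimal sketch of "follow the renewables" scheduling. All sites and
# power-availability checks here are hypothetical stand-ins.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Site:
    name: str
    source: str                    # "solar" or "wind"
    has_power: Callable[[], bool]  # would be live telemetry in practice

def pick_site(sites):
    """Return the first site whose renewable source is currently producing."""
    for site in sites:
        if site.has_power():
            return site
    return None  # no green power anywhere right now: queue the jobs and wait

sites = [
    Site("Ottawa", "solar", lambda: False),  # the sun has set here
    Site("Calgary", "solar", lambda: True),  # still daylight
    Site("Truro", "wind", lambda: True),     # the wind is blowing
]

target = pick_site(sites)
print(f"Migrating jobs to {target.name} ({target.source})")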

August 4, 2010

 

Link of the Week - Move over, Deep Blue: Watson is here

Watson (center) competes against humans in a mock Jeopardy match.
Screenshot taken by Miriam Boon. Video courtesy of IBM.

To make artificial intelligence history, Deep Blue had to defeat chess grandmaster Garry Kasparov. Now those zany IBM AI researchers are at it again, pitting their latest experiment, Watson, against champions of a much more challenging game: Jeopardy.
Watson, which runs on a Blue Gene computer, uses a variety of separate algorithms to search its memory for answers to the clues posed on the show. Then it combines the results each algorithm returns, taking certainty into consideration, to come up with an answer. This technique has made a huge difference, according to the team lead, David Ferrucci, as quoted in a great article that appeared in the New York Times in June.
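As a rough illustration of that combination step (a toy sketch, not IBM’s actual DeepQA code), the Python below has several hypothetical scorers each propose candidate answers with a confidence, and picks the candidate with the highest certainty-weighted total.

# Toy sketch of combining candidate answers from several scorers,
# weighting each scorer's confidences by how much that scorer is trusted.

from collections import defaultdict

def combine_answers(scorer_results, weights):
    """scorer_results: one dict per scorer, mapping answer -> confidence (0-1)."""
    totals = defaultdict(float)
    for results, weight in zip(scorer_results, weights):
        for answer, confidence in results.items():
            totals[answer] += weight * confidence
    # The final answer is the candidate with the highest combined score.
    return max(totals.items(), key=lambda item: item[1])

# Three hypothetical scorers weigh in on the same clue.
scorers = [
    {"Chicago": 0.7, "Toronto": 0.2},
    {"Chicago": 0.5, "Springfield": 0.4},
    {"Toronto": 0.6, "Chicago": 0.3},
]

best, score = combine_answers(scorers, weights=[1.0, 0.8, 0.5])
print(best, round(score, 2))  # -> Chicago 1.25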

Whether Watson will win when it goes on TV in a real “Jeopardy”…

August 4, 2010

Opinion - A round-up of the grid in Chile

Palacio de la Moneda, the seat of government in Chile. Image courtesy Max Cossio, stock.xchng.

I was lucky enough to be a guest of REUNA in Chile, where I gave a presentation at their eScience Workshop in Santiago.
REUNA is a not-for-profit organization connecting 16 universities in Chile; it works to promote and advance grid computing in the region and to bring science and business together to exchange ideas. REUNA is the original NREN (national research and education network) in Chile, connecting the country to the 20,000 gridified institutions around the world via RedCLARA and the ALICE2 project.
The organization has been surveying the state of the art in the region, setting up grid pilot projects in areas such as climate change, bioscience and grid technology, transferring expertise through training, and drawing up an action plan for the future.
Many other areas of research are ripe for grid computing, including astronomy, engineering, education and physics. REUNA

July 21, 2010

Announcement - OSG Site Administrator's Workshop

The next OSG site administrator's workshop will be held at Vanderbilt University in Nashville, Tennessee, 10-11 August.
The workshop will overlap with the CMS Tier 3 workshop, which takes place 11-12 August in the same location. The focus will be on upgrading OSG compute elements (CEs) and storage elements (SEs) to the OSG 1.2 production release, with an emphasis on practical, hands-on issues relating to OSG site administration.
Reference material and tutorials for the sessions, as well as the planning page, are available on the OSG TWiki.

July 21, 2010

Announcement - OSG Storage Forum, 21-22 September

The OSG Storage Forum will be held at the University of Chicago, 21-22 September 2010.
Discussions will cover the scalability, performance and tuning of the storage technologies adopted by Tier-2 and Tier-3 sites. We will also discuss the storage requirements of analysis jobs.
Storage software developers and expert storage administrators will be invited to share tips on improving the stability and scalability of deployed storage systems. OSG storage administrators and representatives of various Virtual Organizations are encouraged to share their experiences and discuss common problems.
For more information, please visit the event’s Indico website.

July 14, 2010

Feature - Chat live with experts today

Welcome to iSGTW’s live chat page, where experts discussed everything you ever wanted to know about matching resources with projects but never had a chance to ask.
The live chat is over, but we invite you to join our panelists at Nature Network over the next week or so to continue the discussion and ask more questions. You can also view the archived chat log.


July 14, 2010

Feature - Fruitfly + flight studies + grid = flying robots?

The GUI pre-processor used to generate the geometry of the fly wings. All images courtesy Diego Scardaci, INFN

The study of the flight of a fruit fly may one day lead to the development of autonomous ‘micro-air vehicles’ (MAVs), or independent ‘flying robots,’ if scientists in Argentina have their way.
Working in conjunction with Italy’s National Institute of Nuclear Physics (INFN) under the EELA-2 initiative, they are conducting the study to understand the flight mechanisms of insects and small birds. Their hope is that someday ‘flying robots’ could be developed with maneuvering capabilities similar to those of insects — the most agile flying creatures on Earth.
MAVs could be used to study and explore places that are too dangerous or inhospitable for humans to tread; examples include search-and-rescue teams exploring buildings to detect fires, or scientists investigating…

July 7, 2010

Feature - Volunteer computing helps rescue oiled Gulf Coast wildlife

Map indicating the position of the Deepwater Horizon oil spill as of June 8, and globally important bird areas considered most at risk. Image courtesy American Bird Conservancy.

iPhone users who come upon oiled birds and other wildlife in the Gulf Coast region can immediately transmit the location and a photo to animal rescue networks using a free new iPhone application called MoGO (Mobile Gulf Observatory). It was developed by four University of Massachusetts-Amherst researchers to make it easier for the public to help save wildlife exposed to the oil spill in the Gulf of Mexico. With support from the National Fish and Wildlife Foundation, the UMass-Amherst researchers hope the MoGO app will draw on the large network of “citizen scientists” who are as heartbroken as they are to witness the disaster for marine life, and who are actively looking for ways to help save wildlife along…

June 30, 2010

Feature - Distributed computing experts gather in Chicago

At OGF29, Mark Morgan demonstrates XCG, a production grid maintained by the Genesis II team at the University of Virginia.
Image courtesy Miriam Boon, iSGTW.

Last week, distributed computing experts from around the world converged on Chicago to attend OGF29 and the ACM International Symposium on High Performance Distributed Computing 2010 (HPDC).
“It’s been a quite successful HPDC,” said Peter Dinda, the event’s program chair. “In terms of the grid context, it’s good to have it co-located with Open Grid Forum.”
Both events benefited from the co-location, reporting increases in attendance.
Although the conference chairs did not actively seek out programming on the topic of clouds, this emerging technology became a major theme throughout both conferences.
OGF29’s second day featured a status update on the Open Cloud Computing Interface, and cloud technology demonstrations by…

June 23, 2010

Announcement - NSF announces Smart Health and Wellbeing program

The National Science Foundation announced on 11 June 2010 a new cross-cutting program called “Smart Health and Wellbeing,” for which it is now accepting proposals.
Jeannette Wing, Assistant Director for NSF/CISE, wrote in a blog post, “We are looking for your great ideas for how advances in computer and information science and engineering can transform the nature and conduct of healthcare and wellness as we know it today.”
For more information, please visit the program solicitation.

June 23, 2010

Feature - Cancer researchers speed crystallography

Scientists have trained a system to recognize the formation of 3-D protein crystals, automating a time-intensive, manual process necessary for scrutinizing the structure of cancer-related proteins.
Image courtesy of IBM and the World Community Grid.

Using the World Community Grid, scientists at the Help Conquer Cancer Project have found a way to automate and speed up protein crystallography, according to a recent paper in the Journal of Structural and Functional Genomics.
X-ray crystallography is the process of using x-rays to map the structure of crystals. Although biological molecules such as proteins and DNA are not normally crystalline in form, they can be prompted to form crystals through exposure to the right chemical compounds. Once a protein is crystallized, scientists can use x-rays to map it; knowing the structure of a protein is invaluable to scientists who are trying to understand how it interacts with the human…

June 23, 2010

Video of the Week - When your visualization wall isn't enough

Getting a good sense of your data on a mere HD screen, however large, isn’t easy. And, sure, your friendly neighborhood visualization wall is an improvement, at about 13 million pixels projected across 84 square feet.
But wouldn’t it be better to stand in the center of a 2,800-square-foot sphere, interacting with your data in three dimensions at a resolution of 24 million pixels? Not to mention listening to audio cues in surround sound, projected from 500 individual speakers plus subwoofers.
Welcome to the AlloSphere, the visualization jewel of the California NanoSystems Institute at the University of California, Santa Barbara. Construction on the facility was completed in 2007, but the team behind the AlloSphere has continued to roll out additional features since then.
In this week’s video, JoAnn Kuchera-Morin, the woman behind the AlloSphere, speaks to the audience at TED 2009 about five different research projects that…

June 16, 2010

Image of the Week - A child’s-eye view of physicists (a gallery of ten images)

June 16, 2010

Back to Basics – Q&A on meshes

The geometry and adapted mesh for a patient-specific abdominal aorta aneurysm. Image courtesy of Min Zhou, Onkar Sahni, H. Jin Kim, C. Alberto Figueroa, Charles A. Taylor, Mark S. Shephard and Kenneth E. Jansen.

The word “mesh” seems to pop up all over the place. What exactly is a mesh? How about an adaptive mesh? Who uses them, and for what? This week, iSGTW interviewed Mark Shephard to learn more about meshes. Shephard is the director of the Scientific Computation Research Center (SCOREC) at Rensselaer Polytechnic Institute; his primary area of interest is the development of techniques to reliably automate multiscale simulations.
iSGTW: So, what is a mesh? When do you use these?
Shephard: In the context of modeling physical behavior over complex geometric domains that can be modeled by partial differential equations (PDEs), meshes are used to support the application of methods that can provide approximate but accurate solutions…
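To make the idea concrete, here is a toy Python sketch of one-dimensional adaptive refinement (an illustration only, not SCOREC’s software): starting from a uniform mesh, elements are halved wherever the solution changes too quickly, concentrating nodes where accuracy is needed most.

# Toy 1-D adaptive mesh refinement: halve any element across which the
# solution jumps by more than a tolerance. Illustrative criterion only.

import math

def refine(nodes, f, tol=0.5):
    """Insert a midpoint wherever f changes too quickly across an element."""
    out = [nodes[0]]
    for a, b in zip(nodes, nodes[1:]):
        if abs(f(b) - f(a)) > tol:   # large jump: halve this element
            out.append((a + b) / 2)
        out.append(b)
    return out

f = lambda x: math.tanh(20 * (x - 0.5))  # sharp transition near x = 0.5
mesh = [i / 10 for i in range(11)]       # uniform starting mesh on [0, 1]

for _ in range(3):                       # a few refinement passes
    mesh = refine(mesh, f)

smallest = min(b - a for a, b in zip(mesh, mesh[1:]))
print(f"{len(mesh)} nodes; smallest element {smallest:.4f}")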

June 9, 2010

caBIG® 2010 call for abstracts extended; registration now open

Attendees at the 2010 caBIG® Annual Meeting to be held September 13-15 at the Marriott Wardman Park Hotel in Washington, D.C., will be immersed in the latest developments and applications of caBIG® technology and will be presented with new opportunities for participation. 
The meeting will feature:

Guidance to help achieve interoperability—including adaptation of existing systems, adoption of industry-recognized standards and common data elements, or development and deployment of services and tools that are interoperable with caBIG®
Case studies and lessons learned from peers who have deployed and developed caBIG® tools and infrastructure
Access to resources that will help you achieve your goals for data integration, exchange, and collaboration, including hands-on and instructor-led educational sessions on caBIG® tools and infrastructure
Descriptions of caBIG®-enabled…

June 9, 2010

Grace Hopper conference registration opens

Registration is now open for the 10th annual Grace Hopper Celebration of Women in Computing conference, which will be held 28 September – 2 October 2010 at the Hyatt Regency in Atlanta, Georgia. The full program for the event is also now available.
The world's largest gathering of women in computing in industry, academia, and government, the Grace Hopper Celebration of Women in Computing (GHC) is now a five-day technical conference designed to bring the research and career interests of women in computing to the forefront. Its more than 600 speakers will include leading researchers and industry experts discussing their current work, while special sessions focus on the role of women in today's technology fields, including computer science, information technology, research, and engineering. Co-presented by the Anita Borg Institute and the Association for Computing Machinery (ACM), the conference has expanded this year to feature more than 110 sessions…

June 9, 2010

BP oil spill: Scientists mobilize to create new disaster response science

The Gulf's wildlife is increasingly being affected by the spill. Image courtesy of NOAA.

Less than two weeks after the BP Deepwater Horizon oil rig explosion killed 11 and began leaking between two and four million liters of oil per day, the calls started coming in. The oil would soon reach the Louisiana coast, where it would do untold amounts of damage to the local marshes, wetlands, and channels. Could the team that successfully modeled hurricane storm surges along the Louisiana, Mississippi, and Texas coastlines help?
“We started working on the project fairly quickly, probably around the 10th of May,” said Clint Dawson, head of the Computational Hydraulics Group at the Institute for Computational Engineering and Sciences at The University of Texas at Austin.
With the highly accurate descriptions of the Gulf of Mexico’s coastline that Dawson and his colleagues previously used for hurricane simulations…

June 9, 2010

Feature - Seeing particles with VPM

VPM Interview from Renaissance Computing Institute on Vimeo.

Before we can make use of data, we need to make sense of it. But with complex concepts such as particulate air pollution, you could just as easily drown in the data.
And that’s exactly what was happening when NASA first approached Uma Shankar, an atmospheric scientist at the Institute for the Environment at UNC Chapel Hill, to ask what sort of advanced visualizations the particulate matter research community needed.
After some thought, Shankar suggested an application to visualize particulate matter across the range of sizes in which it occurs.
“Particulate matter has such important impacts on a variety of air quality issues, especially human health,” Shankar explained. “Now we better understand the connection between particulate matter and climate, so its importance is even greater than we originally understood.”
Despite this better understanding, the existing visualization…

June 9, 2010

 

Link of the Week - If it were my home

When centered on West Hartford, Connecticut, the oil spill stretches over seven U.S. states.
Image courtesy Miriam Boon, taken as a screenshot on 4 June 2010.

It isn’t easy to get perspective on a major disaster such as the BP-Deepwater Horizon oil spill.
But this week’s link of the week, “If it was my home,” can at least help us to understand the scale of the oil spill.
The website automatically detects your location, then superimposes an approximate image of the oil spill, based on NOAA data, centered on where you are. You can move the oil spill around by entering other locations, or by clicking the “Put it back in the gulf” button.
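How might such a recentering work? Here is a rough Python sketch (an assumption about the mechanics, not the site’s actual code): shift every point of the spill outline by the offset between the outline’s centroid and the user’s location. The coordinates below are invented for illustration.

# Recenter a spill outline on a user's location by translating every
# point by the centroid-to-user offset. Coordinates are (lat, lon) pairs.

def recenter(outline, user_lat, user_lon):
    n = len(outline)
    c_lat = sum(lat for lat, _ in outline) / n
    c_lon = sum(lon for _, lon in outline) / n
    d_lat, d_lon = user_lat - c_lat, user_lon - c_lon
    # A real implementation would also rescale longitudes, since degrees
    # of longitude shrink toward the poles.
    return [(lat + d_lat, lon + d_lon) for lat, lon in outline]

# Hypothetical three-point outline near the spill site, recentered on
# West Hartford, Connecticut.
spill = [(28.7, -88.4), (29.5, -89.9), (28.0, -90.5)]
print(recenter(spill, 41.76, -72.74))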
By seeing how much familiar ground the oil spill would cover, you can get a better sense of the spill’s physical size.
Of course, the information displayed on this map is just the tip of the, ah, oil; much of the spill remains underwater.
It is…

June 9, 2010

Video of the Week - Simulations show scenarios for oil spill

This animation shows one scenario of how oil released at the location of the Deepwater Horizon disaster in the Gulf of Mexico may move in the upper 65 feet of the ocean. This is not a forecast, but rather, it illustrates a likely dispersal pathway of the oil for roughly four months following the spill. It assumes oil spilling continuously from April 20 to June 20. The colors represent a dilution factor ranging from red (most concentrated) to beige (most diluted). The dilution factor does not attempt to estimate the actual barrels of oil at any spot; rather, it depicts how much of the total oil from the source will be carried elsewhere by ocean currents. For example, areas showing a dilution factor of 0.01 would have one-hundredth the concentration of oil present at the spill site.
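In code, the dilution factor is just a linear scaling of the concentration at the source; the source value below is an arbitrary illustrative number.

# Worked version of the dilution-factor arithmetic described above.

def diluted_concentration(source_concentration, dilution_factor):
    return source_concentration * dilution_factor

c0 = 100.0                              # arbitrary units at the spill site
print(diluted_concentration(c0, 0.01))  # -> 1.0, one-hundredth of c0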
The animation is based on a computer model simulation using a virtual dye, and assumes weather and current conditions similar to…