

December 15, 2010

A new grid application may help biologists solve the structures of mystery proteins.

December 15, 2010

Original courtesy David Alan Grier

It may look like just another Yuletide scene in an office in 1958, but this one is something special.

December 8, 2010

Commentators suggest that Wikipedia and other collaborative network-driven projects, such as the Linux computer operating system, could be an emerging socioeconomic paradigm no less radical and disruptive than the Industrial Revolution.

Wikipedia is meant to be a new way of doing things in a world of ubiquitous electronic information and social networks, one that may be changing the conduct of everything from scientific research to political campaigns.

Sociological commentary and predictions aside, however, do Wikipedia and other “crowd-sourced” efforts really function so differently? Purdue communications researcher Sorin Adam Matei and his team are testing the concept by analyzing Wikipedia articles and revisions produced between 2001 and 2008 – a computationally demanding task. They’re finding that Wikipedia may have more in common with an old-fashioned factory floor than we think.

In theory, the collaborative online encyclopedia’s entries are created, edited and maintained with little in the way of traditional, hierarchical organizational and leadership structures. The production of Wikipedia has been characterized as an emergent system, like an ant colony, resulting from collective actions of individual participants with the “wisdom of the crowd” yielding a viable outcome.

December 8, 2010

Were you wishing you could have been at CloudCom 2010 last week?

We can’t transport you back in time, but thanks to the IEEE Computer Society, you do have the opportunity to watch many of the presentations you missed.

December 1, 2010

New computing tools could save graduate students from thousands of hours spent visually inspecting historical maps, quilts, medieval art, and manuscripts.

Humanities researchers use laborious visual inspection of historical materials to determine authorship and artistic lineage for manuscripts, maps, and quilts. A typical research project could occupy several graduate students for two or more years.

“We are interested in understanding how to support these domain scientists so they are more efficient,” said Peter Bajcsy, principal investigator for the Digging into Image Data to Answer Authorship-Related Questions (DID-ARQ) project at the National Center for Supercomputing Applications. He added, “We are giving them tools that will automate the visual inspection process.”

Those tools are the product of extensive collaboration between humanities researchers and Bajcsy’s team. To create them, the computer experts engaged in lengthy discussions to determine what questions the humanities researchers seek to answer, and exactly how they would answer them using traditional methods. Their inquiries sought out details such as which image features researchers use – consciously or unconsciously – to identify the author or artist. Then the computer scientists developed algorithms that could rapidly identify those features automatically.
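As a purely hypothetical illustration of this kind of pipeline, one simple automatically extractable feature is a coarse color histogram, with an unattributed work assigned to the author whose known work it most resembles. The feature and the matching rule below are assumptions for illustration, not the DID-ARQ project's actual algorithms:

```python
from collections import Counter

def color_histogram(pixels, bins=4):
    """Quantize RGB pixels into a coarse, normalized color histogram.
    A deliberately simple stand-in for a real image feature."""
    hist = Counter()
    for r, g, b in pixels:
        hist[(r * bins // 256, g * bins // 256, b * bins // 256)] += 1
    total = sum(hist.values())
    return {bucket: count / total for bucket, count in hist.items()}

def similarity(h1, h2):
    """Histogram intersection: 1.0 for identical palettes, 0.0 for disjoint."""
    return sum(min(h1.get(b, 0.0), h2.get(b, 0.0)) for b in set(h1) | set(h2))

def attribute(unknown, labeled):
    """Assign the unknown work to the author with the most similar feature."""
    return max(labeled, key=lambda author: similarity(unknown, labeled[author]))
```

Real systems would of course use far richer features (stitch patterns, brushwork, letterforms), but the structure is the same: encode what the expert looks for, then compare automatically.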

December 1, 2010

At a recent workshop, physicists from several LHC experiments compared results.

December 1, 2010

Some kids can make anything cute, and these are no exception.

November 17, 2010

Announcement - Last chance: Data Center Infrastructure Management, Nov. 16-18, Merit Network

Photo courtesy Merit

The three-day "Data Center Infrastructure Management" online learning class will be available November 16-18 through Merit's Professional Learning program. Merit is a nonprofit corporation, owned and governed by Michigan's public universities, that provides high-performance networking solutions to public universities, colleges, K-12 organizations, libraries, state government, healthcare, and other nonprofit organizations.
Instruction will be entirely online — you can attend from anywhere with an Internet connection. You can also attend at Merit's offices if you need a space away from your daily demands. The course will provide knowledge about the professional management of data facilities, which is increasingly important for organizations of all types and sizes. It is of interest to individuals who manage data centers or server rooms, IT staff with hardw

November 17, 2010

Feature - Life at the extreme at the Pierre Auger Observatory

The Pierre Auger Observatory has a detection area of 3,000 km², so large that it is best seen by airplane. A space-based successor with a detection area hundreds of times greater is already being planned: the JEM-EUSO will be attached to the International Space Station in 2013. It will use large volumes of the Earth's atmosphere to detect and observe particles colliding with the planet's magnetic field. All images courtesy Pierre Auger Observatory

Some people enjoy living life at the edge, such as participants in extreme sports. At the other extreme are those who relish watching rare events. Among the latter are astronomers at the Pierre Auger Observatory, a multinational collaboration to detect the 'light signature' given off as cosmic rays hit particles in our atmosphere. Based in Argentina, the observatory monitors ultra-high-energy cosmic rays — spectacular examples of some of nature

November 17, 2010

Feature - The 1970s in the 21st century: synthesized music returns (via parallel processing)

This Arp 2500 analog modular synthesizer from 1971 had hundreds of inputs and outputs for control of the synthesis processes. Image courtesy of

Curtis Roads is a professor, vice chair, and graduate advisor in media arts and technology, with a joint appointment in music at the University of California at Santa Barbara. He was also editor of the Computer Music Journal (published by MIT Press) and co-founded the International Computer Music Association. He is often a featured speaker at conferences such as Supercomputing.
Music is an interactive, concurrent process. A note or a chord sounds, then is replaced, gradually or sharply, softly or powerfully, by the next one. For electronically produced or enhanced music, real-time technical advances are critical to continued progress and exploration. In the 1970s, I fondly remember learning my first parallel

November 17, 2010

Image of the Week - Elegance of darkness

When galaxies collide. Original courtesy Argonne National Laboratory

What you see is the collision of two galaxies over billions of years, albeit virtually. As physicists at CERN investigate the smallest particles in the universe, US scientists are studying the behavior of the largest cosmic structures in existence. A team at the University of Chicago Flash Center and the Harvard-Smithsonian Center for Astrophysics used an Argonne National Laboratory supercomputer to identify elusive dark matter. The researchers simulated the motion and collision of galactic clusters — some of the largest structures in the universe — to infer dark matter's influence, as it cannot be observed directly. Dark matter greatly influences gas and galaxies over trillions of light years. Furthermore, these collisions more accurately predict the interaction of both normal and dark matter (it is thought dark matter constitu

November 17, 2010

Link of the Week - Coming to an iPhone near you

Image courtesy Flickr under Creative Commons licence

Since the story in iSGTW last year about Cinefilia, the grid-enabled film recommendation service, its creator and sole webmaster, Leandro Ciuffo, says his user base has increased by 27% — without any direct promotion or advertising. Once users have signed up for a Cinefilia account, they can record whether they like or dislike any of the hundreds of films in the database. The system then 'learns' each user's preferences and generates personalized recommendations accordingly. (For the results to be accurate, a user must rate a minimum of 20 films.)

Ciuffo aims to increase the number of Brazilian films in the database because 95% of the users on his site are Brazilian, possibly because there are currently no other recommendation systems for Brazilian films. Ciuffo is looking for partners to help him improve the recommendation software algo
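A minimal sketch of the general technique behind services like this, user-based collaborative filtering over like/dislike votes, might look like the following. This illustrates the approach in general; it is not Cinefilia's actual algorithm, and the film names in the usage example are placeholders:

```python
def recommend(ratings, user, k=2):
    """User-based collaborative filtering over like/dislike votes.
    `ratings` maps user -> {film: +1 (liked) or -1 (disliked)}.
    Illustrative sketch only, not Cinefilia's actual algorithm."""
    def agreement(other):
        # Agreements minus disagreements on commonly rated films.
        common = ratings[user].keys() & ratings[other].keys()
        return sum(ratings[user][f] * ratings[other][f] for f in common)

    # The k most like-minded users.
    neighbors = sorted((u for u in ratings if u != user),
                       key=agreement, reverse=True)[:k]

    # Tally neighbors' votes on films the user hasn't rated yet.
    scores = {}
    for u in neighbors:
        for film, vote in ratings[u].items():
            if film not in ratings[user]:
                scores[film] = scores.get(film, 0) + vote

    # Recommend net-positively rated films, best first.
    return sorted((f for f, s in scores.items() if s > 0),
                  key=lambda f: -scores[f])
```

The 20-film minimum mentioned above makes sense in this framing: with too few ratings, the `agreement` scores are dominated by noise and the wrong neighbors get chosen.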

November 10, 2010

Announcement - Cybera Summit videos available

Attendees network at Cybera Summit 2010. Photo courtesy Joni Evans.

Thank you to everyone who joined us in Banff for our annual fall event, Cybera Summit 2010: Driving Alberta's Digital Evolution, which brought together a diverse group of innovation enthusiasts and trailblazers to explore emerging technologies and R&D initiatives in three hot-topic tracks:

Bridging the Digital Divide
Setting R&D Priorities
Building Collaborative Networks

For a quick Summit recap and impressions from those who took part in the event, videos of the event's keynote speakers, and slides of speaker presentations, click here.

November 10, 2010

Feature - Reaching for sky computing

Photo copyright Tom Raven, CC 2.0

Sometimes, a single cloud isn’t enough. Sometimes, you need the whole sky.
That’s why a number of researchers are developing tools to federate clouds, an architectural concept dubbed “sky computing” in a paper published in the September/October 2009 issue of IEEE Internet Computing (PDF).
“Sky computing is a tongue-in-cheek term for what happens when you have multiple clouds,” explained Kate Keahey of the University of Chicago, who co-authored the paper alongside University of Florida researchers Mauricio Tsugawa, Andréa Matsunaga, and José Fortes.
“[In the paper] we talked about standards and cloud markets and various mechanics that might lead to sky computing over multiple clouds, and then that idea was picked up by many projects,” Keahey added.
Among those inspired by the concept was Pierre Riteau, a doctoral student in the computer science departmen
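The idea can be made concrete with a toy scheduler that treats several clouds as a single resource pool. Everything here (the `Cloud` model, the greedy placement rule, the cloud names) is a hypothetical sketch of the concept, not the software described in the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Cloud:
    """A toy model of one cloud provider: a name and spare capacity."""
    name: str
    free_cores: int
    jobs: list = field(default_factory=list)

class Sky:
    """Federate several clouds: place each job on whichever cloud
    currently has the most free cores."""
    def __init__(self, clouds):
        self.clouds = clouds

    def submit(self, job, cores):
        target = max(self.clouds, key=lambda c: c.free_cores)
        if target.free_cores < cores:
            raise RuntimeError("no single cloud can host this job")
        target.free_cores -= cores
        target.jobs.append(job)
        return target.name
```

Real sky computing has to solve the hard parts this sketch ignores: virtual-machine image portability, networking across providers, and trust between sites, which is what the standards and markets mentioned in the quote address.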

November 3, 2010

Feature - LHC open to all

An actual recorded event from the Compact Muon Solenoid experiment—this event shows radiation and charged particles spilling into the detector from the beam colliding with material in the beam pipe.
Image courtesy Carl Lundstedt

Occasionally, iSGTW runs across stories in other publications related to the fields we cover. Below is an excerpt from Linux Journal, containing one person’s view of the whole process.
One of the items at the heart of the Large Hadron Collider (LHC) experiments is open-source software. The following will provide a glimpse into how scientific computing embraces open-source software and its open-source philosophy.
The LHC at CERN near Geneva, Switzerland, is nearly 100 meters underground and produces the highest-energy subatomic particle beams on Earth. The Compact Muon Solenoid experiment is one of the many collider experiments within the LHC. One of its goals is to give physicists a window into the universe fractions

October 27, 2010

Announcement - 2010 NWChem Workshop

A workshop on the computational chemistry package, NWChem, will take place 1-2 December 2010 at the National Center for Supercomputing Applications in Urbana, Illinois.
NWChem is a computational chemistry package that can be used to perform electronic structure calculations on molecular and periodic systems as well as classical molecular dynamics simulations. It is designed to run on high-performance parallel supercomputers as well as conventional workstation clusters. The aim of the program is to provide scalable solutions for large-scale atomistic simulations. It has been ported to almost all high-performance computing platforms, workstations, PCs running Linux, as well as clusters of desktop platforms or workgroup servers.
NWChem includes a range of capabilities, including Hartree-Fock, Density Functional Theory (including most of the state-of-the-art exchange-correlation functionals), higher-order many-body approaches such as Coupled Cluster Theory and MP2, relativisti
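To give a flavor of how the package is driven, a minimal input deck for a single-point Hartree-Fock calculation on water might look like the sketch below. The directives and geometry are illustrative of NWChem's general input style; consult the NWChem documentation for the exact syntax before use:

```
start h2o_scf
title "Water single-point SCF energy, STO-3G"

geometry units angstrom
  O   0.000   0.000   0.000
  H   0.757   0.586   0.000
  H  -0.757   0.586   0.000
end

basis
  * library sto-3g
end

task scf energy
```

The same `task` directive pattern selects the other methods listed above (for example, DFT or coupled cluster) without changing the rest of the deck.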

October 27, 2010

Announcement - UP 2010 Cloud Computing Conference

Photo courtesy UP2010

October 27, 2010

Feature - Rethinking scientific data management

The reading room of the George Peabody Library at Johns Hopkins University.
Image courtesy Leafar / Raphael Labbe, CC 2.0.

Despite all the good that science has wrought over the years, the way we manage scientific data is fundamentally flawed.
Sir Isaac Newton once said, “If I have seen further it is only by standing on the shoulders of giants.” Scientists stand on the shoulders of their peers and predecessors via peer-reviewed literature. The idea is that the literature is reliable by dint of being peer-reviewed, and thus researchers can safely build upon what they learn from it. Yet neither the reviewers who admitted those papers into the annals of scientific canon nor the scientists who wish to build upon it have access to the data used to produce those papers.
That means that they cannot ensure that they stand on solid ground by examining the data and doing their own analysis. They cannot analyze the data using altern

October 27, 2010

Feature - Theorists find dark matter evidence in open data

A visualization of the Fermi Gamma-ray Space Telescope.
Image courtesy of NASA and General Dynamics.

Dan Hooper and Lisa Goodenough are not part of the Fermi Gamma-ray Space Telescope collaboration. But by using FGST’s publicly released data, they were able to find clues to some of the universe’s juiciest secrets at the center of the Milky Way.
In their analysis, Hooper, a Fermilab theorist, and Goodenough, a graduate student at New York University, report that very-high-energy gamma rays coming from the center of the Milky Way originate from dark-matter collisions.
“We went out of our way to consider all causes of backgrounds that mimic the signal, and we found no other plausible astrophysics sources or mechanics that can produce a signal like this,” Hooper said.
A recent paper, published on the pre-print server arXiv, outlines their findings.
Astrophysicists have long postulated a wide range of

October 27, 2010


Link of the Week - rolls out IaaS

Image courtesy of

Over a year ago the US General Services Administration launched, an online store where government agencies and bodies can shop for and purchase software as a service.
Now, the GSA has announced that will soon provide access to cloud storage, virtual machines, and other forms of Infrastructure as a Service. To learn more, read the original press release at our link of the week.
—Miriam Boon, iSGTW

October 20, 2010

Announcement - CERN Latin-American School of High Energy Physics, Natal, Brazil, 23-25 April 2011

Photo courtesy CERN

The CERN Latin-American School of High-Energy Physics encourages experimental high-energy physics students in the final years of their PhD to apply. Masters and post-doctoral students are also welcome on the course. There are a limited number of places, so early application is advisable. Please be aware that prior knowledge of high-energy physics is required in order to benefit fully from the programme.
The school is being organized jointly by the European Laboratory for Particle Physics (CERN), Geneva, Switzerland; CIEMAT, the research organization of the Spanish Ministry of Education and Science; and a team of local organizers from institutes in Brazil.
Successful applicants will be housed in the Hotel Porto do Mar, which provides conference facilities for lectures and discussion sessions. The hotel also has sports and leisure facilities that will be

October 20, 2010

Feature - Climate model tackles clouds


Animation from the NICAM model simulation of 21 May - 31 August 2009, showing cloudiness (based on outgoing long-wave radiation) in shades of gray and precipitation rate in rainbow colors, based on hourly data from the simulation. Thicker clouds are shaded in brighter gray, and the colors range from green (precipitation rate less than 1 mm/day) through yellow and orange (1-16 mm/day) to red (16-64 mm/day) and magenta (> 64 mm/day). The animation begins zoomed in over India and the Bay of Bengal, showing that tropical cyclone Aila, which in reality made landfall near Calcutta, killing dozens of Indian and Bangladeshi citizens and displacing over 100,000 people from their homes, was accurately predicted in the simulation.
Video and caption courtesy NICS

Few areas of science are currently hotter than clima
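The banded color scale described in the animation caption above amounts to a simple threshold map. The helper below is an illustrative sketch of that mapping, not the actual NICAM visualization code:

```python
def precip_color(mm_per_day):
    """Map a precipitation rate (mm/day) to the color bands described
    in the NICAM animation caption: green below 1 mm/day, yellow/orange
    for 1-16, red for 16-64, magenta above 64."""
    if mm_per_day < 1:
        return "green"
    if mm_per_day < 16:
        return "yellow-orange"
    if mm_per_day <= 64:
        return "red"
    return "magenta"
```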

October 13, 2010

Feature - Astronomical computing
Supporting the Large Synoptic Survey Telescope means thinking big

A visualization of the LSST.
Image credit: Todd Mason, Mason Productions Inc./LSST Corporation

The Large Synoptic Survey Telescope, to be constructed in Chile, will incorporate the world's largest digital camera, capable of recording highly detailed data more quickly than any other telescope of comparable resolution.
For the scientists working on the project, that all amounts to an exciting opportunity to learn more about moving objects (including monitoring asteroids near the Earth), transients such as the brief conflagrations of supernovae, dark energy, and the structure of the galaxy.
For computing specialists, it means more data. A lot more data.
The LSST will take between 1000 and 2000 panoramic 3.2 gigapixel images per night, covering its hemisphere of the sky twice weekly. Along with daytime calibration images, this will amount to 20 terabytes of data stored every 2
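The storage figure can be sanity-checked with a quick back-of-envelope calculation. The 2 bytes/pixel raw depth and the 1,500-image midpoint below are assumptions for illustration; the article itself states only 1,000-2,000 images of 3.2 gigapixels each:

```python
GIGAPIXELS = 3.2e9           # pixels per LSST image (from the article)
BYTES_PER_PIXEL = 2          # assumed 16-bit raw depth
IMAGES_PER_NIGHT = 1500      # assumed midpoint of the quoted 1,000-2,000

bytes_per_image = GIGAPIXELS * BYTES_PER_PIXEL   # 6.4e9 bytes per image
raw_tb_per_night = bytes_per_image * IMAGES_PER_NIGHT / 1e12
print(f"~{raw_tb_per_night:.1f} TB of raw pixels per night")  # ~9.6 TB
```

Roughly 10 TB of raw pixels per night is consistent with the ~20 TB stored figure once calibration images and processed data products are added on top.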

October 13, 2010

Feature - Computational chemistry suite goes open source

An image generated using NWChem.
Image courtesy Eric Bylaska, EMSL.

A widely-used suite of computational chemistry software, NWChem, was released as open source on 29 September 2010.
The suite, which has been used on conventional workstation clusters and high-performance supercomputers alike, has already been downloaded over 500 times.
Currently, the NWChem core development team is working on porting the code to work with GPUs and in cloud computing environments.
“It’s a very comprehensive software suite and covers almost all the theoretical approaches currently being used by computational chemists and materials scientists who use first-principles quantum mechanical calculations in their research,” said Niri Govind, a member of the NWChem core development team based at the Environmental Molecular Sciences Laboratory at Pacific Northwest National Laboratory.
NWChem first came on the scene in the mid-199

October 13, 2010

Image of the week - A better supernova model

This image shows a 3D time series of the development and expansion of the supernova shock. Time is increasing as you move from left to right. The purple surface is an isocontour of entropy while the blue/green surface is an isocontour of density.
Image by Jason Nordhaus and Adam Burrows, Princeton University. Image and caption courtesy of NERSC.

When large stars burn out and collapse, they explode, creating a supernova. But when scientists attempted to simulate this process, they got a “fizzle” instead of a “bang.” Until now, scientists simply assumed that there was something fundamental about the physics of supernovae that we didn't understand.
Now scientists may have cracked the problem by using a new approach to create computer simulations of supernovae.
“The new simulations are based on the idea that the collapsing star itself is not sphere-like, but distinctly asymmetrical and affected by a host of inst