

December 15, 2010

A new grid application may help biologists solve the structures of mystery proteins.

December 15, 2010

Imagine living next to a busy highway operating 24 hours a day for 365 days per year. That’s what life is like for ocean animals living next to busy shipping lanes.

December 8, 2010

Commentators suggest that Wikipedia and other collaborative network-driven projects, such as the Linux computer operating system, could be an emerging socioeconomic paradigm no less radical and disruptive than the Industrial Revolution.

Wikipedia is meant to be a new way of doing things in a world of ubiquitous electronic information and social networks, one that may be changing the conduct of everything from scientific research to political campaigns.

Sociological commentary and predictions aside, however, do Wikipedia and other “crowd-sourced” efforts really function so differently? Purdue communications researcher Sorin Adam Matei and his team are testing the concept by analyzing Wikipedia articles and revisions produced between 2001 and 2008 – a computationally demanding task. They’re finding that Wikipedia may have more in common with an old-fashioned factory floor than we think.

In theory, the collaborative online encyclopedia’s entries are created, edited and maintained with little in the way of traditional, hierarchical organizational and leadership structures. The production of Wikipedia has been characterized as an emergent system, like an ant colony, resulting from collective actions of individual participants with the “wisdom of the crowd” yielding a viable outcome.

December 8, 2010

iSGTW reader Harvey Newman gives his perspective on SC10.

December 1, 2010

Read about how the EpiCollect application can help field researchers gather data.

December 1, 2010

New computing tools could save graduate students from thousands of hours spent visually inspecting historical maps, quilts, medieval art, and manuscripts.

Humanities researchers rely on laborious visual inspection of historical materials to determine authorship and artistic lineage for manuscripts, maps, and quilts. A typical research project can take several graduate students two or more years.

“We are interested in understanding how to support these domain scientists so they are more efficient,” said Peter Bajcsy, principal investigator for the Digging into Image Data to Answer Authorship-Related Questions (DID-ARQ) project at the National Center for Supercomputing Applications. He added, “We are giving them tools that will automate the visual inspection process.”

Those tools are the product of extensive collaboration between humanities researchers and Bajcsy’s team. To create them, the computer experts engaged in lengthy discussions to determine what questions the humanities researchers seek to answer, and exactly how they would answer them using traditional methods. Their inquiries sought out details such as which image features researchers use – consciously or unconsciously – to identify the author or artist. Then the computer scientists developed algorithms that could rapidly identify those features automatically.
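The general pipeline the article describes, extracting the visual features experts rely on and then using them to attribute authorship, can be sketched as follows. This is purely illustrative: the features (an intensity histogram plus a crude edge-density measure) and the nearest-neighbor matching are assumptions for the sketch, not DID-ARQ's actual algorithms.

```python
import numpy as np

def extract_features(image: np.ndarray) -> np.ndarray:
    """Illustrative features: a 16-bin intensity histogram plus a crude edge density."""
    hist, _ = np.histogram(image, bins=16, range=(0, 256), density=True)
    gy, gx = np.gradient(image.astype(float))
    # Fraction of pixels with a strong local gradient, a rough proxy for stroke texture.
    edge_density = np.mean(np.hypot(gx, gy) > 10.0)
    return np.append(hist, edge_density)

def nearest_author(query: np.ndarray, references: dict) -> str:
    """Attribute the query image to the author whose reference image is closest in feature space."""
    qf = extract_features(query)
    return min(references,
               key=lambda a: np.linalg.norm(qf - extract_features(references[a])))
```

In practice a project like this would use many labeled reference images per artist and far richer, domain-specific features, but the structure (feature extraction followed by automated comparison) is the same.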

December 1, 2010

At a recent workshop, physicists from several LHC experiments compared results.

November 17, 2010

Feature - Life at the extreme at the Pierre Auger Observatory

The Pierre Auger Observatory has a detection area of 3,000 km², so large that it is best seen by airplane. A space-based successor with a detection area hundreds of times greater is already being planned: JEM-EUSO, to be attached to the International Space Station in 2013, will use large volumes of the Earth's atmosphere to detect and observe particles colliding with the planet's magnetic field. All images courtesy Pierre Auger Observatory

Some people enjoy living life at the edge, such as participants in extreme sports. At the other extreme are those who relish watching rare events. Among the latter are astronomers at the Pierre Auger Observatory, a multi-national collaboration to detect the 'light-signature' given off as cosmic rays hit particles in our atmosphere. Based in Argentina, the observatory monitors ultra-high-energy cosmic rays — spectacular examples of some of nature

November 17, 2010

Feature - The 1970s in the 21st century: synthesized music returns (via parallel processing)

This Arp 2500 analog modular synthesizer from 1971 had hundreds of inputs and outputs for control of the synthesis processes. Image courtesy of

Curtis Roads is a professor, vice chair, and graduate advisor in media arts and technology, with a joint appointment in music at the University of California at Santa Barbara. He also was the editor of the Computer Music Journal (published by MIT Press), and co-founded the International Computer Music Association. He is often a featured speaker at conferences such as Supercomputing. 
Music is an interactive, concurrent process. A note or a chord sounds, then is replaced, gradually or sharply, softly or powerfully, by the next one. For electronically produced or enhanced music, real-time technical advances are critical to continued progress and exploration. In the 1970s, I fondly remember learning my first parallel

November 10, 2010

Feature - Can a digital earth save the planet?

Ash spewing from Iceland’s volcano, “Eyjafjallajoekull,” in 2010, in an image from the European Space Agency’s Envisat satellite. Image courtesy ESA (European Space Agency).

With climate change hot on the agenda, activists, scientists and politicians are looking into what can be done to present a united front against this global issue.
One such project, presented at the 8th e-Infrastructure Concertation Meeting, a networking event organized by the European Commission (EC) and held at CERN last Thursday and Friday, aims to consolidate the various Earth sciences. Its work could reduce the loss of life and property due to natural disasters, and help us better understand how our planet’s climate is changing.
Ground European Network for Earth Science Interoperations - Digital Earth Community (GENESI-DEC) is focused on providing a virtual resource for scientist

November 10, 2010

Feature - Reaching for sky computing

Photo copyright Tom Raven, CC 2.0

Sometimes, a single cloud isn’t enough. Sometimes, you need the whole sky.
That’s why a number of researchers are developing tools to federate clouds, an architectural concept dubbed “sky computing” in a paper published in the September/October 2009 issue of IEEE Internet Computing.
“Sky computing is a tongue-in-cheek term for what happens when you have multiple clouds,” explained Kate Keahey of the University of Chicago, who co-authored the paper alongside University of Florida researchers Mauricio Tsugawa, Andréa Matsunaga, and José Fortes.
“[In the paper] we talked about standards and cloud markets and various mechanics that might lead to sky computing over multiple clouds, and then that idea was picked up by many projects,” Keahey added.
Among those inspired by the concept was Pierre Riteau, a doctoral student in the computer science departmen

November 10, 2010

Feature - Scientific computing rock stars unveiled

We asked you what makes someone a rock star of scientific computing, and you answered. Click on the image for a larger version.

When last we polled our readers, we asked who you think is a rock star of scientific computing. There were many names nominated, including Robert Grossman, director of the National Center for Data Mining, and Malcolm Atkinson, director of the e-Science Institute and the National e-Science Centre in the United Kingdom. Not all of our nominees were available to comment. Nonetheless, we did get three fantastic responses to our wacky rock star questionnaire. Read on to find out where fame and computing meet!

Ian Foster
Director of the Computation Institute at Argonne National Laboratory

Q: Let's start with the shameless plug part: What are you working on right now, why should your average user or developer care, and why is it super cool and challenging?

A: Let

November 3, 2010


Project Profile - From grids to clouds and beyond: GRNET supports Greek researchers

The Acropolis from Philipapou Hill at sunset, Image courtesy Tim Rogers, stock.xchng

All Greek universities get their internet from one source: GRNET (Greek Research and Education Network), a company supported by the Greek state, which connects them both to each other and to the larger pan-European academic network, GÉANT.
GRNET’s mission is to get universities online, to provide computing power and storage, and to develop services for researchers, not least by providing technical know-how and supporting schools and universities in Greece. “GRNET is actually a human network — this is the most important thing about it,” says Kostas Koumantaros, a member of GRNET in Athens. “We transfer know-how between universities throughout Greece. It is a good vehicle both to promote research in Greece and for us to learn from our international collabora

November 3, 2010

Feature - LHC open to all

An actual recorded event from the Compact Muon Solenoid experiment—this event shows radiation and charged particles spilling into the detector from the beam colliding with material in the beam pipe.
Image courtesy Carl Lundstedt

Occasionally, iSGTW runs across stories in other publications related to the fields we cover. Below is an excerpt from Linux Journal, containing one person’s view of the whole process.
One of the items at the heart of the Large Hadron Collider (LHC) experiments is open-source software. The following will provide a glimpse into how scientific computing embraces open-source software and its open-source philosophy.
The LHC at CERN near Geneva, Switzerland, is nearly 100 meters underground and produces the highest-energy subatomic particle beams on Earth. The Compact Muon Solenoid experiment is one of the many collider experiments within the LHC. One of its goals is to give physicists a window into the universe fractions

November 3, 2010

Feature - Ultra-fast networks: The Final Frontier

A network researcher in awe of the billions of dark matter particles simulated on 15 ultra-high definition monitors. Image courtesy Freek Dijkstra

Researchers from Holland have demonstrated a network infrastructure that could potentially help scientists save time and even transform the movie business. This could be done without the need for large computer clusters or grids, just off-the-shelf hardware components combined with human ingenuity and one of the world’s fastest research networks. The team were from SARA, a Dutch supercomputing and e-science support center.

Threshold

The SARA researchers wanted to show the practicalities of streaming video between two institutions (from SARA, Amsterdam to CERN, Geneva) at 40 Gb/second (5 GB/s). This link, if successful, would be 16 times faster than the TEIN3 network, which streamed Malaysian dancers over 9,000 kilometers away to a live orchestra performance in Stockholm at 2.5 Gb/s. Th
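The throughput figures quoted above are internally consistent, as a quick back-of-the-envelope check shows (variable names are ours, not from the article):

```python
# Sanity-check the throughput figures quoted in the article.
sara_link_gbit_s = 40.0   # demonstrated SARA (Amsterdam) -> CERN (Geneva) stream
tein3_gbit_s = 2.5        # earlier TEIN3 Stockholm <-> Kuala Lumpur stream

gbyte_s = sara_link_gbit_s / 8                   # 8 bits per byte -> 5.0 GB/s
speedup = sara_link_gbit_s / tein3_gbit_s        # -> 16.0x faster

print(f"{gbyte_s} GB/s, {speedup}x faster than TEIN3")
```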

October 27, 2010

DEISA and TeraGrid host joint EU/US Summer School in Italy

Attendees outside the  Santa Tecla Palace on Sicily’s southeastern shore. Image courtesy Summer School

The Santa Tecla Palace on Sicily’s southeastern shore was recently a classroom for a summer school dedicated to fostering collaboration and innovation in computational science among graduate and postdoctoral scholars from Europe and the United States.
A joint effort of the EU’s DEISA and America’s TeraGrid, it provided a multicultural student community the opportunity to learn about high performance computing (HPC) resources, tools and methods.
“We hope to continue with such events every year — alternating between EU and US destinations,” said Hermann Lederer, who presented a DEISA infrastructure and service overview.
Sixty graduate and postdoctoral scholars from 20 nations were selected from more than 100 applications. Participant expenses were paid by DEISA and TeraGrid.

October 27, 2010

Feature - Rethinking scientific data management

The reading room of the George Peabody Library at Johns Hopkins University.
Image courtesy Leafar / Raphael Labbe, CC 2.0.

Despite all the good that science has wrought over the years, the way we manage scientific data is fundamentally flawed.
Sir Isaac Newton once said, “If I have seen further it is only by standing on the shoulders of giants.” Scientists stand on the shoulders of their peers and predecessors via peer-reviewed literature. The idea is that the literature is reliable by dint of being peer-reviewed, and thus researchers can safely build upon what they learn from it. Yet neither the reviewers who admitted those papers into the annals of scientific canon nor the scientists who wish to build upon them have access to the data used to produce those papers.
That means that they cannot ensure that they stand on solid ground by examining the data and doing their own analysis. They cannot analyze the data using altern

October 27, 2010

Feature - Theorists find dark matter evidence in open data

A visualization of the Fermi Gamma-ray Space Telescope.
Image courtesy of NASA and General Dynamics.

Dan Hooper and Lisa Goodenough are not part of the Fermi Gamma-ray Space Telescope collaboration. But by using FGST’s publicly released data, they were able to find clues to some of the universe’s juiciest secrets at the center of the Milky Way.
In their analysis, Hooper, a Fermilab theorist, and Goodenough, a graduate student at New York University, report that very-high-energy gamma rays coming from the center of the Milky Way originate from dark-matter collisions.
“We went out of our way to consider all causes of backgrounds that mimic the signal, and we found no other plausible astrophysics sources or mechanics that can produce a signal like this,” Hooper said.
A recent paper, published on the pre-print server arXiv, outlines their findings.
Astrophysicists have long postulated a wide range of

October 20, 2010

Feature - Climate model tackles clouds


Animation from the NICAM model simulation of 21 May - 31 August 2009, showing cloudiness (based on outgoing long-wave radiation) in shades of gray and precipitation rate in rainbow colors, based on hourly data from the simulation. Thicker clouds are shaded in brighter gray, and the colors range from green (precipitation rate less than 1 mm/day), to yellow and orange (1 - 16 mm/day), to red (16 - 64 mm/day) and magenta (> 64 mm/day). The animation begins zoomed in over India and the Bay of Bengal, showing that tropical cyclone Aila, which in reality made landfall near Calcutta, killing dozens of Indian and Bangladeshi citizens and displacing over 100,000 people from their homes, was predicted very accurately in the simulation.
Video and caption courtesy NICS
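The caption's precipitation color scale is a simple binning of rain rate. A minimal sketch of that mapping (the function name is ours, purely illustrative):

```python
def precip_color(rate_mm_per_day: float) -> str:
    """Map a precipitation rate (mm/day) to the color bands described in the caption."""
    if rate_mm_per_day < 1:
        return "green"           # light precipitation, < 1 mm/day
    if rate_mm_per_day < 16:
        return "yellow-orange"   # 1 - 16 mm/day
    if rate_mm_per_day <= 64:
        return "red"             # 16 - 64 mm/day
    return "magenta"             # > 64 mm/day
```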

Few areas of science are currently hotter than clima

October 20, 2010

Feature - New physics in space

A C5 Supergalaxy, one of the world’s largest planes, loading the AMS-02 experiment at Geneva Airport. Image courtesy CERN Bulletin

New life was breathed into the International Space Station (ISS) this year after NASA announced it will extend the ISS from 2015 to at least 2020. The new deadline extends opportunities for science experimentation in the largest space research laboratory ever constructed. One of these experiments is the Alpha Magnetic Spectrometer (AMS-02), a detector that may help scientists understand why our universe exists and why there is more matter than anti-matter. Most space-grade electronics are about ten years old, so the AMS-02 represents the newest and most advanced physics experiment in outer space to date. Currently, it is being tested and is due to launch in February 2011. AMS-02 was shipped via Geneva airport to NASA this August in one of the largest planes in the world, a US Air Force C5 Super Galaxy. Once aboard the ISS, A

October 20, 2010

Profile – Domenico Vicinanza, master of fusion

Musicians play ancient instruments live in Stockholm while dancers in Kuala Lumpur, about 10,000 kilometers away, simultaneously perform on the display above the stage. (Click on the image above to see video of the entire performance.) All images courtesy Domenico Vicinanza

Domenico Vicinanza combines the worlds of science and music by using his talents as an engineer and a musician to bring ancient musical instruments back to life. In December 2009 Vicinanza and the 'Lost Sounds Orchestra' gave a unique performance. While he played ancient Greek music live in Stockholm on a virtual instrument, an ultra-fast, high-quality video feed of dancers from Kuala Lumpur was displayed, simultaneously bringing two distant cultures and locations into one place. iSGTW caught up with Vicinanza for an interview.

iSGTW: What's your job?

Vicinanza: At DANTE I support international projects that use the GÉANT network, the pa

October 13, 2010

Feature - Astronomical computing
Supporting the Large Synoptic Survey Telescope means thinking big

A visualization of the LSST.
Image credit: Todd Mason, Mason Productions Inc./LSST Corporation

The Large Synoptic Survey Telescope to be constructed in Chile will incorporate the world’s largest digital camera, capable of recording highly detailed data more quickly than any other telescope of comparable resolution.
For the scientists working on the project, that all amounts to an exciting opportunity to learn more about moving objects (including monitoring asteroids near the Earth), transients such as the brief conflagrations of supernovae, dark energy, and the structure of the galaxy.
For computing specialists, it means more data. A lot more data.
The LSST will take between 1000 and 2000 panoramic 3.2 gigapixel images per night, covering its hemisphere of the sky twice weekly. Along with daytime calibration images, this will amount to 20 terabytes of data stored every 2
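The quoted image counts give a feel for the raw data volume. A quick estimate under stated assumptions (the bytes-per-pixel figure below is ours, not from the article, and real nightly volumes also include calibration images and derived data products):

```python
# Rough raw-pixel volume per night from the figures quoted in the article.
# Bytes-per-pixel is an assumption (16-bit samples), not a value from the text.
pixels_per_image = 3.2e9       # 3.2 gigapixels per panoramic image
bytes_per_pixel = 2            # assumed 16-bit raw samples

for images_per_night in (1000, 2000):
    tb = images_per_night * pixels_per_image * bytes_per_pixel / 1e12
    print(f"{images_per_night} images/night -> {tb:.1f} TB of raw pixels")
```

Even the low end of the range lands in the multi-terabyte-per-night regime, consistent with the article's stored-data figure.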

October 13, 2010

Feature - Computational chemistry suite goes open source

An image generated using NWChem.
Image courtesy Eric Bylaska, EMSL.

A widely-used suite of computational chemistry software, NWChem, was released as open source on 29 September 2010.
The suite, which has been used on everything from conventional workstations and clusters to high-performance supercomputers, has already been downloaded over 500 times.
Currently, the NWChem core development team is working on porting the code to work with GPUs and in cloud computing environments.
“It’s a very comprehensive software suite and covers almost all the theoretical approaches currently being used by computational chemists and materials scientists who use first-principles quantum mechanical calculations in their research,” said Niri Govind, a member of the NWChem core development team based at the Environmental Molecular Sciences Laboratory at Pacific Northwest National Laboratory.
NWChem first came on the scene in the mid-199

October 13, 2010

Feature - Data is big news

Members of the expert group, as well as commission officers; John Wood is standing in the center holding the report.
“Atomium” sculpture, from the 1958 World's Fair in Brussels. Images courtesy e-ScienceTalk.

What will data infrastructures look like 20 years from now? To find out, the European Commission (EC) assembled a panel of experts to prepare a vision of scientific data e-infrastructures in 2030. The resulting report, overseen by chair John Wood of Imperial College London, was released last week in Brussels.
“Data is big news,” said Wood. In the past, data might have been thrown away, but it is now being kept and recorded as emails, videos, mobile phone data and more. CERN produces petabytes of data each year, but with genome sequencing, electronic health records and upcoming experiments such as the Square Kilometer Array, we’re on schedule to generate hundreds of times more. In the words of Wood:

October 6, 2010

Feature - A lasting ocean observatory

A map indicates the location of the four major ocean arrays, as well as the two minor ones. Click for a larger version. Image courtesy of OOI - CEV at University of Washington.

Agile architecture is essential if a large-scale infrastructure like the Ocean Observatories Initiative is to last three decades, as mandated.
“The Ocean Observatory has been in planning for fifteen years and more,” said Matthew Arrott, OOI’s project manager for cyberinfrastructure. “It is our anticipation, over a 30 year lifespan, that we need to account for user needs and the technology that we are using all changing.”
That’s why they’ve focused their attention on creating an infrastructure that can interface with a wide variety of software packages and computational resource providers.
“The observatory supports a broad range of analysis with the expectation that the majority of the analysis capability will be provided a