
June 2, 2010

Feature - EMI, home to middleware

Photo courtesy EMI

Who will look after distributed computing middleware in the European Grid Infrastructure-era?
A new project is poised to become the answer to that question.
The goal of the European Middleware Initiative (EMI) is to pull together the best middleware experts in Europe. Working together, they will improve and standardize the dominant existing services to produce simplified and interoperable middleware.
Specifically, EMI will work with experts from ARC (a product of NorduGrid), gLite (EGEE’s product), UNICORE and dCache.
EMI will be the major software provider for EGI, empowering the EGI infrastructure with more stable, usable and manageable software.
“The innovation in EMI is that for the first time the major middleware producers in Europe are working together in the same project to produce the actual middleware used by the major infrastructures, like EGI and PRACE,” says Alberto Di Meglio, EMI p

June 2, 2010

Profile: EGI’s director, Steven Newhouse

Steven Newhouse, the new director of the organization that will coordinate Europe’s grid infrastructure, at the gala EGEE User Forum dinner in Uppsala, Sweden. Image courtesy Corentin Chevalier, GridTalk

Steven Newhouse was recently appointed director of EGI.eu, a new, long-term organization tasked with coordinating the European Grid Infrastructure. Headquartered in Amsterdam, it will support Europe’s National Grid Initiatives as they operate the infrastructure which was built by the projects DataGrid, EGEE-I, -II and -III. He will leave his post at CERN as EGEE technical director to head EGI.eu (and the EGI-InSPIRE project). Former iSGTW editor Danielle Venton sat down with him to ask a few questions.
iSGTW: How would you describe EGI.eu?
Newhouse: EGI.eu is a group of people dedicated to working with colleagues in National Grid Infrastructures across Europe, and making those resources available to a diverse set of user communities across t

May 26, 2010

Feature - ARGUS keeps a sure watch always

Upon the death of Argus, Hera honored her faithful watchman by gathering his many eyes and placing them on the tail of a peacock. Image courtesy Gari.Baldi, under Creative Commons license.

In classical Greek mythology, a multi-eyed, insomniac giant named “Argus” was employed by Hera to keep an eye on the doings of her husband Zeus — mightiest of gods.
Argus was extremely vigilant; an ancient poet wrote that “. . . sleep never fell upon his eyes; but he kept sure watch always.”
So it was appropriate that “ARGUS” became the name of a newly created authorization service to observe and protect Europe’s grid infrastructure. Overseen by the European Grid Infrastructure (which is in turn coordinated by EGI.eu), ARGUS is designed to be a secure and efficient means of offering a single authorization and authentication point for multiple services.
ARGUS works in a series of s
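The core idea of a single authorization point can be sketched in a few lines: many services defer their permit/deny decisions to one shared policy service instead of each keeping its own rules. The Python below is a conceptual illustration only, with hypothetical names and a deliberately simplistic rule list; it is not the ARGUS API or policy language.

```python
# Conceptual sketch of a single authorization point shared by several services.
# Illustrative only -- NOT the ARGUS interface; names and rules are hypothetical.

class AuthorizationService:
    """Central decision point consulted by every grid service."""

    def __init__(self):
        # (subject, action, resource) -> decision; a stand-in for a real policy language
        self.policies = []

    def add_policy(self, subject, action, resource, permit):
        self.policies.append((subject, action, resource, permit))

    def decide(self, subject, action, resource):
        """Return True (permit) or False (deny); deny by default."""
        for s, a, r, permit in self.policies:
            if (s, a, r) == (subject, action, resource):
                return permit
        return False


# Several services query the same decision point instead of keeping their own rules.
authz = AuthorizationService()
authz.add_policy("alice", "submit-job", "compute-element", permit=True)
authz.add_policy("alice", "write", "storage-element", permit=False)

print(authz.decide("alice", "submit-job", "compute-element"))  # True
print(authz.decide("alice", "write", "storage-element"))       # False
print(authz.decide("bob", "submit-job", "compute-element"))    # False (no policy -> deny)
```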

May 26, 2010

Feature - New result could shed light on the existence of the Universe

The DZero collaboration has found evidence for a new effect which could explain the matter-antimatter asymmetry of nature. The size of this CP-violating effect is in disagreement with the predictions of the theoretical framework known as the Standard Model of particles and their interactions. The effect ultimately may help to explain why the universe is filled with matter while antimatter disappeared shortly after the Big Bang. Image courtesy of the DZero collaboration.

The Big Bang should have created a universe with equal amounts of matter and antimatter. Instead, we see antiparticles only when they are produced in nuclear reactions, cosmic rays, and particle colliders.
The dominance of matter that we observe in the universe is possible only if there is ‘CP violation’ – differences in the behavior of particles and antiparticles. Although physicists first observed certain forms of CP vio
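As a point of reference, a “difference in the behavior of particles and antiparticles” is typically quantified as an asymmetry between event counts; the expression below is a generic illustration of such a quantity, not the specific observable DZero measured.

```latex
% Generic illustration of a particle-antiparticle asymmetry (not DZero's exact observable).
A \;=\; \frac{N_{\mathrm{particle}} - N_{\mathrm{antiparticle}}}
             {N_{\mathrm{particle}} + N_{\mathrm{antiparticle}}},
\qquad
A = 0 \ \text{(CP symmetry)}, \qquad A \neq 0 \ \text{(CP violation)}.
```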

May 26, 2010

Feature - Wireless grids: Squeezing a grid onto a widget

This diagram shows the layers from which WiGiT is composed.
Image courtesy WiGiT.

As wireless devices become increasingly common, and common devices become increasingly “smart,” wireless grids become increasingly practical. That means that the timing is perfect for WiGiT, a wireless grid testbed which will begin testing its alpha software in June.
The purpose of WiGiT (Wireless Grids Innovation Testbed), according to Syracuse University project leader Lee McKnight, is to refine open specifications for a wireless grid standard, and create a stable platform for experimentation.
“With WiGiT we expect to be able to do these large scale experiments from campus to campus, and we can run little experiments on that,” McKnight said. “Open specifications will make it easier for others to latch on.”
WiGiT, a National Science Foundation-Partners for Innovation program-funded collaboration between

May 19, 2010

Magellan explores scientific clouds – scientifically

The Magellan management and network control racks at NERSC. To test cloud computing for scientific capability, NERSC and the ALCF installed purpose-built test beds for running scientific applications on the IBM iDataPlex cluster. Magellan systems at both NERSC and the ALCF will be built using QDR InfiniBand fabric. Image courtesy R. Kaltschmidt, LBNL

It’s clear that cloud computing is today’s hot computing buzzword. All the cool kids are doing it, and scientists are no exception. But how much do we really know about the intersection between science and cloud?
“Cloud computing has become a very exciting new field with several companies making offerings that are already being used by scientists around the world,” said Pete Beckman, director of Argonne Leadership Computing Facility and leader of the ALCF Magellan team. “The question the Department of Energy has is pretty straightforward: what kind

May 19, 2010

Feature - Secure encryption, with Hydra

The Hydra was so poisonous that even its tracks were deadly; it was eventually killed by Hercules as one of his Twelve Labors. Image courtesy Wikimedia Commons

Greek mythology tells of the Lernaean Hydra, a nine-headed serpent with poisonous breath and a unique defensive system: every time you cut off one head, two more grew in its place.
According to legend, the Hydra acted as an effective guardian; if it were real, you might want one to guard your most important belongings.
In the world of grid computing, the Hydra data encryption system does just that: it is designed to help protect sensitive data while being stored and transferred.
It may be especially useful in the worlds of medicine and finance, where data is often highly sensitive and security is paramount. (In contrast, while the high energy physics community produces and processes huge volumes of data, the information itself is not financially valuable or personally sensitive.)
Grids pro
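As a rough, generic illustration of protecting data at rest and in transit, the Python sketch below encrypts a payload and then splits the encryption key into several shares, so that no single place holds the whole secret (a fitting image for a many-headed guardian). The toy cipher and XOR-based key splitting here are for illustration only; they are not Hydra’s actual scheme or API.

```python
# Illustrative only: encrypt data, then split the key so no single server holds it all.
# This toy XOR-based sketch is NOT Hydra's actual algorithm or API.
import os

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(data: bytes, key: bytes) -> bytes:
    # Toy stream cipher that repeats the key (do not use for real security).
    keystream = (key * (len(data) // len(key) + 1))[:len(data)]
    return xor_bytes(data, keystream)

def split_key(key: bytes, n_shares: int):
    """Split a key into n XOR shares; all shares are needed to rebuild it."""
    shares = [os.urandom(len(key)) for _ in range(n_shares - 1)]
    last = key
    for s in shares:
        last = xor_bytes(last, s)
    return shares + [last]

def join_key(shares):
    key = shares[0]
    for s in shares[1:]:
        key = xor_bytes(key, s)
    return key

key = os.urandom(16)
ciphertext = encrypt(b"sensitive medical record", key)

# Store each share on a different "key server"; the data stays encrypted in storage and transit.
shares = split_key(key, 3)
recovered = join_key(shares)
assert recovered == key
assert encrypt(ciphertext, recovered) == b"sensitive medical record"
```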

May 19, 2010

Feature - Technology roundup: Science gateways and portals may level the playing field

Baseball diamond in a small town at night. Image courtesy Joe Y Jiang, Creative Commons

Our guest writer is Elizabeth Leake of TeraGrid — the high-performance, distributed computing network in the US. At the recent EGEE User Forum in Sweden, attendees had a chance to learn about two general-purpose portal engines and two domain-oriented portals. What these technologies have in common is the ability to act as “science gateways,” ultimately allowing new communities of researchers better access to advanced computing, thus leveling the playing field — a key part of “eScience.” Here is a round-up of some of the features of these gateways, and who is developing them:

VCR
Milan Prica from Sincrotrone Trieste, an independent laboratory in Italy, presented an advanced web portal with virtual collaboration features. The latest version of Virtual Control Room (VCR) inc

May 12, 2010

Feature - Collaboration matchmaking in VIVO

An example of the sort of social network diagram that could be generated using VIVO data. This one maps co-authored papers. The line color denotes the first year that the connected researchers co-authored papers. The node color shows the number of times that author has been cited, and the node size indicates the number of papers that researcher has authored. Image courtesy of the project briefing given by Dean Krafft and Valerie Davis at the Spring 2010 Membership meeting of the Coalition for Networked Information.

It’s tough to find a good match, but that doesn’t stop some from logging onto social networking sites in search of one. We all want to find that special someone who complements our strengths and shares our interests. That’s why Michael Conlon hopes scientists around the world will someda
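The diagram described above is, underneath, a graph whose nodes and edges carry attributes (papers authored, citations, first year of co-authorship). The Python below is a minimal sketch of such a structure with hypothetical names and numbers; it is not VIVO’s actual data model.

```python
# Minimal co-authorship graph carrying the attributes described in the caption.
# Hypothetical names and numbers; this is NOT VIVO's actual data model.

# Node attributes: papers authored (node size), times cited (node color).
researchers = {
    "Alice": {"papers": 42, "citations": 310},
    "Bob":   {"papers": 17, "citations": 95},
    "Chen":  {"papers": 28, "citations": 180},
}

# Edge attribute: first year the two researchers co-authored a paper (line color).
coauthorships = {
    ("Alice", "Bob"):  {"first_year": 2004},
    ("Alice", "Chen"): {"first_year": 2008},
}

def collaborators(name):
    """List everyone who has co-authored with the given researcher."""
    return [b if a == name else a
            for (a, b) in coauthorships
            if name in (a, b)]

print(collaborators("Alice"))  # ['Bob', 'Chen']
```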

May 12, 2010

Feature - The security-accessibility tug-o-war

Rajasthani women take part in a tug-of-war game at the Pushkar fair, in India’s desert state of Rajasthan.
Image courtesy of Sumith Meher, CC BY-SA 2.0.

In the tug-o-war between security and ease of use, priorities can vary widely. But if there is a sweet spot, Mine Altunay is going to find it.
“We’re trying to understand how we can provide end-to-end infrastructure that is secure enough but easy enough to use,” said Altunay, who is Open Science Grid’s security officer.
Altunay began the process by running a joint OSG-ESnet workshop on identity management last November, where they sought input from users and a small number of resource providers.
“We wanted to touch bases with our user community and we wanted to understand how this process is working for the end user,” Altunay explained.
What they found is that the current process is too complicated and time consuming for end users. In order to sign int

May 12, 2010

SAFE-BioPharma - A new domain standard for secure identity

Mollie Shields-Uehling, pictured above, is the CEO of the SAFE-BioPharma Association. Image courtesy Mollie Shields-Uehling.

In medical and pharmaceutical research, researchers deal with sensitive private information on a daily basis. That makes secure identity management a crucial need. Mollie Shields-Uehling is the CEO of the SAFE-BioPharma Association, a non-profit organization charged with creating a standard that meets that need.
iSGTW: Thank you for joining us for this discussion, Mollie. Could you tell us a little bit about SAFE-BioPharma?
Shields-Uehling: SAFE-BioPharma Association is a non-profit industry collaboration established by the world's leading biopharmaceutical companies to develop and maintain a global interoperable digital identity and signature standard for the biopharmaceutical and healthcare communities. The purpose of the standard is to allow the transformation of business and regulatory process to

May 5, 2010

CLARIN: A project that speaks to you

Wee-Ta-Ra-Sha-Ro, Head Chief of the Wichita. Painted by George Catlin in 1834. Image courtesy Indigenouspeople.net

The creation story of the Wichita people tells of a creator, “Man-never-known-on-Earth,” who formed the world, land, water and the first man and woman: “Man-with-the-Power-to-Carry-Light” and “Bright-Shining-Woman.” This couple brought to the Earth light, corn-growing, deer-hunting, game-playing and prayer, before becoming the morning star and the moon.
While the story itself is preserved in literature for posterity (e.g., in George Dorsey’s 1904 book The Mythology of the Wichita), fewer than 10 people today can tell the story in the Wichita language, nearly all of whom are elders living on tribal lands in Oklahoma, USA. It’s a pattern repeated around the world; many languages are endangered or dying. Preserving these languages is vital for groups seeking to revitalize and maintain their

May 5, 2010

Feature - From EGEE to EGI: Plain talk with Bob Jones and Steven Newhouse

At the Uppsala Gala Dinner, Bob Jones of EGEE handed over to Steven Newhouse of EGI his most prized possession — a crown made from all the name tags he collected from conferences in the past six years. Image courtesy GridTalk

After six years, on 1 May, EGEE will hand over responsibility for the world’s largest grid infrastructure to a new organization dedicated to its coordination and development (EGI.eu), and its newly elected director, Steven Newhouse. During its lifetime, EGEE — Enabling Grids for E-SciencE — assembled a world-wide infrastructure of CPU cores, hosted by computing centers around the world. Each month, about 13 million jobs are executed on the EGEE Grid. This massive multi-disciplinary production infrastructure was led until now by Bob Jones who initially, like Steven, held the position of technical director at EGEE, and quickly advanced to project director. Du

May 5, 2010

Feature - Frontier guides computing through the collision landscape

Just like you might have trouble navigating using this antique map, detector experiments can’t make sense of their data using an out-of-date map of their detector. Image courtesy Boston Public Library’s Norman B. Leventhal Map Center under Creative Commons license

The colossal particle detectors that monitor collisions at the Tevatron in Illinois and the Large Hadron Collider in Switzerland are unique beasts.
Scientists design most of the parts inside them to meet an individual set of specifications. But every once in a while, they find something the detectors can share.
Scientists at the CMS and ATLAS experiments at CERN are using a software system that Fermilab’s Computing Division originally designed for the CDF experiment at the Tevatron. The system, called Frontier, helps scientists distribute, at lightning speed, the information needed to interpret collision data. The system is based upon the wid
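In broad terms, Frontier works by caching the answers to repeated database lookups close to the jobs that need them, so the central conditions database is not hit once per job. The Python below illustrates that general read-through caching pattern only, with hypothetical class names; it is not the Frontier client, protocol, or API.

```python
# Generic read-through cache in front of a conditions database.
# Illustrative pattern only -- NOT the Frontier client, protocol, or API.

class ConditionsDB:
    """Stand-in for a central database of detector conditions (hypothetical data)."""
    def __init__(self):
        self.queries = 0

    def lookup(self, key):
        self.queries += 1            # each call here represents an expensive central query
        return f"calibration-for-{key}"

class CachingProxy:
    """Serves repeated identical lookups from a local cache instead of the central DB."""
    def __init__(self, backend):
        self.backend = backend
        self.cache = {}

    def lookup(self, key):
        if key not in self.cache:
            self.cache[key] = self.backend.lookup(key)
        return self.cache[key]

db = ConditionsDB()
proxy = CachingProxy(db)

# Thousands of analysis jobs ask for the same run's conditions...
for _ in range(10000):
    proxy.lookup("run-42")

print(db.queries)  # 1 -- the central database is queried only once
```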

April 28, 2010

Back to Basics – Data management

by TRACEY WILSON
Tracey Wilson is a program manager for Avetec’s HPC Research Division, DICE, the Data Intensive Computing Environment. DICE is a non-profit program that serves HPC and IT data centers in commerce, government and academia by providing independent third-party evaluations of data management practices on its geographically distributed test bed.

From smartphones to weather forecasts, data drives our world.
In the simplest terms, data is information, which can be found in many forms. Back when computer scientists used punch cards to store data, keeping them in order was a way of controlling or managing data.
We used to talk about how incredibly large a gigabyte of data was. Now terabytes and petabytes are the norm in scientific circles, and we will soon be talking about exabytes just as casually. And as the amount of data we generate grows, so does our need for data management.
Data management — the effective control

April 28, 2010

Q&A: Peer-reviewed physics, at the speed of light

Sergio Bertolucci during an interview with a member of the Swedish press. Image courtesy Corentin Chevalier, GridTalk

Sergio Bertolucci is the director for research and computing at CERN. Over the noise of nearby cathedral bells chiming the hour, iSGTW caught up with him on the steps of the University Building in Uppsala during a coffee break at the EGEE User Forum. We asked him about the spate of new papers coming out from the LHC, and what it all means for science.

iSGTW: We have heard that a lot of papers have already been published in the time since the start-up of the LHC. Is that right?

Bertolucci: Four papers on high-energy physics have already been published, and 15 are in preparation as of today, April 14, all based on the collisions that just happened. One week after the first collisions, the first papers were published electronically. And these were all peer-reviewed.
 
iSGTW: That’s very fast, compared to the some

April 21, 2010

Feature - Stem cell research goes Boolean with BooleanNet

Image courtesy of Rodolfo Clix.

To make use of the human genome in our quest to understand genetic disorders, we need to learn more about what each gene accomplishes. Unfortunately, connecting a specific gene to the formation of a specific cell can take years of hard work and thousands of dollars.
An algorithm that could cut that time down from years to hours has passed its first litmus test, however, according to a paper recently published in the Proceedings of the National Academy of Sciences.
The paper’s lead author, Debashis Sahoo, had his eureka moment during an immunology class. Sahoo, who was working on a doctorate in electrical engineering at the time, observed that although many biological relationships are asymmetrical, biologists tended nonetheless to look for symmetrical relationships. Sahoo and his advisors quickly realized that these asymmetrical relationships can be found using Boolean logic, such as if-
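The kind of asymmetric, if-then relationship at the heart of this approach can be illustrated in a few lines: from binarized expression data, check whether “gene A high implies gene B high” holds even when the converse does not. The Python below uses made-up data and a simplified test; it is not the BooleanNet algorithm.

```python
# Illustration of an asymmetric (one-way) Boolean relationship between two genes.
# Toy data and a simplified test -- NOT the actual BooleanNet algorithm.

# 1 = gene expressed "high" in a sample, 0 = "low" (already binarized)
gene_a = [1, 1, 0, 0, 1, 0, 0, 1]
gene_b = [1, 1, 1, 0, 1, 0, 1, 1]

def implies(x, y, tolerance=0.0):
    """Check the one-way rule 'x high => y high' across samples."""
    violations = sum(1 for xi, yi in zip(x, y) if xi == 1 and yi == 0)
    highs = sum(x)
    return highs > 0 and violations / highs <= tolerance

print(implies(gene_a, gene_b))  # True: whenever A is high, B is high
print(implies(gene_b, gene_a))  # False: B can be high while A is low -- the relation is asymmetric
```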

April 21, 2010

The rise of autonomic computing

The instrumented oil field, seen above, can be thought of as a “cyberecosystem,” in which feedback loops are used to optimize oil extraction. Image courtesy Manish Parashar

How do you maximize the amount of oil you can extract from an oil field? One way is to use the Instrumented Oil Field, or IOF, an application coordinated by the Center for Subsurface Modeling at the University of Texas and used by a consortium of academic and industry researchers. It uses a network of sensors embedded underground to monitor the state of the reservoir while the oil is being extracted.
The IOF calculates which deposits of oil can be safely and economically extracted, while classifying parts that cannot be reached as “bypassed oil.” However, if the model relies purely on fixed initial conditions, up to 60% of the oil can be inaccurately deemed unreachable.
One way around this problem is to use autonomic computing, whose goal is to build systems a
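The feedback loop underlying this approach can be sketched generically: rather than trusting fixed initial conditions, the model is repeatedly corrected with fresh sensor readings. The Python below is a deliberately simple illustration with made-up numbers and an arbitrary update rule; it is not the IOF software.

```python
# Generic sense -> analyze -> adjust feedback loop, as used in autonomic systems.
# Numbers and the update rule are made up for illustration; this is NOT the IOF code.

def read_sensors(step):
    """Stand-in for underground sensors reporting observed reservoir pressure."""
    true_pressure = 100.0 - 2.0 * step        # hypothetical declining pressure
    return true_pressure

def autonomic_loop(steps=5):
    model_pressure = 120.0                     # fixed initial condition: starts off wrong
    for step in range(steps):
        observed = read_sensors(step)          # sense
        error = observed - model_pressure      # analyze: compare model with reality
        model_pressure += 0.8 * error          # adjust: nudge the model toward observations
        print(f"step {step}: observed={observed:.1f} model={model_pressure:.1f}")

autonomic_loop()
```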

April 21, 2010

Opinion: Africa Grid?

Official ribbon-cutting for “Blue Gene for Africa” last year. This supercomputer is the fastest scientific computer on the African continent, capable of 11.5 teraflops (11.5 trillion floating point operations per second). Image courtesy Center for High-Performance Computing

At the EGEE User Forum in Uppsala, the author, Bruce Becker of the Meraka Institute and coordinator of the South African National Grid, called for making an AfricaGrid a reality. Here he outlines why now is the opportune time for this work to begin in earnest.
For some years now, many have been hinting at an “AfricaGrid.”
In the Mediterranean basin, we have seen many African countries participating directly in EUMedGrid (and more recently EUMedSupport).
In the southern region of Africa, we have seen much activity over the last couple of years that allows us to envisage at least a “Sub-Saharan Grid.”
This prospect is very appealing to the r

April 14, 2010

Feature - Mean Shift Smoothie interprets medical images 66 times faster

One of the very first X-rays, taken by Wilhelm Röntgen in 1896 and showing his wife’s hand. What is the dark blob on one finger?* Image courtesy Wikipedia
*Answer at very bottom of this article.

As any lay-person who has ever looked at an X-ray knows, it can be very difficult to tell what you are looking at, let alone differentiate what is healthy from what is diseased or damaged or otherwise not normal human tissue. (See image at right.)
Pity the poor medical expert, then, who must not only deal with two-dimensional images but also interpret information in three dimensions, as is the case with magnetic resonance imaging or computer-aided tomography, to name just two imaging modes. These multidimensional data bring additional information, but are also much more difficult to process and interpret.
But with advanced algorithms such as clustering, the really useful informatio
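Mean shift itself is a simple iterative idea: each data point is repeatedly moved to the kernel-weighted average of its neighbors until it settles on a local density peak, and points that settle on the same peak form one cluster or smoothed region. The one-dimensional Python sketch below shows only that iteration; the data and bandwidth are arbitrary, and this is not the Mean Shift Smoothie code.

```python
# One-dimensional mean shift: each value climbs to a local density peak.
# Illustrative only; NOT the Mean Shift Smoothie implementation (data and bandwidth arbitrary).
import math

def mean_shift_point(x, data, bandwidth, iterations=50):
    for _ in range(iterations):
        weights = [math.exp(-((x - d) ** 2) / (2 * bandwidth ** 2)) for d in data]
        x = sum(w * d for w, d in zip(weights, data)) / sum(weights)
    return x

# Two obvious groups of "pixel intensities"
data = [1.0, 1.2, 0.9, 1.1, 8.0, 8.2, 7.9, 8.1]

modes = [round(mean_shift_point(x, data, bandwidth=1.0), 2) for x in data]
print(modes)  # each value converges to one of two modes (roughly 1.05 and 8.05)
```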

April 14, 2010

Research Report - Turning the microscope inwards: Studying scientific software ecosystems

As an example, this slide displays the logos of all of the software in the Eclipse software ecosystem. Image courtesy of James Howison and James Herbsleb.

Almost every workflow that generates scientific results today involves software: from configuration and control of instruments, to statistical analysis, simulation and visualization. This means that creating and maintaining software is a significant activity in scientific laboratories, including science and engineering virtual organizations. Our research group at Carnegie Mellon University is examining scientific software as an ecosystem, seeking to understand the circumstances in which software is created and shared. The goal of this Open Science Grid project is to identify effective practices and provide input to science funding policy. Towards this end, the OSG/CMU Scientific Software Ecosystem Workshop was held 16-17 February 2010 in Los

April 7, 2010

Lights, camera, action: FilmGrid

Image courtesy FilmGrid

Film-making is a very labor-intensive craft, relying upon the work of many people.
This is especially true of the part known as “post-production” — traditionally, that portion of the process when all the raw film has been shot and is “in the can.” During this phase, all the editing, natural sound, music, background painting, voiceovers, montages, special effects, and everything else take place.
Because so much of post-production is manual, and because so many hands are involved — and because post-production often involves widely scattered individuals and companies — it can often be very difficult to maintain an up-to-date picture of the status of a film production, leading to inefficiencies, unwanted duplication of effort, and complications. In addition, couriers sometimes lose hard disks containing footage with terabytes of information, and security can often be difficul

March 31, 2010

Campus grids secret to productive grid sites

A picture of FermiGrid, the campus grid at Fermi National Accelerator Laboratory. Image courtesy of Fermilab Visual Media Services. Photo by Reidar Hahn.

Fostering a local campus grid may be the secret to running a super-productive grid site, according to Rob Gardner, integration coordinator for Open Science Grid. “We noticed certain sites on Open Science Grid that were very productive relative to their peers,” Gardner said. “The common denominator was that they were accessing resources beyond the scope of their research domain by accessing resources on their campus.” That realization prompted OSG to look more deeply into existing campus grids. The idea, Gardner said, was to find out what the sites were doing right, and to learn from them in three key areas: job management, data management, and identity management. The ultimate goal? To arrive at a set of best practices for establishing and running campus grids.

March 31, 2010

Grid security vulnerabilities: keeping out of the headlines

Tire tracks of vehicles that have gone around a closed entrance gate. Sebastian Lopienski of CERN uses this slide in his presentations on security, with the label “This is not good security.”

Most people are now familiar with the need to keep their computer systems up to date, whether by installing Windows or Linux updates, to ensure that the systems they use do not contain known vulnerabilities. Many are also aware that vulnerabilities occasionally make the news, for example the recent Microsoft Internet Explorer problem reported under the headline “IE Zero day used in Chinese cyber assault on 34 firms.”
Vulnerabilities can be seen as analogous to finding that a type of lock can be easily picked, or that a small hole can let in a tiny creature that then expands into a monster.
So what is happening in the grid world? Are the problems any different? Is anything being done to prevent or fix vulnerabilities? In fact, a lot ha