
Content about Technology

October 31, 2012

Scientific computation is a research field with an excellent track record when it comes to technology and knowledge transfer. Now, with the global economic crisis taking its toll on public science budgets, the need for other research communities to learn from this example has never been greater.

December 15, 2010

Original courtesy David Alan Grier

It may look like just another Yuletide scene in an office in 1958, but this one is something special.

December 1, 2010

Read about how the EpiCollect application can help field researchers gather data.

December 1, 2010

Some kids can make anything cute, and these ones are no exception.

November 17, 2010

Announcement - Last chance, Data Center Infrastructure Management, Nov. 16-18, MERIT network

Photo courtesy Merit

The three-day "Data Center Infrastructure Management" online learning class will be available November 16-18 through Merit's Professional Learning program. Merit is a nonprofit corporation, owned and governed by Michigan's public universities, providing high-performance networking solutions to public universities, colleges, K-12 organizations, libraries, state government, healthcare, and other non-profit organizations.
Instruction is entirely online, so you can attend from anywhere with an Internet connection. You can also attend at Merit's offices if you need a space away from your daily demands. The course covers the professional management of data facilities, which is increasingly important for organizations of all types and sizes, and will interest individuals who manage data centers or server rooms.

November 17, 2010

Announcement - StratusLab releases open source cloud solution for grid

Photo courtesy of OpenNebula.org

StratusLab has released the first open-source cloud solution designed for the grid.
The StratusLab project has released the first version of its cloud computing software, which aims to provide a full cloud solution for grid and cluster computing.
The release is a technology preview (beta test) and not production-ready yet, but it will give system administrators and users a chance to try out the new features of what will become an integrated solution for cloud management and running grid services within clouds.
The software is based on the OpenNebula open-source toolkit for cloud computing management and can be used as an interface for managing cloud sites. It also provides a range of tools and services specifically designed to facilitate integration of cloud and grid technologies, including automatic configuration of sites and integration with fabric management tools.

November 17, 2010

Feature - The 1970s in the 21st century: synthesized music returns (via parallel processing)

This Arp 2500 analog modular synthesizer from 1971 had hundreds of inputs and outputs for control of the synthesis processes. Image courtesy of discretesynthesizers.com.

Curtis Roads is a professor, vice chair, and graduate advisor in media arts and technology, with a joint appointment in music, at the University of California, Santa Barbara. He was also the editor of Computer Music Journal (published by MIT Press) and co-founded the International Computer Music Association. He is often a featured speaker at conferences such as Supercomputing.
 
Music is an interactive, concurrent process. A note or a chord sounds, then is replaced, gradually or sharply, softly or powerfully, by the next one. For electronically produced or enhanced music, real-time technical advances are critical to continued progress and exploration.

November 17, 2010

Image of the Week - Elegance of darkness: when galaxies collide

Original courtesy Argonne National Laboratory

What you see is the collision of two galaxies over billions of years, albeit virtually. As physicists at CERN investigate the smallest particles in the universe, US scientists are studying the behavior of the largest cosmic structures in existence. A team at the University of Chicago Flash Center and the Harvard-Smithsonian Center for Astrophysics used an Argonne National Laboratory supercomputer to study elusive dark matter. The researchers simulated the motion and collision of galactic clusters, some of the largest structures in the universe, to infer dark matter's influence, as it cannot be observed directly. Dark matter greatly influences gas and galaxies over vast cosmic distances. Furthermore, these collisions more accurately predict the interaction of both normal and dark matter.

November 3, 2010

 

Project Profile - From grids to clouds and beyond: GRNET supports Greek researchers

The Acropolis from Philipapou Hill at sunset, Image courtesy Tim Rogers, stock.xchng

All Greek universities get their internet from one source: GRNET (Greek Research and Education Network), a company supported by the Greek state, which connects them both to each other and to the larger pan-European academic network, GÉANT.
GRNET’s mission is to get universities online, to provide computing power and storage, and to develop services for researchers, not least by providing technical know-how and supporting schools and universities in Greece. “GRNET is actually a human network — this is the most important thing about it,” says Kostas Koumantaros, a member of GRNET in Athens. “We transfer know-how between universities throughout Greece. It is a good vehicle both to promote research in Greece and for us to learn from our international collaborators.”

November 3, 2010

Feature - LHC open to all

An actual recorded event from the Compact Muon Solenoid experiment—this event shows radiation and charged particles spilling into the detector from the beam colliding with material in the beam pipe.
Image courtesy Carl Lundstedt

Occasionally, iSGTW runs across stories in other publications related to the fields we cover. Below is an excerpt from Linux Journal, containing one person’s view of the whole process.
One of the items at the heart of the Large Hadron Collider (LHC) experiments is open-source software. The following will provide a glimpse into how scientific computing embraces open-source software and its open-source philosophy.
The LHC at CERN near Geneva, Switzerland, is nearly 100 meters underground and produces the highest-energy subatomic particle beams on Earth. The Compact Muon Solenoid experiment is one of the many collider experiments at the LHC. One of its goals is to give physicists a window into the universe fractions of a second after the Big Bang.

November 3, 2010

Feature - Ultra-fast networks: the final frontier

A network researcher in awe of the billions of dark matter particles simulated on 15 ultra-high-definition monitors. Image courtesy Freek Dijkstra

Researchers from the Netherlands have demonstrated a network infrastructure that could help scientists save time and even transform the movie business. This could be done without the need for large computer clusters or grids, just off-the-shelf hardware components combined with human ingenuity and one of the world's fastest research networks. The team was from SARA, a Dutch supercomputing and e-science support center.

Threshold

The SARA researchers wanted to show the practicalities of streaming video between two institutions (from SARA in Amsterdam to CERN in Geneva) at 40 Gb/s (5 GB/s). If successful, this link would be 16 times faster than the TEIN3 network, which streamed Malaysian dancers more than 9,000 kilometers away to a live orchestra performance in Stockholm at 2.5 Gb/s.
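The quoted rates are easy to sanity-check; two lines of arithmetic (assuming the usual 8 bits per byte) recover both the 5 GB/s figure and the 16-fold speed-up:

```python
# Sanity-check the quoted rates: 40 Gb/s expressed in GB/s, and the
# speed-up relative to the 2.5 Gb/s TEIN3 stream mentioned above.
link_gbit_per_s = 40
tein3_gbit_per_s = 2.5

link_gbyte_per_s = link_gbit_per_s / 8          # 8 bits per byte
speedup = link_gbit_per_s / tein3_gbit_per_s

print(link_gbyte_per_s, speedup)  # 5.0 GB/s, 16x
```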

October 20, 2010

Announcement - ESFRI and e-IRG publish ‘Blue Paper’ on e-Infrastructure

Photo courtesy  ESFRI

The European Strategy Forum on Research Infrastructures (ESFRI) and the e-Infrastructure Reflection Group (e-IRG) have just released a report about the current trends, issues, and policy areas for users of Europe's e-Infrastructure services.
Topics that are covered include:

Networks
Computing
Middleware
e-Infrastructure services to support scientific research
e-Infrastructure as a European service
Digital research infrastructure for the arts and humanities
e-Science and technology infrastructure for biodiversity data and observatories
And much more . . .

The full report can be downloaded in PDF form.

October 20, 2010

Announcement - New European Petaflop supercomputer available in 2011

Photo courtesy PRACE

In 2011, Curie, a 1.6-petaflop French supercomputer, will be installed and made available for use. Powered by more than 90,000 processor cores, it will be dedicated exclusively to European research and available to all fields of science, including high-energy and plasma physics, climatology, and much more.
“It is crucial to have high computing power to simulate, with the most possible realism, the past of our climate, the current conditions and its future evolution according to various scenarios,” said Jean Jouzel, vice-president of the IPCC (Intergovernmental Panel on Climate Change).
Scientists and engineers will also be able to use Curie’s simulations to explore the properties of various materials, improve aircraft and car construction, design better drugs, understand the intricate molecular functions of the human body and conduct simulations that are impractical in reality.
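Taking the article's figures at face value, the implied per-core throughput is a quick division (90,000 is a lower bound, since the article says "more than 90,000" cores):

```python
# Implied per-core throughput of Curie, from the figures quoted above.
peak_flops = 1.6e15   # 1.6 petaflops
cores = 90_000        # lower bound: "more than 90,000 processor cores"

per_core_gflops = peak_flops / cores / 1e9
print(round(per_core_gflops, 1))  # roughly 17.8 gigaflops per core
```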

October 20, 2010

Feature - Climate model tackles clouds


Animation from the NICAM model simulation of 21 May - 31 August 2009, showing cloudiness (based on outgoing long-wave radiation) in shades of gray and precipitation rate in rainbow colors, based on hourly data from the simulation. Thicker clouds are shaded in brighter gray, and the colors range from green (precipitation rate less than 1 mm/day) to yellow and orange (1 - 16 mm/day), red (16 - 64 mm/day), and magenta (> 64 mm/day). The animation begins zoomed in over India and the Bay of Bengal, showing that tropical cyclone Aila, which in reality made landfall near Calcutta, killing dozens of Indian and Bangladeshi citizens and displacing over 100,000 people from their homes, was accurately predicted in the simulation.
Video and caption courtesy NICS
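The caption's precipitation bands amount to a simple threshold lookup. A sketch follows, with simplified color names standing in for the animation's palette; the handling of the exact boundary values (1, 16, 64 mm/day) is an assumption, since the caption is ambiguous there:

```python
def precip_color(rate_mm_per_day):
    """Map a precipitation rate (mm/day) to the caption's color bands.
    Boundary handling at exactly 1/16/64 mm/day is a choice; the
    caption does not specify it."""
    if rate_mm_per_day < 1:
        return "green"
    elif rate_mm_per_day <= 16:
        return "yellow-orange"
    elif rate_mm_per_day <= 64:
        return "red"
    else:
        return "magenta"

print([precip_color(r) for r in (0.5, 8, 32, 100)])
```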

Few areas of science are currently hotter than climate science.

October 20, 2010

Feature - New physics in space

A C5 Supergalaxy, one of the world’s largest planes, loading the AMS-02 experiment at Geneva Airport. Image courtesy CERN Bulletin

New life was breathed into the International Space Station (ISS) this year after NASA announced it will extend the ISS from 2015 to at least 2020. The new deadline extends opportunities for science experimentation in the largest space research laboratory ever constructed. One of these experiments is the Alpha Magnetic Spectrometer (AMS-02), a detector that may help scientists understand why our universe exists and why there is more matter than anti-matter. Most space-grade electronics are about ten years old, so the AMS-02 represents the newest and most advanced physics experiment in outer space to date. Currently, it is being tested and is due to launch in February 2011. AMS-02 was shipped via Geneva airport to NASA this August in one of the largest planes in the world, a US Air Force C5 Super Galaxy.

October 20, 2010

 

Link of the Week - Nobel Prize follows Ig Nobel

Artist's impression of a graphene transistor. Image courtesy physorg

A first has just occurred in the world of Nobel Prize awards: Andre Geim, a Russian-born physicist who was previously awarded an Ig Nobel for using magnets to levitate a frog, received the Nobel Prize in Physics for his experiments on a 2-D substance called ‘graphene.’ Graphene, which is one atom thick and made entirely from carbon, comes from the ‘lead’ in a pencil.
 
The substance is made up of carbon atoms in a honeycomb lattice, akin to atomic-scale chicken wire. At this scale its properties truly shine: Geim and his team discovered that the material conducts electricity 100 times faster than silicon. Possible future applications include ultra-fast transistors for the next generation of computers, electronics, smart displays, and quantum-dot computers.
 

October 20, 2010

Profile – Domenico Vicinanza, master of fusion

Musicians play ancient instruments live in Stockholm while dancers in Kuala Lumpur, about 10,000 kilometers away, simultaneously perform on the display above the stage. (Click on the image above to see video of the entire performance.) All images courtesy Domenico Vicinanza

Domenico Vicinanza combines the worlds of science and music, using his talents as an engineer and a musician to bring ancient musical instruments back to life. In December 2009 Vicinanza and the 'Lost Sounds Orchestra' gave a unique performance: while he played ancient Greek music live in Stockholm on a virtual instrument, an ultra-fast, high-quality video feed of dancers from Kuala Lumpur was displayed, simultaneously bringing two distant cultures and locations into one place. iSGTW caught up with Vicinanza for an interview.

iSGTW: What’s your job?

Vicinanza: At DANTE I support international projects that use the GÉANT network, the pan-European research and education network.

October 13, 2010

Feature - Computational chemistry suite goes open source

An image generated using NWChem.
Image courtesy Eric Bylaska, EMSL.

A widely-used suite of computational chemistry software, NWChem, was released as open source on 29 September 2010.
The suite, which runs on everything from conventional workstation clusters to high-performance supercomputers, has already been downloaded over 500 times.
Currently, the NWChem core development team is working on porting the code to work with GPUs and in cloud computing environments.
“It’s a very comprehensive software suite and covers almost all the theoretical approaches currently being used by computational chemists and materials scientists who use first-principles quantum mechanical calculations in their research,” said Niri Govind, a member of the NWChem core development team based at the Environmental Molecular Sciences Laboratory at Pacific Northwest National Laboratory.
NWChem first came on the scene in the mid-1990s.

October 6, 2010

Feature - A lasting ocean observatory

A map indicates the location of the four major ocean arrays, as well as the two minor ones. Click for a larger version. Image courtesy of OOI - CEV at University of Washington.

Agile architecture is essential if a large-scale infrastructure like the Ocean Observatories Initiative is to last three decades, as mandated.
“The Ocean Observatory has been in planning for fifteen years and more,” said Matthew Arrott, OOI’s project manager for cyberinfrastructure. “It is our anticipation, over a 30 year lifespan, that we need to account for user needs and the technology that we are using all changing.”
That’s why they’ve focused their attention on creating an infrastructure that can interface with a wide variety of software packages and computational resource providers.
“The observatory supports a broad range of analysis.”

October 6, 2010

 

Link of the Week: Einstein@home bags a pulsar

Albert Einstein (c) Camera Press, K. of Ottawa

The Einstein@Home volunteer computing project, which runs on the BOINC distributed computing platform, usually searches for gravity waves. (See previous iSGTW article.) However, a side project spotted a rare pulsar in radio observatory data. Pulsars are rapidly spinning neutron stars; their rapid rotation causes the emission from the poles to sweep across the line of sight to the Earth, creating a periodic flash. Initially, most pulsars are energetic, rotating rapidly and emitting radiation in the X-ray region. But over time they “spin down,” and many only emit at the frequency of radio waves. This summer, a person at a home computer spotted PSR J2007+2722, later confirmed by ground-based observatories. An article in the journal Science praised the efforts of citizen scientists.

September 29, 2010

Link of the Week - GPU Technology Conference

This photograph was taken at the 2009 GPU Technology Conference.
Image courtesy nVIDIA.

Last week, computing expert Greg Pfister told us why he thinks that cheap GPU-based supercomputing is coming to an end. And as iSGTW readers from around the world read that article, other computing experts gathered in San Jose for the GPU Technology Conference, an event with a very different underlying perspective.
Sponsored by nVIDIA, the GTC offered separate streams for researchers, developers, and industry.
NVIDIA created daily video recaps for the three-day event. The source aside, they are a quick and convenient way to find out about some of the most interesting presentations from the event.
You can go directly to video one, two, and three to watch them, or read more about the conference by browsing VizWorld posts tagged as “gtc” at our link of the week.

September 22, 2010

 

Link of the Week: When computers were human

Image courtesy David Grier

The photo at right may look rather ordinary, but this office is much more exciting than it seems. The men and women working away at these desks are in fact a sort of human computer, employed by the Mathematical Tables Project in New York City in the 1930s and ’40s under a Works Progress Administration program to fight the ravages of the Great Depression. The Mathematical Tables Project consisted of 450 ‘human computers,’ many of whom had been close to homelessness during the financial collapse. The large majority of the staff had not even completed high school, yet they were brought together to perform calculations for government and scientists in an era before the first working general-purpose electronic computer (generally agreed to be ENIAC, the ‘Electronic Numerical Integrator And Computer’).

September 15, 2010

Announcement - Free HPC event, 30 September

The event will take place in Long Beach, California, pictured above.
Image courtesy of Jon Sullivan, under Creative Commons license.

You are invited to attend a free day-long event on high performance computing technology in Long Beach, California, USA on 30 September 2010.
The event, which is sponsored by nVIDIA, Intel, HP, Mellanox Technologies, Adaptive Technologies, and CB Technologies, will take place from 8:30am to 3:00pm.
Agenda:

"CPU Processing" - Intel
"Accelerating HPC with GPU Computing" - Nvidia, Dale Southard
"Paving the Road to Exascale Computing" - Mellanox
"Managing HPC with intelligent policies/HP Cloud" - Adaptive Computing
"Parallel File Storage Solutions" - DataDirect Networks
"HPC Characterization and Optimization" - Hewlett-Packard Company

 Confirmed Speakers:

HP, Logan Sankaran, System Tuning and Application Optimization for HPC
Nvidia, Dale Southard
Mellanox, Gilad Shainer, Sr. Director of HPC

September 15, 2010

Feature - Neighborly efficiency: scaling kNN problems (nearly) linearly using GPUs

This image depicts an example of a two-dimensional feature space. In this case, the unknown dot would be classified as “green,” because three of the five nearest neighbors are green.
Image courtesy of Cyrus Stoller and Libin Sun.

What is k-Nearest Neighbor?
The class of machine learning algorithms known as kNN classifies objects based on the closest training examples in the feature space. What does that mean? Let us illustrate that with an example.
Imagine you have a large set of digital images of friends and family. Many of them have already been tagged according to the person in the image. You would like a way to automate the tagging of the remaining photographs.
First, you would need a program that can analyze images of faces, quantifying features such as eye color, the distance between the eyes, and so on.
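As a sketch of the classification rule just described (and of the five-neighbor vote in the figure caption), here is a minimal pure-Python kNN; the toy feature vectors and labels are invented for illustration:

```python
import math
from collections import Counter

def euclidean(a, b):
    """Distance between two points in the feature space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(train, query, k=5):
    """train is a list of (feature_vector, label) pairs; the query point
    gets the majority label among its k nearest training examples."""
    nearest = sorted(train, key=lambda ex: euclidean(ex[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy two-dimensional feature space, mirroring the figure:
train = [((1, 1), "green"), ((1, 2), "green"), ((2, 1), "green"),
         ((8, 8), "red"), ((8, 9), "red")]
print(knn_classify(train, (2, 2), k=5))  # three of five neighbors are green
```

In production you would use an optimized implementation (this is exactly what GPU versions accelerate, since the distance computations are embarrassingly parallel), but the voting logic is the same.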

September 1, 2010

Announcement - Gordon Conference 2010 abstracts due 16 September

The Grand Challenges in Data-Intensive Discovery conference (or Gordon Conference for short) will be held 26-28 October 2010 at the San Diego Supercomputer Center on the campus of UC San Diego.
Science has entered a data-intensive era, driven by a deluge of data being generated by digitally based instruments, sensor networks, and simulation devices. Hence, a growing part of the scientific enterprise is associated with analyzing such data, and such analysis places special demands on computer architectures because the associated calculations have frequent I/O accesses, large memory requirements, and often limited parallelism.
In mid-2011, SDSC will deploy a unique data-intensive high-performance computing system called Gordon. Gordon will be a peer-reviewed, allocated resource on the National Science Foundation's TeraGrid, available to any US researcher. It will have a peak speed of 245 teraflops and feature very large shared memory.