Content about North America

May 27, 2015

When politicians can't agree on spending priorities and decide to shut down the government, federal functions come to a grinding halt. In the US, this scenario has repeated a dozen times since the Reagan era; Eric Svensen looked to a supercomputer to find out why.

May 27, 2015

Part one of a free-wheeling conversation with organizers of the Internet2 Gender Diversity Initiative at the recent Internet2 Global Summit in Washington, DC.

May 27, 2015

Over 100 people converged to assemble Purdue University’s latest research supercomputer, Rice — in a single day!

On Friday, 8 May, staff and volunteers built a new cluster of 576 HP compute nodes, each with two 10-core Intel Xeon E5 processors (20 cores per node) and 64 GB of memory. The cluster features a Mellanox 56 Gb/s FDR InfiniBand interconnect and a Lustre parallel file system built on DataDirect Networks' SFA12KX EXAScaler storage platform.
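The per-node figures quoted above imply the cluster's aggregate capacity; here is a quick back-of-envelope sketch (the totals below are derived purely from the stated specs, not independently confirmed):

```python
# Aggregate capacity of the Rice cluster, computed from the
# per-node figures quoted in the article.
nodes = 576
cores_per_node = 2 * 10        # two 10-core Xeon E5 processors
mem_gb_per_node = 64

total_cores = nodes * cores_per_node           # 11,520 cores
total_mem_tb = nodes * mem_gb_per_node / 1024  # 36 TB of RAM

print(f"{total_cores:,} cores, {total_mem_tb:.0f} TB of memory")
```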

May 20, 2015

Carnegie Mellon University computer scientists looked to Pittsburgh Supercomputing Center supercomputer Blacklight in their construction of Claudico, a poker-playing artificial intelligence. Claudico came up short against the world's best poker players, but what the scientists have learned spells good news for medical decision-making.

May 20, 2015

At the recent Internet2 Global Summit, iSGTW sat down with George Komatsoulis to talk about the state of distributed research and the NIH Commons, a scalable virtual environment providing high-performance computing and data storage for biomedical research. When implemented, the Commons will create a marketplace for digital biomedical resources, driving down costs and democratizing access.

May 13, 2015

Scientists at the University of Houston have shown the mind can triumph over matter. For the first time, researchers have demonstrated that prosthetic grasping control can be inferred from EEG without invasive measures.

May 13, 2015

Small colleges and remote learners need not worry about limited access to quality science equipment. The NANSLO solution is to bring the lab to you.  

May 6, 2015

The 25 April Nepal earthquake has killed more than 7,000 people and destroyed hundreds of thousands of homes. The deadliest earthquake in Nepal since 1934, the tremor killed at least 19 climbers and crew on Mount Everest and reportedly produced casualties in the adjoining countries of Bangladesh, China, and India.

In response, scientists at The Ohio State University and the University of Minnesota are directing supercomputing resources to aid in the disaster relief.

May 6, 2015

You may think XSEDE is nothing more than access to high-performance computing resources. But did you know XSEDE offers a full range of training opportunities to teach your scientists and engineers how to work with supercomputers?  

April 29, 2015

This issue marks the 10th anniversary of iSGTW. We would like to take this opportunity to thank all our readers and all those who have contributed to the publication over the last decade.

April 29, 2015

As iSGTW celebrates its 10th anniversary, Katie Yurkewicz, the publication’s first editor, looks back at the challenges of establishing an e-newsletter to support the fledgling grid-computing community and highlights how the publication has evolved.

April 29, 2015

Researchers at the Los Alamos National Laboratory used supercomputers to model ocean vortices and their effect on floating oil rigs. Their work has won industry awards — increasing safety and reducing potential harm to deep-sea environments.

April 29, 2015

Germinated Bacillus anthracis spores stained and imaged with a smartphone microscope modified for fluorescence. Courtesy PNNL.

April 22, 2015

Earthquake warning systems are an expensive proposition — but not when crowdsourced via smartphones. Scientists recently tested consumer devices, and were surprised at what they found.

April 22, 2015

Seismologists have always relied on surface observation to piece together models of what they thought Earth’s interior looked like. These models served them well for years, but they were unable to map out the planet’s interior with certainty, until now. A team of scientists is using the powerful US Titan supercomputer to do just that.

April 15, 2015

Mutations pose a risk to crop yields and our ability to feed a burgeoning population. With the aid of cloud computing clusters, Cornell researchers were able to predict where bad mutations are likely to occur. Genetic editing will remove these harmful mutations and allow breeders to continue increasing yield gains.

April 15, 2015

Buying a supercomputer can be a tough sell for administrators to make. A study by Clemson University researchers may change the argument.

April 8, 2015

Using the Oakley supercomputer and a very small, frozen tuning fork, Joseph Heremans is rewriting our science textbooks. His computational research team has discovered that phonons, the quasiparticles of sound and heat, respond to magnetic fields.

April 8, 2015

They’re only 1/100th the width of a human hair, but these little sensors are kind of a big deal. Dubbed geometrically encoded magnetic sensors (GEMs), they have the ability to change shape once inside human tissue – and they provide greater accuracy than current technology.

April 1, 2015

Naturally occurring crystalline structures called zeolites have the ability to separate molecules, speed chemical reactions, rearrange atomic bonds, and break down long chains of hydrogen and carbon atoms. Scientists screened a large database of possible zeolite structures to find a few that can lead to cheaper ethanol and better engine lubrication. What they’ve discovered could mean greater engine efficiency and big savings for you.

April 1, 2015

Renowned computational biologist Klaus Schulten used the supercomputers at the Texas Advanced Computing Center and the National Center for Supercomputing Applications to model the binding force of proteins found in cow stomachs. What he discovered may lead us to cheaper biofuel production.

March 25, 2015

Big data opens doors previously closed to researchers, yet the volume of data sends scientists looking for analytical tools to bring order from the informational cacophony. Before tools like Bioconductor, there were few options for working with quantitative genomic data types, and the result was a discordant score for deciphering the human genetic code.

Today, genomic analysis machines create a common language for users, and build a worldwide community to foster developers from among subject matter experts. These instruments make beautiful music from a mass of genomic information.

March 18, 2015

The Large Synoptic Survey Telescope will provide an unprecedented look into the cosmos, and the Dark Energy Science Collaboration is preparing a variety of analyses for the huge data sets it will produce. In anticipation of their needs, Fermilab is developing innovative software tools and approaches.

March 11, 2015

In a TED talk video, Altimeter industry analyst Susan Etlinger discusses the ethical implications of collecting big data, and gives her perspective on how best to analyze data in ways that extract real insights and build trust.

iSGTW recently followed up with Etlinger on her proposed framework and how she believes we should approach traditional analytics methodologies to ensure they account for the variations and complexities of big data.