Scientific computation is a research field with an excellent track record when it comes to technology and knowledge transfer. Now, with the global economic crisis taking its toll on public science budgets, the need for other research communities to learn from this example has never been greater.
Researchers are using the latest next-generation sequencing and a supercomputer to analyze unusual Geranium genomes; these natural mutants evolve many times faster than those of their plant peers. The work could impact research on genetically modified foods.
The latest computer simulations show that nuclear fusion reactions that produce more power than is put in are just around the corner, but recent experiments have called the accuracy of these computer models into question.
Today, humans have largely been removed from the process of trading on the financial markets, replaced by algorithms and supercomputers. As a result, trading times have shrunk from microseconds to nanoseconds. Now, researchers are keeping pace by using a supercomputer to catch a supercomputer.
An artist by trade, Francesca Samsel uses the highest-resolution visualization system in the world, at the Texas Advanced Computing Center, US, to showcase scientific data in a highly accessible yet accurate way.
The first-ever direct observation of a Type Ia supernova progenitor system has been made. The team used data that traveled over 400 miles and was analyzed using sophisticated machine-learning algorithms on high-performance computers.
With the IEEE International Conference on eScience 2012 less than a week away, iSGTW caught up with Ian Foster, general chair and moderator of the event. He shares his views on the program's key areas of focus, the challenges being addressed, and what attendees should look out for.
Today, biology is a data-driven science, intrinsically linked to computing. However, dialogue between researchers and computer scientists about the analysis of genomic data is still far from 'organic'. The HPCBio group and the Texas Advanced Computing Center are addressing this issue with a distributed approach to computing resources, new software, and tailored customer services.
With satellite data showing that the area of Arctic sea ice lost per day is now roughly equal to the size of the Czech Republic, iSGTW casts its spotlight on the computing infrastructure behind the observations.
Today's Kraken favors numbers instead of devouring sailors: This high-performance computer is managing the mountains of data streaming from NASA’s Kepler space telescope and is helping the search for Earth-like planets that orbit their stars in the 'Goldilocks Zone'.
Climate simulations carried out using the XSEDE grid-computing infrastructure predict that, by the end of this century, the US is likely to be affected by extreme drought at its current levels in 20 years out of every 50.
At the XSEDE'12 conference in mid-July, John Towns, principal investigator for XSEDE, discussed the challenges and successes of the US high-performance computing network's first year. XSEDE is building bridges between universities, users, and other cyber-infrastructures, including the Open Science Grid for high-throughput computing. Towns also highlighted the collaboration between XSEDE and PRACE, the European computing infrastructure, and unveiled a new joint call enabling users to access both sets of resources.
During XSEDE'12, the Texas Advanced Computing Center presented its new high-performance computer, which will replace both its own Ranger system and the Kraken system at the National Institute for Computational Sciences.