Speakers at the SC12 conference in Salt Lake City, Utah, highlighted the role of scientific research in both driving and benefiting from high-performance computing. They argued that a strong voice for the continued advancement of technology is vital to sustaining research and economic leadership.
The 2012 Jim Gray eScience Award was presented to ChemSpider founder Antony John Williams at last month's eScience conference in Chicago. ChemSpider is a free online database containing chemical information on over 26 million molecules.
The Computing for Sustainable Water project has run its course after six months of computation on the World Community Grid. The project focused on water-quality issues in the Chesapeake Bay area, the largest watershed on the Atlantic seaboard of North America. The researchers behind the project hope to apply the lessons learned to other watersheds around the world.
I love dragonflies: this video summarizes the Migratory Dragonfly Partnership. A nice example of how the project works is the relationship between citizen scientist Greg Lasley and researcher John Abbott, friends and colleagues who both contribute to migratory dragonfly research. Image courtesy TACC.
Has big data now superseded the grid… and the cloud? Whether or not this is the case, it is vital that in moving from one paradigm to another we do not discard the experience and technology gained previously, says European Grid Infrastructure director Steven Newhouse.
In these tough economic times, some are asking how much value academic research and e-infrastructures contribute to the economy. The answer is a lot – especially if you measure the contribution of recent university graduates – according to studies in the US.
Researchers are using the latest next-generation sequencing techniques and a supercomputer to analyze unusual Geranium genomes, natural mutants that evolve many times faster than their plant peers. This could have implications for research on genetically modified foods.
The latest computer simulations suggest that nuclear fusion reactions that produce more power than is put in are just around the corner – but recent experiments have called into question the accuracy of these computer models.
Today, humans have largely been removed from the process of trading on the financial markets, replaced by algorithms and supercomputers. Consequently, trading speeds have progressed from microseconds to nanoseconds. Now, researchers are keeping up by using a supercomputer to catch a supercomputer.
An artist by trade, Francesca Samsel uses the highest-resolution visualization system in the world, at the Texas Advanced Computing Center, US, to showcase scientific data in a way that is highly accessible yet accurate.
An online tool called Peachnote uses a distributed-computing approach to enable instant identification of classical music scores. But its terabytes of data may also redefine what we think we know about classical greats like Beethoven and Mozart.
Imagine an open-access, open-source parallel computer with up to 45 gigahertz of CPU performance that fits on a board the size of a credit card. A new Kickstarter project aims to deliver exactly that for just $99.
The first-ever direct observation of a Type Ia supernova progenitor system has been made. The team used data that traveled over 400 miles and was analyzed with sophisticated machine-learning algorithms on high-performance computers.
With the IEEE International Conference on eScience 2012 less than a week away, iSGTW caught up with Ian Foster, general chair and moderator of the event, to get his opinions on the key areas of focus in the program, the challenges being addressed, and what attendees should watch out for.