One of the highlights of the recent TedxCERN conference was a talk given by Ian Foster, widely known as one of the founders of grid computing. He also spoke to an audience of CERN IT department staff about Globus Online and the challenges of big data.
Niall Gaffney has managed some of the richest astronomical data ever recorded in terms of scientific and public impact. In his new role at the Texas Advanced Computing Center (TACC), Gaffney will oversee the center's big data strategy, and will help TACC address challenges with core techniques and technologies.
Scientists at the University of Oklahoma are developing advanced data analysis techniques to discover why some storms generate tornadoes and others do not. Read about these techniques and their potential impacts on early warning systems and forecasts.
With access to significantly more computational power, researchers can provide more accurate earthquake predictions with the potential to save lives and minimize property damage. Read about advances in developing code to cut both research times and energy costs in simulating seismic hazards.
Renowned computer scientist Paul Messina delivered the Peebles Memorial Lecture at Indiana University in Bloomington, US. The university awarded Messina the distinguished Thomas Hart Benton Mural Medallion at the dedication and launch of Big Red II, the fastest university-owned supercomputer in the nation.
In 2012, the United States suffered its worst drought in 24 years, leading to the lowest harvest yields in nearly two decades. Read how scientists are using this data to validate crop-yield and climate-impact models that simulate the effects of climate on agriculture.
City lights, sand dunes, and glaciers – oh my! Take a look at NASA’s eye-catching images of Earth from orbit, including true-color satellite images, Earth science visualizations, and time lapses from the International Space Station.
The University of Texas at Austin and TACC competed against the top supercomputing centers and universities to claim one of the most advanced systems in the world — and won. The prize: an estimated $50 million-plus investment over a four-year period.
On February 15, 2013, an asteroid entered Earth’s atmosphere and exploded over the Chelyabinsk region in the Russian Urals. Many amateur videos from the region captured the asteroid streaking across the sky before it exploded in a bright flash. Read about the science behind the atmospheric event and similar impacts on Jupiter.
We are living in the golden age of exoplanets — over 800 are known, and new discoveries are announced weekly. Find out about the vital role high-performance computing is playing in enabling these discoveries.
Faced with the daunting prospect of profiling the complexities of the immune system, researchers at Harvard Medical School and Harvard Business School enlisted the help of the world’s largest community of software experts on the TopCoder site. A recent paper in Nature Biotechnology signals a cultural shift in academia, with experts engaging the collective skills of those outside their community to overcome methodological barriers to their work.
More than 10,000 species of birds are known to exist, yet scientists know little about their diversity and development over time. Freely licensed software developed at the University of Utah, US, has enabled researchers to pinpoint a single gene responsible for some very glamorous hairdos.
High-performance computing veteran Thomas Sterling will deliver a keynote speech on HPC achievement and impact at this year's International Supercomputing Conference in Leipzig, Germany (ISC'13). He speaks exclusively to Nages Sieslack...
Calling all citizen scientists! With the exploding availability of data, analysis is steadily becoming a bottleneck in many scientific pursuits. Read about a project aimed at bringing neuroscience to the masses in a way that may surprise and inspire you to take part.
How do scientists use supercomputers to predict complex things like weather, climate, earthquakes, and the formation of galaxies? Watch this video to see how supercomputers handle mathematical modeling.
There is a very real and growing disparity between the ability to capture data and the ability to analyze it, visualize it, and turn it into usable intelligence. Read about efforts to help organizations and agencies make sense of what they see.
Using software to predict how proteins fold at the molecular level, scientists have discovered new information about misfolding and the energies involved at the submolecular level. Read about the open-source software used for the simulations, and the potential implications for treating degenerative diseases.
Scientists now have enough data to analyze brain activity using graph theory. Read about their unique approach and their discovery of how the brain could code and recall spatial and temporal memories at the same time.
How much of an impact does temperature have on biological systems? Scientists at Oregon State University, US, are investigating. A grant from the US National Science Foundation will enable continuous thermal imaging from enzymes to ecosystems.
Computational fluid dynamics (CFD) simulations can easily generate 100+ terabytes of data. Scientists and engineers depend on supercomputers with hundreds of thousands of computing cores to solve the complex equations involved. Read about successful simulations breaking the million-core barrier.
A new graph analytics appliance – Sherlock – is designed to speed up the modeling process and open doors to a wide range of scientific research. Launched in February, Sherlock will enable scientists and researchers to better understand the often hidden, complex relationships in big data.
A researcher at the University of Central Missouri has broken a four-year-old record for the largest known prime number. The discovery was made through the Great Internet Mersenne Prime Search (GIMPS), a distributed computing project established in 1996.
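GIMPS searches for Mersenne primes, numbers of the form 2^p − 1, using the Lucas–Lehmer primality test (in highly optimized form). A minimal Python sketch of the test, shown here with small exponents only, since record-scale exponents require the project's specialized FFT-based arithmetic:

```python
def lucas_lehmer(p: int) -> bool:
    """Return True if the Mersenne number 2**p - 1 is prime.

    Assumes p is an odd prime greater than 2.
    """
    m = (1 << p) - 1          # the Mersenne number M_p = 2**p - 1
    s = 4                     # standard Lucas-Lehmer starting value
    for _ in range(p - 2):    # p - 2 squaring iterations
        s = (s * s - 2) % m
    return s == 0             # M_p is prime iff the residue is zero


# M_7 = 127 is prime; M_11 = 2047 = 23 * 89 is composite.
print(lucas_lehmer(7))   # True
print(lucas_lehmer(11))  # False
```

The same loop, run with enormously faster modular squaring, is what volunteer machines execute for exponents in the tens of millions.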