
Content about XSEDE

May 6, 2015

You may think XSEDE is nothing more than access to high-performance computing resources. But did you know XSEDE offers a full range of training opportunities to teach your scientists and engineers how to work with supercomputers?  

April 1, 2015

Renowned computational biologist Klaus Schulten used the supercomputers at the Texas Advanced Computing Center and the National Center for Supercomputing Applications to model the binding force of proteins found in cow stomachs. What he discovered may lead us to cheaper biofuel production.

March 4, 2015

Kelly Gaither, co-principal investigator on the XSEDE project and director of visualization at Texas Advanced Computing Center, offers further thoughts on the topic of gender diversity following our 28 January article 'Why aren’t there more women in HPC?' by Toni Collis.

February 18, 2015

UC San Diego School of Medicine scientists have teamed up with Pittsburgh Supercomputing Center engineers to create 3D models of enzymes that cause inflammation. What they've learned could bring relief to victims of asthma and arthritis.

January 28, 2015

Austin is a booming city experiencing traffic woes commensurate with its expansion. To model and visualize solutions, city planners look to TACC to help corral the stampede of visitors.

September 10, 2014

The financial services and medical insurance industries in the US account for 6-8% (more than a trillion dollars) of the gross domestic product annually, according to the US Department of Commerce Bureau of Economic Analysis. Read about XSEDE-supported research that considers unobservable market phenomena and how both industries could be improved.

August 27, 2014

Moving water through the state of California, US, involves a complex array of local water districts, aging federal projects, cobbled-together state projects, pumps, and levees. Read about the Delta Stewardship Council and its effort to tackle the enormous goals of creating a reliable water supply for a growing state population and reversing the environmental damage done by 160 years of water development.

August 13, 2014

Next-generation sequencing (NGS), in which millions or billions of DNA nucleotides are sequenced in parallel, is the backbone of novel discoveries in life sciences, anthropology, social sciences, biomedical sciences and plant sciences. Read about the SoyKB and iPlant collaboration that is taking plant sciences to the next level.

April 10, 2013

Blue Waters, entering full deployment, is now crunching numbers around the clock at the National Petascale Computing Facility at the University of Illinois, US. Led by the National Center for Supercomputing Applications, Blue Waters is funded by the US National Science Foundation to address the most challenging compute-, memory-, and data-intensive problems in science and engineering.

November 14, 2012

Researchers have discovered the mechanism of how a dangerous class of carcinogen stabilizes the very DNA it damages. This finding could lead to better preventative medicine and cancer treatment.

October 17, 2012

Today, humans are generally removed from the process of trading on the financial markets; they have been replaced by algorithms and supercomputers. As such, trading speeds have progressed from microseconds to nanoseconds. Now, researchers are keeping up by using a supercomputer to catch a supercomputer.

September 19, 2012

Today, biology is a data-driven science, intrinsically linked to computing. However, dialogue between researchers and computer scientists about the analysis of genomic data is still far from 'organic'. The HPCBio group and the Texas Advanced Computing Center are addressing this issue with a distributed approach to computing resources, new software, and tailored customer services.

August 29, 2012

Today's Kraken favors numbers instead of devouring sailors: This high-performance computer is managing the mountains of data streaming from NASA’s Kepler space telescope and is helping the search for Earth-like planets that orbit their stars in the 'Goldilocks Zone'.

August 29, 2012

During last month's XSEDE'12 conference, Gayatri Buragohain, founder of Feminist Approach to Technology in India, highlighted the importance of nurturing women's involvement in science, technology, engineering, and mathematics at an early age.

August 15, 2012

Climate simulations carried out using the XSEDE grid-computing infrastructure predict that, by the end of this century, the US is likely to experience extreme drought at the levels it is currently facing in 20 years out of every 50.

August 8, 2012

At the XSEDE'12 conference in mid-July, John Towns, principal investigator for XSEDE, discussed the challenges and successes of the US high-performance computing network's first year. XSEDE is building bridges between universities, users, and other cyber-infrastructures, including the Open Science Grid for high-throughput computing. Towns also mentioned the collaboration between XSEDE and the European computing infrastructure PRACE, and unveiled a new joint call to enable users to use both sets of resources.

August 8, 2012

There is no consensus on the underlying mechanism of Alzheimer's disease. Now, new simulations may lead to better diagnostic and treatment options to stop the disease.

August 8, 2012

During XSEDE'12, the Texas Advanced Computing Center presented its new high-performance computer, which will replace both its Ranger system and Kraken at the National Institute for Computational Sciences.

July 4, 2012

Natural disasters claim thousands of lives and cause billions in economic damage each year. We look at five promising research projects that use the latest computational methods to predict or forecast these devastating events and their effects.

February 15, 2012

As funders consider how to best invest in the future of e-infrastructure, a number of questions arise: How much does e-infrastructure cost? How much impact does it have, and are funders getting value for their money? A number of projects are working to find concrete answers to these and other questions, at a time when getting an accurate price has never been more crucial to a sustainable future for scientific computing.