iSGTW and its new partner, Indiana University, are pleased to announce that Amber Harmon has joined our team in the newly created position of US desk editor of iSGTW, based at Indiana University, US. We hope you enjoy reading her first feature story for iSGTW.
Last week, scientists, technologists, and industry leaders from around the world gathered in Salt Lake City, Utah, for the annual International Conference for High Performance Computing, Networking, Storage and Analysis. More commonly known as SC12 (or Supercomputing 2012), it is one of the field’s premier conferences.
For those new to supercomputing, like many of the over 9,000 SC12 attendees, the scientific innovation the field enables is staggering. Current supercomputers have reached processing speeds measured in petaFLOPS (quadrillions of calculations per second). In fact, during the conference, Titan, a Cray XK7 system capable of 17.59 petaFLOPS, was announced as the world’s fastest supercomputer in the 40th edition of the TOP500 List.
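To put 17.59 petaFLOPS in perspective, here is a quick back-of-envelope calculation. Only the petaFLOPS figure comes from the story; the population figure and the comparison itself are purely illustrative.

```python
# Back-of-envelope scale for Titan's 17.59 petaFLOPS (17.59 quadrillion
# floating-point operations per second), the figure reported at SC12.
titan_flops = 17.59e15  # operations per second

# Illustrative comparison: if roughly 7 billion people each performed one
# calculation per second, how long would the whole planet take to match
# a single second of Titan's work?
world_population = 7e9        # rough 2012-era figure, for scale only
seconds_per_day = 24 * 3600

days_equivalent = titan_flops / world_population / seconds_per_day
print(f"One second of Titan ~= {days_equivalent:.0f} days of "
      f"whole-planet arithmetic")
```

By this rough measure, one second of Titan corresponds to about a month of the entire world population computing nonstop by hand.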
Researchers and scientists employ increasingly complex theories, algorithms, and model simulations, and supercomputers are critical resources for both processing speed and storage capacity. Simulating the composition and activity of the human brain, for example, is one of the most data-intensive areas of neuroscience research. SC12 keynote speaker, popular science author, and theoretical physicist Michio Kaku is confident supercomputers will “alter the face of medicine as we know it” and predicts the human brain will be the next frontier of life-altering scientific research.
Humans have about 100 billion brain cells, as well as 100 trillion synapses to help those brain cells communicate with one another. Developing a detailed model of just one cell and its potential to interact with another single cell generates enormous amounts of data, pushing supercomputers to grow larger and faster. Moreover, while neuroscience researchers have long studied various individual parts of the human brain, they are missing an “integrated strategy to synthesize and put all of the pieces together: genetics, synapses, proteins, neurons, microcircuits,” notes Henry Markram, director of the Blue Brain Project in Switzerland.
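The neuron and synapse counts above imply the scale of the modeling problem. The sketch below works out that arithmetic; the neuron and synapse figures come from the story, while the bytes-per-synapse value is a hypothetical assumption included only to convey orders of magnitude.

```python
# Rough scale of a whole-brain model, using the figures in the story.
neurons = 100e9     # ~100 billion brain cells
synapses = 100e12   # ~100 trillion synapses

# On average, each neuron connects through about a thousand synapses.
avg_synapses_per_neuron = synapses / neurons

# Hypothetical assumption for scale only: suppose a bare-bones model
# keeps ~100 bytes of state per synapse (weights, delays, etc.).
bytes_per_synapse = 100
total_bytes = synapses * bytes_per_synapse

print(f"{avg_synapses_per_neuron:.0f} synapses per neuron on average")
print(f"~{total_bytes / 1e15:.0f} petabytes of synaptic state")
```

Even under this deliberately minimal assumption, the synaptic state alone runs to petabytes, before any simulation dynamics are computed, which is why such models push supercomputers on both storage and speed.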
The Blue Brain Project aims to create a fully functioning virtual brain from the cellular level up. This modeling approach requires huge datasets, making supercomputers an “essential tool for the future of neuroscience,” says Markram. The good news: as scientific advancement pushes supercomputers to keep up, the ability to simulate realistic neurological scenarios is quickly becoming a reality. Such simulations may make it possible to identify precursors to diseases like Alzheimer’s, Parkinson’s, epilepsy, schizophrenia, and depression.
However, as Kaku notes, the biggest challenge is not creating a viable model of the human brain. It is whether continued investment in technology and supercomputing will be available to sustain that advancement.
Sue Fratkin is a leading proponent of continued funding for high performance computing. She is the Washington liaison for the Coalition for Academic Scientific Computation (CASC), an alliance of 71 academic supercomputing centers in 38 states, and an active voice for technology policy in higher education.
Fratkin was on hand at SC12 to meet with CASC members and speak on continued efforts to make US legislators aware of the critical role of supercomputers in accelerating scientific discovery. Fratkin says she is looking forward to the new year and to assessing how Congress, with more than one third of the House having fewer than three years of experience, will work together on these issues. She is quick to acknowledge that re-educating these members on the areas impacted by supercomputing may be a bigger challenge than simply working across party lines.
Michio Kaku, Henry Markram, and Sue Fratkin all agree: both the ongoing advancement and the economic feasibility of supercomputers are key to propelling scientific discovery – not only in the areas of health and medicine, but also for environmental and energy research, engineering, atmospheric prediction, security, and education. As events such as SC12 demonstrate, high performance computing is the critical thread ensuring economic competitiveness, environmental stewardship, and leadership in the 21st century.