
Saying goodbye to the command line

Compute cluster bathed in purple and gold. Image courtesy John Livzey.

Many of us benefit from interactivity with our smartphones, tablets, computers, or touchscreens. We’re able to see and hear data and media – and, depending on the format, we’re also able to swipe, stretch, and flip it. Unfortunately, when working with large supercomputing clusters, scientists and researchers don’t experience all of the advantages interactivity provides. That is changing, though, due in part to research at Louisiana State University’s (LSU’s) Center for Computation & Technology (CCT), US.

In 2011, CCT researchers began envisioning what interactivity with powerful supercomputers might entail. CCT’s Brygg Ullmer, an associate professor of computer science at LSU and principal investigator on a $1 million US National Science Foundation grant, is leading a team of 40 co-investigators across 11 departments in five colleges. Their goal is to develop Melete – a system that integrates an interaction-oriented compute cluster with tangible interfaces – to support collaborative research and classroom instruction.

“In classrooms, laboratories, and meeting rooms, faculty today choose between real-time interaction with the limited capability of a laptop (or podium PC) or no interaction at all,” says Ullmer. “Through Hollywood, everyone is aware of the simulation potentials of large-scale computation. We aspire to bring some of these powers of interactive hurricane simulations, of flowing hair and animation work, to what students and faculty are controlling and experiencing live in the classroom, as well as at research meetings.”

iPad Mini.

Chris Branton, CCT’s IT consultant and adjunct faculty of computer science, has been leading software infrastructure development for the project. “Typically, a high-performance computer would feature one head node coupled with several slave nodes,” Branton says. “In contrast, Melete features several interactive face nodes in addition to the head node. The face nodes are a combination of dynamic screens, passive printed visuals, addressable LEDs, and other interactive elements to give interactive control of the machine to authorized users. We plan to place them in labs, meeting spaces, and classrooms, both at CCT and elsewhere on LSU’s campus.”
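The head-node/face-node contrast Branton describes can be sketched in a few lines of Python. This is purely illustrative (all class and variable names are hypothetical, not from the Melete codebase): a head node accepts interactive commands forwarded by face nodes, but only from authorized users.

```python
# Illustrative sketch (all names hypothetical, not from Melete's actual
# software): a head node that accepts interactive requests from "face"
# nodes after an authorization check, in contrast to a traditional
# batch-only head/worker cluster layout.

class HeadNode:
    def __init__(self, authorized_users):
        self.authorized = set(authorized_users)
        self.log = []  # record of accepted interactive commands

    def submit(self, user, command):
        """Accept or reject an interactive command from a face node."""
        if user not in self.authorized:
            return "denied"
        # A real system would dispatch work to compute nodes here;
        # this sketch only records the request.
        self.log.append((user, command))
        return "accepted"

class FaceNode:
    """An interactive endpoint (touchscreen, tablet, LED panel) in a
    classroom or lab, forwarding user input to the head node."""
    def __init__(self, location, head):
        self.location = location
        self.head = head

    def interact(self, user, command):
        return self.head.submit(user, command)

head = HeadNode(authorized_users=["lbutler"])
lab_panel = FaceNode("materials lab", head)
print(lab_panel.interact("lbutler", "render dark-field slice"))  # accepted
print(lab_panel.interact("guest", "render dark-field slice"))    # denied
```

The point of the sketch is the topology: several interactive endpoints share one gateway to the cluster, rather than one terminal-bound login node.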

Five research domains are expected to benefit from Melete: computational biology, materials, mathematics, engineering, and the arts. LSU Professor of Chemistry Les Butler, who co-leads the project, explains how the new system has helped his research on flame retardants and X-ray interferometry for materials science. “This area of research is just a few years old, so our software is under rapid development. It is of tremendous advantage to use Melete with our new Mathematica codes,” Butler says. “The data rate of X-ray imaging is huge – a couple of days yields roughly one terabyte of data. How can we present these results to our collaborators? Melete helps us extract the good stuff.”

“We are using four iPad minis as our interface to fly through the optical, absorption, dark-field, and differential phase contrast – one image type per iPad. It’s strange, but it seems to work. We can discover features in the data sets as we walk to and from the coffee shop, when we would otherwise have been tethered to the workstation.”

“By the way, X-ray interferometry may soon appear in clinical applications as low-radiation dose imaging. Researchers at the National Institutes of Health are exploring this new X-ray method for applications such as mammography,” Butler adds.

The possibility of smart phones acting as several-hundred-core personal devices may be closer than we think – Ullmer and the Melete team are certainly moving in that direction.


