7 December 2011

For the last 40 years, large-scale services, such as global banks, and scientific experiments, such as the LHC at CERN, have managed their data in a way reminiscent of Lyman Frank Baum's The Wonderful Wizard of Oz: there is only one tried-and-tested path, a 'yellow brick road', and it is relational databases. Over the last couple of years, however, new non-relational database models have emerged that may represent the next step in the evolution of data management.
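
As a rough sketch (not drawn from the article), the contrast between the two models can be seen in a few lines of Python: a relational store such as SQLite demands a schema before any data is written, while a document-style store accepts self-describing records. The 'events' table and its fields below are invented purely for illustration.

    import json
    import sqlite3

    # Relational model: the schema is fixed up front and every row must fit it.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, source TEXT, energy REAL)")
    db.execute("INSERT INTO events (source, energy) VALUES (?, ?)", ("detector-1", 7.0))
    print(db.execute("SELECT source, energy FROM events").fetchall())

    # Document model, mimicked here with a plain dict: each record describes
    # itself, so adding a new field needs no schema migration.
    store = {}
    store["event-1"] = {"source": "detector-1", "energy": 7.0, "run": "calibration"}
    print(json.dumps(store["event-1"]))

In broad strokes, the relational side buys strong consistency and rich querying, while the non-relational side buys schema flexibility and easier horizontal scaling.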

What is high-throughput parallel computing (HTPC), and what does it take to make a grid HTPC-ready? Read on to find out!

Researchers have made the grid easier for the Polish grid community to use by digitizing the process of obtaining X.509 certificates.
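
The article does not describe the new online process itself, but as a hedged sketch of what requesting an X.509 certificate involves, the snippet below uses Python's third-party cryptography package to generate a key pair and a certificate signing request (CSR) for a certificate authority to sign; the subject names are placeholders, not PL-Grid's actual fields.

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.x509.oid import NameOID

    # Generate the user's key pair; the private key never leaves their machine.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Build a certificate signing request with placeholder subject fields.
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([
            x509.NameAttribute(NameOID.COMMON_NAME, "Example User"),
            x509.NameAttribute(NameOID.ORGANIZATION_NAME, "Example Grid Organization"),
        ]))
        .sign(key, hashes.SHA256())
    )

    # The PEM-encoded CSR is what gets submitted to the certificate authority.
    print(csr.public_bytes(serialization.Encoding.PEM).decode())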

You Might Have Missed

Discover how the DNANANO project has been using the Curie supercomputer — a PRACE tier-0 system — to help design nanocages for targeted drug delivery.

Simulating one of these nanocages for just 100 nanoseconds would take...

At the recent Internet2 Global Summit, iSGTW sat down with George Komatsoulis to talk about the state of distributed research and the NIH Commons, a scalable virtual environment providing high-performance computing and data storage for bio-...

Carnegie Mellon University computer scientists turned to Blacklight, the Pittsburgh Supercomputing Center's supercomputer, to build Claudico, a poker-playing artificial intelligence. Claudico came up short against the world's best poker...
