
7 December 2011

For the last 40 years, the way that large-scale services, such as global banks, and scientific experiments, such as the LHC at CERN, have managed their data has been reminiscent of Lyman Frank Baum's The Wonderful Wizard of Oz: there is only one tried-and-tested path, a 'yellow brick road', and it is relational databases. Over the last couple of years, however, new non-relational database models have emerged that may represent the next step in the evolution of data management.
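To make the contrast concrete, here is a minimal sketch of the two models in Python, using the standard library's sqlite3 module for the relational side and plain dictionaries as a stand-in for a document-oriented store such as MongoDB; the detector and energy fields are hypothetical examples, not drawn from the article.

    import sqlite3

    # Relational model: the schema is declared up front, and every row
    # must conform to it. (sqlite3 ships with Python's standard library.)
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, detector TEXT, energy REAL)")
    conn.execute("INSERT INTO events (detector, energy) VALUES (?, ?)", ("ATLAS", 7.0))
    print(conn.execute("SELECT detector, energy FROM events").fetchall())

    # Document model: each record is self-describing, so two records in
    # the same collection may carry different fields without a schema change.
    events = [
        {"detector": "ATLAS", "energy": 7.0},
        {"detector": "CMS", "energy": 8.0, "tags": ["calibration"]},  # extra field, no migration needed
    ]
    print([e["detector"] for e in events])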


What is high throughput parallel computing, and what does it take to make a grid HTPC-ready? Read on to find out!


Researchers have made the grid easier for the Polish grid community to use by digitizing the process of obtaining X.509 certificates.
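For context, obtaining an X.509 certificate typically starts with the user generating a key pair and a certificate signing request (CSR), which a certificate authority then signs. The sketch below shows roughly what that step looks like with Python's third-party cryptography library; the subject fields are hypothetical placeholders, and the actual Polish grid workflow is not detailed here.

    # A minimal sketch of generating a key pair and an X.509 certificate
    # signing request (CSR), using the third-party 'cryptography' library.
    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.x509.oid import NameOID

    # Generate the user's private key (kept locally, never sent to the CA).
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Build the CSR, which the certificate authority signs to issue the certificate.
    # The subject attributes below are hypothetical placeholders.
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([
            x509.NameAttribute(NameOID.COUNTRY_NAME, "PL"),
            x509.NameAttribute(NameOID.ORGANIZATION_NAME, "Example Grid VO"),
            x509.NameAttribute(NameOID.COMMON_NAME, "Jan Kowalski"),
        ]))
        .sign(key, hashes.SHA256())
    )

    # Serialize the CSR to PEM for submission to the certificate authority.
    print(csr.public_bytes(serialization.Encoding.PEM).decode())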

Spotlight

You Might Have Missed

 

The financial crisis that began in 2007 provided a vivid demonstration of the unstable nature of our global financial system. Yet academics from the Zurich University of Applied Sciences in Switzerland argue that a lack of globally agreed...


After serving its original purpose, data is often set aside to live out the rest of its days in obscurity. It's lying around in all sorts of formats — hand-written notes, scanned images, unstructured databases — up to 90% of which...


Hosted at CERN, UNITAR's UNOSAT program examines global satellite imagery for humanitarian use. Whether it is providing maps for disaster response teams or assessing conflict damage to help reconstruction, its detailed reports are...


Professor Stephan Roche of the Catalan Institute of Nanoscience and Nanotechnology in Spain is one of the early adopters of high-performance computing in Europe’s Graphene Flagship initiative. His latest project has received 22 million core...
