
7 December 2011

For the last 40 years, the way that large-scale services, such as global banks, and scientific experiments, such as the LHC at CERN, have managed their data has been reminiscent of Lyman Frank Baum's The Wonderful Wizard of Oz: there is only one tried-and-tested path, a 'yellow brick road', and it is relational databases. But over the last couple of years, some new, non-relational database models have emerged and may represent the next step in the evolution of data management.


What is high throughput parallel computing, and what does it take to make a grid HTPC-ready? Read on to find out!


Researchers have made the grid easier to use for the Polish grid community by digitizing the process of obtaining X.509 certificates.

Spotlight

You Might Have Missed


iSGTW recently attended the ISC Cloud '14 conference in Heidelberg, Germany. The event focused on the intersection between cloud and high-performance computing (HPC), with in-depth first-hand reports on the latest technological...


ESnet is deploying four new high-speed transatlantic links to London, Amsterdam, and Geneva. The high-speed links will give researchers at US universities and national laboratories ultra-fast access to scientific data from the Large Hadron Collider...


The US National Science Foundation aims to improve the nation's capacity in data science by investing in infrastructure development, increasing the number of data scientists, and making data more useful and easier to work with. Read about the...


For scientists looking to complete large, complex, data-driven research projects quickly, living cyberinfrastructure can be a powerful solution. This is a different way of working for most scientists; applying for time on a machine does not...
