Big data opens doors previously closed to researchers, yet the sheer volume of data sends scientists looking for analytical tools to bring order to the informational cacophony. Prior to tools like Bioconductor, there were few options for working with quantitative data types; the result was a discordant score for deciphering the human genetic code.
Today, genomic analysis machines create a common language for users and build a worldwide community that fosters developers from among subject matter experts. These instruments make beautiful music from a mass of genomic information.
Seven out of ten academics would be unable to do their work without research software, according to a survey carried out by the UK's Software Sustainability Institute. Some 56% of respondents report developing their own software, yet many of these researchers have no training in software development.
Physicists can tell the future — or at least foresee multiple possible versions of it. They do this through computer simulations. Simulations can help scientists predict what will happen when a particular kind of particle hits a particular kind of material in a particle detector. But physicists are not the only scientists interested in predicting how particles and other matter will interact. This information is critical in multiple fields, especially those concerned about the effects of radiation. Physicists and other scientists use the GEANT4 toolkit to identify problems before they occur.
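The core idea behind detector simulation toolkits like GEANT4 is Monte Carlo sampling: particles are tracked one at a time, with random draws deciding where each one interacts. The sketch below is not GEANT4 code (which is a large C++ toolkit); it is a minimal illustrative model, assuming a simple slab of material characterized by a single hypothetical mean free path, showing how random sampling recovers the expected exponential attenuation.

```python
import math
import random

def simulate_transmission(thickness_cm, mean_free_path_cm,
                          n_particles=100_000, seed=42):
    """Monte Carlo estimate of the fraction of particles that cross
    a slab without interacting (toy model, not a real physics engine)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_particles):
        # Distance to first interaction follows an exponential
        # distribution with the given mean free path.
        distance = rng.expovariate(1.0 / mean_free_path_cm)
        if distance > thickness_cm:
            transmitted += 1
    return transmitted / n_particles

# The estimate should approach the analytic result
# exp(-thickness / mean_free_path) as n_particles grows.
estimate = simulate_transmission(thickness_cm=2.0, mean_free_path_cm=5.0)
analytic = math.exp(-2.0 / 5.0)
print(f"simulated: {estimate:.3f}, analytic: {analytic:.3f}")
```

Real toolkits layer thousands of measured cross-sections and interaction channels on top of this same sampling loop, which is what lets them "foresee" how a detector or shield will respond before it is built.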
Next-generation sequencing (NGS), in which millions or billions of DNA nucleotides are sequenced in parallel, is the backbone of novel discoveries in life sciences, anthropology, social sciences, biomedical sciences and plant sciences. Read about the SoyKB and iPlant collaboration that is taking plant sciences to the next level.
Research output amounts to much more than just academic papers. It is important that datasets, and the software used to analyze them, are also properly cited and that the researchers behind these are given credit for their work. Preserving software and making it citable is also vital in ensuring that scientific results are reproducible by other research groups.
Last week, the results of a survey were published in the journal Science showing that many non-scientific factors often come into play when researchers select software for modeling and other purposes. Could researchers' inability to weigh the relative pros and cons of available software alternatives on their scientific merits be undermining the scientific method?
With access to significantly more computational power, researchers can provide more accurate earthquake predictions with the potential to save lives and minimize property damage. Read about advances in developing code to cut both research times and energy costs in simulating seismic hazards.
Calling all citizen scientists. With the exploding availability of data, the need for analysis is steadily becoming a bottleneck in many scientific pursuits. Read about a project aimed at bringing neuroscience to the masses in a way that may surprise and inspire you to take part.
There is a very real and growing disparity between the ability to capture data and the ability to analyze and visualize it, and turn it into usable intelligence. Read about efforts to aid organizations and agencies in making sense of what they see.
Using software to predict how proteins fold at the molecular level, scientists have discovered new information about misfolding and the submolecular-level energies involved. Read about the open source software used for simulations, and the potential implications for the treatment of degenerative diseases.
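The notion of an energy landscape underlies this kind of folding simulation: a protein settles into low-energy conformations, and misfolding corresponds to getting trapped in a shallow local minimum rather than the deeper native state. The toy sketch below is not the software described in the article; it is a cartoon with a hypothetical one-dimensional landscape, using plain gradient descent to show how different starting points settle into different minima.

```python
def energy(x):
    """Toy 1D energy landscape with a deep 'native' minimum near x = 1
    and a shallower 'misfolded' minimum near x = -1. (Illustrative only;
    real force fields sum many bonded and non-bonded terms.)"""
    return x**4 - 2 * x**2 - 0.3 * x

def gradient(x):
    # Analytic derivative of the toy energy function above.
    return 4 * x**3 - 4 * x - 0.3

def minimize(x0, step=0.01, iters=5000):
    """Plain gradient descent: follow the landscape downhill from x0."""
    x = x0
    for _ in range(iters):
        x -= step * gradient(x)
    return x

# Starting points on opposite sides of the energy barrier settle
# into different minima: a cartoon of native folding vs misfolding.
native = minimize(1.5)      # reaches the deep minimum near x = 1
misfolded = minimize(-1.5)  # trapped in the shallow minimum near x = -1
print(native, energy(native), misfolded, energy(misfolded))
```

Production folding codes replace this with molecular dynamics over millions of atoms, but the picture is the same: the energies separating minima are what determine whether a misfolded state persists.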
How much of an impact does temperature have on biological systems? Scientists at Oregon State University, US, are investigating. A grant from the US National Science Foundation will enable continuous thermal imaging from enzymes to ecosystems.
On Friday 1 February, 2013, CERN and Oracle celebrated 30 years of collaboration. In addition to providing hardware and software to CERN for three decades, Oracle has now been involved in the CERN openlab project for 10 years.
Four creative computer scientists are to be awarded a 'Tech Oscar' next month for creating an algorithm that enables film animators to create realistic smoke effects. It is set to be used in both the upcoming Iron Man 3 and Man of Steel movies.
In their initial phases of research on supernovae, two scientists at the University of Texas at Arlington, US, are trying something new – using SNSPH computer code to develop 3D simulations of a core-collapse supernova evolving into remnants.
Current medical practice lacks the ability to fully assess the risk of rupture for an abdominal aortic aneurysm. Moreover, many key parameters vary widely from person to person. Using the resources of the Pittsburgh Supercomputing Center, US, Ender Finol is developing computational models that help determine when surgical intervention is necessary.
As exponentially expanding sets of digitized text, audio, and image resources create new opportunities for research and scholarship in the humanities, the ability to visualize and explore these large data sets is critical to research. Find out about one open source tool advancing visualization for the humanities and revealing resolutions and scales never before seen.
Due to the complexity of modern computational science, software errors in code are increasingly causing the retraction of research papers from major journals. Now, the RunMyCode project offers a platform for reproducing a published paper's code and data that may not only reduce errors, but also open the door to better-quality science across all research fields.
If you are the kind of person who thinks software development is lifeless and dull, think again. A vivid method of visualizing programmers as they collaborate is taking shape on the web. 'Gource' displays software projects as an animated tree that changes dynamically as the respective project does, albeit sped up.