
Data-intensive research

Big Data is news

Big data, the collection and analysis of very large amounts of data, is in the news:

The tide is high


In the US, 2.5 petabytes of data are stored annually just for mammograms.

The volume of earth-observation data from the European Space Agency's satellites passed three petabytes in 2007. The projection for 2020? A seven-fold rise.

Riding the Wave ... or swamped by a tsunami?

Knowledge Exchange's A Surfboard for Riding the Wave report calls for a collaborative data infrastructure to enable researchers to use, re-use and exploit research data, to the benefit of scientific research and society more generally.

Sip some NecTAR

The Federal Government has funded a number of initiatives aimed at boosting Australian research infrastructure. The $50m Research Data Storage Infrastructure project is one, and the $47m National eResearch Collaboration Tools and Resources (NeCTAR) project is another.

Take up the challenge

If you had to nominate where money would be best spent supporting data-intensive science, what would you suggest? If you have a good answer, the Gordon and Betty Moore Foundation's Science Program wants to hear about it. The foundation has money and wants to spend it wisely, so it has launched the Data Intensive Science Request for Ideas. Users can register their own ideas and vote on the suggestions of others. Based in San Francisco, the Gordon and Betty Moore Foundation is a private grant-making body.

Can't share, won't share?

Data sharing has always been part of the scientific method, since it allows other researchers to verify and build on existing results.

From little things, big things grow

"We have to do better at producing tools to support the whole research cycle, from data capture and data curation to data analysis and data visualization. Today, the tools for capturing data both at the mega-scale and at the milli-scale are just dreadful." (Jim Gray, The Fourth Paradigm)