This paper, written by Antonino Virgillito, Monica Scannapieco and Diego Zardetto of ISTAT for the 2013 New Techniques and Technologies for Statistics conference in Brussels, focuses in particular on the different kinds of challenges posed by the use of Big Data in official statistics: storage, processing, quality, timing and access.


"Big Data refers to data sets that are impossible to store and process using common software tools, regardless of the computing power or the physical storage at hand. The Big Data phenomenon is rapidly spreading in several domains including healthcare, communication, social sciences and life sciences, just to cite some relevant examples. However, the opportunities and challenges of Big Data in Official Statistics are a matter of an open debate that involves both statisticians and IT specialists of National Statistical Institutes.

In this paper, we first analyse the concept of Big Data under the Official Statistics perspective by identifying its potentialities and risks. Then, we provide some examples that show how NSIs can deal with such a new paradigm by adopting Big Data technologies, on one side, and rethinking methods to enable sound statistical analyses on Big Data, on the other side."

The full paper can be found at
