109. The GSBPM recognises several overarching processes that apply throughout the production phases, and across statistical business processes. Some of these overarching processes are listed in Section II. The processes of quality management, metadata management and data management are further elaborated in this Section.

Quality Management

110. Quality concerns organisations, products, sources and processes. In the present framework, the quality management overarching process refers to product and process quality. Quality at the institutional level (e.g. the adoption of a Quality Policy or a Quality Assurance Framework) is considered in the GAMSO.

111. The main goal of quality management within the statistical business process is to understand and manage the quality of the statistical sources, processes and products. There is general agreement among statistical organisations that quality should be defined according to the ISO 9000:2015 standard: “The degree to which a set of inherent characteristics of an object fulfils requirements”¹. Thus, quality is a complex and multi-faceted concept, usually defined in terms of several quality dimensions. The dimensions of quality that are considered most important depend on user perspectives, needs and priorities, which vary between processes and across groups of users.

112. In order to improve quality, quality management should be present throughout the business process model. It is closely linked to the “Evaluate” phase; however, quality management has both a deeper and a broader scope. As well as evaluating iterations of a process, it is also necessary to evaluate separate phases and sub-processes, ideally each time they are applied, but at least according to an agreed schedule. Metadata generated by the different sub-processes themselves are also of interest as an input for process quality management. These evaluations can apply within a specific process, or across several processes that use common components.

In addition, the set of quality control actions that should be implemented within the sub-processes to prevent and monitor errors and sources of risk plays a fundamental role in quality management. These actions should be documented, and can be used for quality reporting.

113. Within an organisation, quality management will usually refer to a specific quality framework, and may therefore take different forms and deliver different results within different organisations. The current multiplicity of quality frameworks enhances the importance of the benchmarking and peer review approaches to evaluation. Whilst these approaches are unlikely to be feasible for every iteration of every part of every statistical business process, they should be used in a systematic way, according to a pre-determined schedule that allows for the review of all main parts of the process within a specified time period².

114. Broadening the field of application of the quality management overarching process, evaluation of groups of statistical business processes can also be considered, in order to identify potential duplication or gaps.

115. All evaluations result in feedback, which should be used to improve the relevant process, phase or sub-process, creating a quality loop that reinforces the approach to continuous improvements and organisational learning.

116. Examples of quality management activities include:

  • Assessing risks and implementing risk treatments to ensure fit-for-purpose quality;
  • Setting quality criteria to be used in the process;
  • Setting process quality targets and monitoring compliance;
  • Seeking and analysing user feedback;
  • Reviewing operations and documenting lessons learned;
  • Examining process metadata and quality indicators;
  • Internal or external auditing on the process.

117. Quality indicators support process-oriented quality management. A suggested list of quality indicators for the phases and sub-processes of the GSBPM, as well as for the overarching quality and metadata management processes, can be found in the Quality Indicators for the GSBPM – for Statistics derived from Surveys and Administrative Data Sources³. Among other uses, these indicators can serve as a checklist to identify gaps and/or duplication of work in the organisation.
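
To make the idea of process quality indicators concrete, the sketch below computes two commonly used indicators, a unit response rate and an item imputation rate, and compares them against pre-set targets. This is an illustration only: the figures and target thresholds are hypothetical, not values taken from the GSBPM indicator list.

```python
# Sketch only: indicator values and targets below are hypothetical examples.

def unit_response_rate(n_responding: int, n_eligible: int) -> float:
    """Share of eligible units that responded (a common quality indicator)."""
    return n_responding / n_eligible

def item_imputation_rate(n_imputed: int, n_reported: int) -> float:
    """Share of values for an item that had to be imputed."""
    return n_imputed / (n_imputed + n_reported)

response = unit_response_rate(870, 1000)    # 0.87
imputation = item_imputation_rate(45, 855)  # 45 / 900 = 0.05

# Compare against pre-set process quality targets (cf. paragraph 116):
# response should be at least its target, imputation at most its target.
meets_targets = response >= 0.80 and imputation <= 0.10
```

Monitoring such indicators over successive iterations of the process is one way of closing the quality loop described in paragraph 115.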

Metadata Management

118. Metadata have an important role and must be managed at an operational level within the statistical production process. When aspects of metadata management are addressed at a corporate or strategic level (e.g. metadata systems that affect large parts of the production system), they should be considered in the framework of the GAMSO.

119. Good metadata management is essential for the efficient operation of statistical business processes. Metadata are present in every phase: they are created, updated, carried forward from a previous phase, or reused from another business process. In the context of this model, the emphasis of the overarching process of metadata management is on the creation/revision, updating, use and archiving of statistical metadata, though metadata on the different sub-processes themselves are also of interest, including as an input for quality management. The key challenge is to ensure that these metadata are captured as early as possible, and stored and transferred from phase to phase alongside the data they refer to. A metadata management strategy and supporting systems are therefore vital to the operation of this model, and these can be facilitated by the GSIM.

120. The GSIM is a reference framework of information objects, which enables generic descriptions of the definition, management and use of data and metadata throughout the statistical production process. The GSIM supports a consistent approach to metadata, facilitating the primary role of metadata: to uniquely and formally define the content of, and the links between, the information objects and processes in the statistical information system.

121. The METIS Common Metadata Framework identifies the following sixteen core principles for metadata management, all of which are intended to be covered in the overarching metadata management process, and taken into consideration when designing and implementing a statistical metadata system. The principles are presented in four groups:

Metadata handling

  1. Statistical Business Process Model: Manage metadata with a focus on the overall statistical business process model;
  2. Active not passive: Make metadata active to the greatest extent possible. Active metadata are metadata that drive other processes and actions. Treating metadata this way will ensure they are accurate and up-to-date;
  3. Reuse: Reuse metadata where possible for statistical integration as well as efficiency reasons;
  4. Versions: Preserve history (old versions) of metadata.
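
The “Versions” principle above can be illustrated with a minimal append-only store, in which updates never overwrite earlier values. This is a sketch only; the class name and the variable identifier are hypothetical, not part of any GSBPM or METIS specification.

```python
from datetime import datetime, timezone

class VersionedMetadata:
    """Append-only store: each update adds a new version, old ones survive."""

    def __init__(self):
        self._versions = {}  # element_id -> list of (timestamp, value)

    def update(self, element_id: str, value: dict) -> None:
        stamp = datetime.now(timezone.utc).isoformat()
        self._versions.setdefault(element_id, []).append((stamp, value))

    def current(self, element_id: str) -> dict:
        """Latest version of a metadata element."""
        return self._versions[element_id][-1][1]

    def history(self, element_id: str) -> list:
        """All preserved versions, oldest first."""
        return list(self._versions[element_id])

store = VersionedMetadata()
store.update("VAR_TURNOVER", {"label": "Turnover", "unit": "EUR thousands"})
store.update("VAR_TURNOVER", {"label": "Turnover", "unit": "EUR millions"})
```

After the second update, `current()` returns the revised unit while `history()` still exposes the original definition, which is the behaviour the principle asks for.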

Metadata Authority

  1. Registration: Ensure the registration process (workflow) associated with each metadata element is well documented so there is clear identification of ownership, approval status, date of operation, etc.;
  2. Single source: Ensure that a single, authoritative source (“registration authority”) for each metadata element exists;
  3. One entry/update: Minimise errors by entering once and updating in one place;
  4. Standards variations: Ensure that variations from standards are tightly managed/approved, documented and visible.

Relationship to Statistical Cycle / Processes

  1. Integrity: Make metadata-related work an integral part of business processes across the organisation;
  2. Matching metadata: Ensure that metadata presented to the end-users match the metadata that drove the business process or were created during the process;
  3. Describe flow: Describe metadata flow with the statistical and business processes (alongside the data flow and business logic);
  4. Capture at source: Capture metadata at their source, preferably automatically as a by-product of other processes;
  5. Exchange and use: Exchange metadata and use them to inform both computer-based processes and human interpretation. The infrastructure for the exchange of data and associated metadata should be based on loosely coupled components, with a choice of standard exchange languages, such as XML.
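
The “Exchange and use” principle mentions XML as a standard exchange language. The sketch below serialises a small metadata element to XML with the Python standard library; the element names are an ad-hoc illustration, whereas a real exchange would normally use an established standard vocabulary such as SDMX or DDI.

```python
import xml.etree.ElementTree as ET

def metadata_to_xml(element_id: str, attrs: dict) -> str:
    """Serialise one metadata element to an XML string (hypothetical tags)."""
    root = ET.Element("metadataElement", {"id": element_id})
    for name, value in attrs.items():
        child = ET.SubElement(root, name)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_doc = metadata_to_xml("VAR_TURNOVER",
                          {"label": "Turnover", "unit": "EUR thousands"})

# A receiving, loosely coupled component can parse it back independently.
parsed = ET.fromstring(xml_doc)
```

Because both sides depend only on the exchange format, the producing and consuming components can evolve separately, which is the point of loose coupling.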

Users

  1. Identify users: Ensure that users are clearly identified for all metadata processes, and that all metadata capture creates value for them;
  2. Different formats: Recognise the diversity of metadata and provide different views corresponding to the different uses of the data. Different users require different levels of detail, and metadata appear in different formats depending on the processes and goals for which they are produced and used;
  3. Availability: Ensure that metadata are readily available and usable in the context of the users’ information needs (whether internal or external users).

Data Management

122. Data management is essential as data are produced within many of the activities in the statistical business process and are the key outputs. The main goal of data management is to ensure that data are appropriately used and usable throughout their lifecycle. Managing data throughout their lifecycle covers activities such as planning and evaluation of data management processes as well as establishing and implementing processes related to collection, organisation, use, protection, preservation and disposal of the data.

123. How data are managed will be closely linked to the use of the data, which in turn is linked to the statistical business process where the data are created. Both data and the processes in which they are created must be well defined in order to ensure proper data management.

124. Examples of data management activities include:

  • Establishing a governance structure and assigning data stewardship responsibilities;
  • Designing data structures and associated data sets, and the flow of data through the statistical business process;
  • Identifying databases (repositories) in which to store the data, and administering those databases;
  • Documenting the data (e.g. registering and inventorying data, classifying data according to content, retention or other required classification);
  • Determining retention periods of data;
  • Securing data against unauthorised access and use;
  • Safeguarding data against technological change, physical media degradation, data corruption;
  • Performing data integrity checks (e.g. periodic checks providing assurance about the accuracy and consistency of data over their entire lifecycle);
  • Performing disposition activities once the retention period of the data has expired.
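
One of the activities above, periodic data integrity checking, can be sketched with a simple checksum scheme: record a digest when a data set is archived and recompute it later to detect corruption. This is an illustration only; real archives would typically record digests in a catalogue alongside other preservation metadata.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def check_integrity(path: str, recorded_digest: str) -> bool:
    """True if the file still matches the digest recorded at archiving time."""
    return sha256_of(path) == recorded_digest
```

A mismatch between the recomputed and recorded digests signals silent corruption or media degradation, triggering restoration from a backup copy.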

  1. ISO 9000:2015, Quality management systems - Fundamentals and vocabulary. International Organization for Standardization
  2. A suitable global framework is the National Quality Assurance Framework developed by a global expert group under the United Nations Statistical Commission (http://unstats.un.org/unsd/dnss/QualityNQAF/nqaf.aspx)
  3. UNECE Statistics Wikis - Quality Indicators for the GSBPM (Quality Indicators)