6.1 Overview of roles and responsibilities

Roles in metadata/statistical lifecycle management


In order to understand the user requirements, we engaged the survey divisions as pilot groups. These pilot groups verified our understanding of the requirements, which informed the design and implementation of the system, and they were also involved during User Acceptance Testing (UAT). 

The Survey Metadata Capture Tool can be used by different users depending on the roles they are assigned. For example, a Capturer can capture metadata, but it must be approved by an Approver, who is usually the supervisor or manager. There is also a Viewer role, with restricted rights: a Viewer can view metadata but cannot edit, change or approve it. 
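As an illustration only (the actual Survey Metadata Capture Tool implementation is not shown in this document), the three roles described above could be modelled as a simple permission matrix with a check function; the role and action names here are assumptions based on the description:

```python
# Hypothetical sketch of the role model described above; not the
# actual Survey Metadata Capture Tool code.
ROLE_PERMISSIONS = {
    "Capturer": {"view", "capture"},   # captures metadata, cannot approve
    "Approver": {"view", "approve"},   # supervisor/manager who approves captured metadata
    "Viewer":   {"view"},              # read-only access
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role may perform the given action."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

A design like this makes the separation of duties explicit: no single role holds both "capture" and "approve", so captured metadata always passes through a second pair of eyes.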

The network infrastructure for both the development and user environments is supported by the IT department. This includes configuring the environments as well as housing the various servers in the organization's data centre. The databases are also managed by the IT department. The ESDMF is based on the Linux open source operating system. Because the IT department does not yet have the skills to service and maintain this environment, we have outsourced these services to a private company. However, this is done in conjunction with the IT department, which is in the process of raising its skill level so that it can support the ESDMF in the Linux environment. 

During User Acceptance Testing (UAT), any identified defects were logged in the CA Unicentre system, which is used for IT help desk support. With the help of the IT help desk technicians, we customised the system so that the defect categories unique to the ESDMF could be recorded. 

The IT procurement group procured all the hardware and software used in the development and deployment of the system. 

The ESDMF was not developed in isolation from existing projects within Stats SA. For example, the following projects ran in parallel with the development of the ESDMF:

  • SAS 9 migration
  • Re-engineering of other surveys
  • Community Survey 2007
  • Census 2011

Some members of these projects were also involved in developing the requirements and reviewing the architecture of the ESDMF. The goal was to avoid working in isolation, to share knowledge, and to ease the integration of the new system with existing systems. 

Staff from the Methodology and Standards division were seconded to the ESDMF project. Their role was to develop policies, procedures and standards for the system. In our development process, policies are developed and approved first; the procedures and standards are then derived from them. For each phase, the policies guide the development and implementation of that phase's system deliverables. 

For example, for the first phase, we developed a policy for Data Quality and a policy for Metadata. As a result, Phase One was focused on capturing metadata (Metadata policy) in order to ensure quality of the output product (Data Quality policy). For the Second Phase, we already have approved policies for Concepts and Definitions as well as for Classifications.


6.2 Description of the team/individuals involved in development and maintenance of metainformation systems


System Developers

The deliverables expected from the supplier include a Skills Transfer Plan and Strategy. The goal is for the supplier to train Stats SA system developers in how the system is designed and implemented. At the end of the contract, these Stats SA developers should be able to maintain, upgrade and enhance the system, so that we are not dependent on the supplier for any development beyond the expiry of the contract. 

Data Quality Officers and Specialists

The Data Quality Officers and Specialists are trained in how to use the system. They are also trained to be trainers ("train the trainer"): the supplier's deliverables include training Stats SA Data Quality Officers and Specialists so that they, in turn, can train other users.


Methodology and Standards Professionals

The Methodology and Standards staff members provide support by developing Policies and Standards. They are subject matter experts in survey operations. They are also involved during the design phase in order to help explain and clarify the requirements.


Project Managers

The Stats SA project manager works closely with the supplier's project manager. They bridge the gaps between the two organizations and make sure that the deliverables are managed properly and on time.


6.3 Training and knowledge management

Users are required to spend at least a day in a training session that takes them through the functionality of the system and how to use it. 

The Training Manual is used during the training sessions and contains complete descriptions of the system. Users can also consult it for reference purposes. 

The system is designed so that tool tips (online help) appear when the user hovers over certain areas of the user interface. These tool tips explain the feature under the pointer, giving the user information directly at the point of need without having to go through the Training Manual.

6.4 Partnerships and cooperation

In Latvia, we learned that during the development of their system, their outsourced supplier took a while to understand the business of the statistical organization. So it came as no surprise, however unwelcome, when we ran into similar problems with our own supplier. 

Their Integrated Statistical Data Management System (ISDMS) uses Bo Sundgren's model of a metadata system, which gave them a firm foundation for the theoretical definition of metadata. We learned the importance of having a solid foundation in the definition of metadata.

In Ireland, we learned about the issues regarding communication between the customer and the supplier. Additionally, they had the same problem as Latvia in that the development of their system took longer than originally planned, even though Ireland had provided very detailed documentation on most of the major aspects of the system. Once again, we were not surprised, though we did not like it, when we ran into similar problems ourselves. 

In Slovenia, their metadata model is also based on Bo Sundgren's model, with modifications in areas where they believe their own components adequately meet Bo Sundgren's requirements for a metadata system. 
Their development model is to build the system in-house and to outsource once they reach the maintenance phase. They continuously re-skill and train their staff as they bring new technologies aboard. 

From New Zealand, we adopted a few practices. For example, we introduced the Statistical Value Chain into Stats SA; this is how we now view the business of statistical production within Stats SA. We also adopted their breakdown of metadata into five categories, namely definitional, operational, system, dataset and procedural/methodological metadata. One of their experts helped us evaluate the respondents to the tender for the development of the ESDMF. 

On our trip to Australia, we learned that a successful data warehouse project needs policies and standards that define how the system should be designed. When we returned to South Africa, we restructured the team into two groups: the Policies and Standards team and the Technology team. The Policies and Standards team developed the policies and standards that the Technology team used in the development and implementation of the ESDMF. 

Experts from Sweden occasionally came to Stats SA to advise us on various aspects of metadata and statistical production processes. For example, a few years ago Bo Sundgren, a well-known expert on metadata, came to Stats SA to advise us on how to proceed with the development of a metadata system. More recently, another expert from Stats Sweden conducted a workshop on SCBDOK, the Stats Sweden metadata template, as well as training on quality definition and quality declaration of official statistics. This gave us a better idea of how to develop a data quality template and how data quality should be reported. 

Last year (2006), we met Alice Born (from Stats Canada) when we attended the METIS conference. We engaged her regarding their development efforts of their metadata system, Integrated Metadata Data Base (IMDB). We applied that knowledge during the development of our Survey Metadata Capturing Tool. 

Consultants from Canada also help us on other projects within Stats SA. During their tenure, we engage them for advice and other consultation. 

We used the Corporate Metadata Repository (CMR) model by Dan Gillman, from the US Bureau of Labor Statistics, to deepen our understanding of the metadata model, especially with regard to the ISO/IEC 11179 specification. We also sent our metadata model to him and other metadata experts for review and critique.

6.5 Other issues

Organizational Change Management


Climate and Culture Assessment

Preliminary Organisational Change Management (OCM) initiatives necessitated a review of the operating culture at Stats SA in order to understand the 'lie of the land' into which the system would be introduced. The information contained in the Culture & Climate Assessment was obtained through a number of OCM diagnostic interventions targeted specifically at internal stakeholders, namely focus groups and an online survey on the Stats SA intranet website. 

A key challenge for Stats SA is to focus the organisation on the strategic importance of the DMID project: not only how it assists individuals in their immediate job functions but, more importantly, how it contributes to the wellbeing of South African society at large and to strategic decision making at government level. DMID communication messages need to create a sense of higher purpose to help individuals think strategically for the long term.


Change Readiness Assessment

A Change Readiness Assessment was conducted to determine the current capacity of Stats SA to change, and to identify areas of resistance towards DMID requiring Organisation Change Management (OCM) interventions. 

The Change Readiness Assessment was conducted via a survey and series of focus groups. 

The following 'change readiness dimensions' are integral to enable commitment towards DMID and formed the basis of the Change Readiness Assessment: 

  • Clear vision
  • Effective leadership
  • Positive experience with past change initiatives
  • Motivation to do the project
  • Effective communication
  • Adequate project team resources


What is Change Readiness?

OCM is a critical, although often bypassed element in organisations. It focuses on the 'human response to change', helping people understand, accept and commit to a new way of working. One of the key upfront steps in the change process is the Change Readiness Assessment. 


The Change Readiness Assessment is a process used to determine the levels of understanding, acceptance and commitment likely to affect the success of the planned change. Change readiness is gauged along an axis known as the Change Commitment Curve, depicted in Figure 13. 

 Figure 13: Change Commitment Curve 

As the DMID project phases roll out, different stakeholders will need to be at specific levels of commitment, depending on the role they play in the DMID project and their ability to influence the programme. The Change Commitment Curve provides a framework for understanding and tracking the level of commitment each stakeholder must reach, so that OCM interventions can be developed accordingly. 

A Change Readiness Assessment will become an obligatory OCM intervention prior to the rollout of a new phase on the DMID project.



The following were the findings from the assessments:

  • Members of Executive Management do not share a common understanding of the DMID project.
  • There is a lack of communication between management and subordinates, which makes it difficult for subordinates to understand the purpose of the project and the impact it will have on their working lives.
  • A lack of support from Executive Management will result in resistance and jeopardise the success of the project.
  • If management does not understand, communicate and promote the project, it will be difficult to deliver the message and get buy-in from staff across the organisation.

Next Steps from the Findings

The findings of the assessments identified where some of the key staff members sat on the Change Commitment Curve. In general, most were in the "Setting the Scene" and "Achieving Acceptance" area, bounded in time by "Contact" ("I know something is changing") and "Understanding" ("I know the implications for me"). Clearly, considerable effort is needed to move from that area to "Achieving Commitment", demonstrated by "Internalisation", wherein staff can say "This is the way I do things". 

Another outcome of these assessments was a Leadership Alignment workshop, in which the Executive Committee was presented with the findings and the path forward. The path forward is to ensure that the leadership understands the goals of the project and how they align with the vision of Stats SA. The leadership was also briefed on how to communicate a consistent message about the project.