Contact person* | |
---|---|
Job title | |
Telephone | +27-12-310-8911 |
Metadata strategy
.Current situation
High-Level Organisational Structure
Number of Staff: ± 2,000
Figure 1: Stats SA Organization Chart
The Data Management and Information Delivery (DMID) project (magenta shaded box) is located within the Data Management and Technology Division (DMT).
The yellow shaded boxes indicate the ongoing projects that are concurrent with the DMID project.
The following chart shows how the DMID project is structured:
Figure 2: The DMID Project Structure
The following chart shows how the DMID project is structured, including the supplier's resources:
Figure 3: DMID Organization Chart
Prescient Business Technologies (PBT) - the supplier to the DMID project, which is developing the ESDMF system.
ESDMF - End-to-end Statistical Data Management Facility.
PM - Project Manager.
Number of staff:
- Stats SA: 1 Project Manager, 1 Technical Lead/Project Manager, 7 Developers, 1 Chief Standards Officer, 6 Data Quality officers/specialists, 3 Methodologists, 1 Systems Analyst, DBA (as needed), Network Support Technician (as needed) (20 total - excluding "as needed")
- PBT: 1 Project Manager, 1 Technical Lead (50%), 1 Architect, 1 Business Analyst (50%), 1 Release Coordinator, 1 Trainer, 1 Organisational Change Management Lead, 3 Developers, 1 Account Manager (10 total)
Strategy
Statistics South Africa's development of the metadata management system has its origins in the organisation's requirement to develop a data warehouse. The idea of a data warehouse came about because the organisation wanted to improve the quality of the statistics produced. It was believed that the data warehouse would play a major role in positioning the organisation within its vision of becoming the "preferred supplier of quality statistics". To begin our data warehouse initiative, we paid exploratory visits to various statistical organisations that had embarked on data warehouse developments in order to learn from their experiences. These visits taught us a number of things about the complexities, difficulties and peculiarities of developing a data warehouse. In particular, our visit to the Australian Bureau of Statistics showed us that for a data warehouse to have any chance of succeeding in a statistical organisation, it needs to have a strong foundation of standards and policies that govern the statistical production processes. Standardisation of concepts and their definitions, as well as of the classifications used in the actual survey process, was found to be necessary for the production of quality statistics. For it to be successful, a data warehouse also needs to operate in this environment.
A formal process for standardisation was developed through consultation with standards experts, and a standards development and implementation lifecycle was established to monitor the standardisation process. The following figure shows the standards development lifecycle.
Figure 4: Standards Lifecycle
The next step was to investigate the strength of our standards and policy foundation. This investigation identified a number of gaps, chief among them the lack of standard metadata in the organisation. The need for standardisation of metadata necessitated the development of a metadata management system. However, this had to form a good mix with all the other identified ingredients necessary for the production of quality statistics.
Strategically, our metadata management system forms part of a larger system of applications called the End-to-end Statistical Data Management Facility (ESDMF). As an end-to-end system, the ESDMF will consist of tools and applications to support the whole statistical production process. Within this facility exists a metadata subsystem (refer to figure 5), which plays a central role as the ESDMF was conceived to be metadata driven. In a statistical organisation, a metadata driven system is inevitable because metadata is used and generated at every stage of the statistical production process.
Figure 5: Conceptual components of the ESDMF
As a data factory, a statistical organisation needs to organise and package data in ways that make it useful to the end user. Produced data must also meet certain minimum quality standards. Metadata is used to satisfy both these requirements. In packaging its data and statistical products, a statistical organisation must ensure that metadata is attached to them for ease of analysis and interpretation by users. Metadata also play a key role in ensuring that the end products of this data factory are of good quality; such metadata include descriptions of concepts used in the organisation, classifications of these concepts, methodologies and business rules.
The development of a metadata management system was informed by the following principles:
- Maintenance of trust in official statistics: Descriptions of data collection methods, data processing and storage need to form part of how statistical data are presented to the end user. When presented like this, statistical data and products engender trust in their users.
- Facilitation of correct interpretation of statistical data: Metadata accompanying datasets and other statistical products enable users to interpret them correctly.
- Quality of statistics: Standard metadata contributes to the improvement of a number of quality dimensions. Standardisation of concepts and their definitions and classifications are essential ingredients of standardized metadata.
Programme Providing Frame for Stats SA Projects
The work of all Stats SA components is mapped out in the organisation's Work Programme. Organisational units must support the following strategic themes to advance the work of the organisation:
- Providing Relevant Statistical Information to meet user Needs
- Enhancing the Quality of Products and Services
- Developing and Promoting Statistical Coordination and Partnerships
- Building Human Capacity
This project is aimed at supporting the strategic theme "Enhancing the Quality of Products and Services". Within the DMID project, the metadata management system, more than any of its components, addresses this strategic theme.
Overall Project Objective
Statistics South Africa's metadata management system forms part of the organisation's broader objective to continuously improve the quality of its products. As the driver of the overall facility, the metadata management system is the first deliverable of the DMID project. The metadata management system is itself divided into smaller logical units based on the organisation's classification of its metadata. Survey metadata, consisting of elements that provide the overall description of a statistical survey, is the first of these metadata deliverables. The survey metadata component is fashioned along the lines of Statistics Canada's Integrated Metadata Database (IMDB) Metastat.
Following the survey metadata component will be the definitional metadata component. This will incorporate into the metadata management system the standardised organisation-wide concepts and their definitions and classifications as well as other components that form part of definitional metadata.
Metadata Classification
The essence of Stats SA's meta-information system is captured by how the organisation uses metadata. Metadata is used internally to enable statistical production processes: it serves as essential input at various stages of statistical production. The production processes, in turn, produce metadata. This metadata is also important for documenting the trail of activities during the statistical production process, and this documentation informs related issues such as the assessment of data quality and its interpretation.
Categories of Metadata
Because of this diversity of metadata usage, it was decided that contents of the meta-information system should be aligned with these usage activities. The natural progression of this decision was to undertake a project to classify all of the organisation's metadata. The following is a list of the categories of metadata adopted by Stats SA:
- Survey Metadata
Often referred to as dataset metadata, survey metadata is used to describe, access and update datasets and data structures. Stats SA chose to call this type of metadata survey rather than dataset metadata because some of the metadata, such as information about "the population which the data describe", refer to broader aspects of the survey, and not only the dataset.
- Definitional Metadata
This is metadata describing the concepts used in producing statistical data. These concepts are often encapsulated in the measurement variables used to collect statistical data. Descriptive text is used to define individual concepts; the concepts are further grouped into logical topics, and these main topics are effectively classifications of data. Hence, Stats SA's package of definitional metadata includes classifications drawn from different study domains.
- Methodological Metadata
These metadata relate to the procedures by which data are collected and processed. They may include sampling, collection methods, editing processes, etc.
- System Metadata
System metadata refers to active metadata used to drive automated operations. Some of the examples of system metadata are:
- Publication or dataset identifiers
- Date of last update
- File size
- Mapping between logical names and physical names of files
- Dataset input flows
- Access methods to databases
- Coordinates as kept in metadata store
- Table and column definitions, schema and mappings of data
- Operational Metadata
This is metadata arising from, and summarising the results of, implementing the procedures. Examples include respondent burden, response rates, edit failure rates, costs, and other quality and performance indicators.
The different components of Stats SA's meta-information system are logically grouped according to these categories of metadata. This means that the database for the meta-information system has different data structures corresponding to these metadata categories. We have recently (June 2007) finished developing the first metadata component, the survey metadata capturing tool, which is the subject of this case study.
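Since the meta-information system's components and database structures are grouped by these categories, the categories themselves can be modelled as a simple enumeration. The following is purely an illustrative Java sketch; the type name and description strings are ours, not taken from the actual ESDMF code.

```java
// Hypothetical sketch: Stats SA's five metadata categories as a Java enum.
// Names and descriptions are illustrative, not the real ESDMF code.
public class MetadataCategories {
    public enum MetadataCategory {
        SURVEY("Describes, accesses and updates datasets and data structures"),
        DEFINITIONAL("Concepts, their definitions and classifications"),
        METHODOLOGICAL("Procedures by which data are collected and processed"),
        SYSTEM("Active metadata used to drive automated operations"),
        OPERATIONAL("Results of implementing the procedures, e.g. response rates");

        private final String description;

        MetadataCategory(String description) { this.description = description; }

        public String description() { return description; }
    }

    public static void main(String[] args) {
        // List every category with its short description.
        for (MetadataCategory c : MetadataCategory.values()) {
            System.out.println(c + ": " + c.description());
        }
    }
}
```

In a database schema, each category would then correspond to its own group of tables, mirroring the logical grouping described above.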
How Metadata Fit into Other Organisational Systems
As already stated, the development of Statistics South Africa's metadata management system (Meta-Information system) is part of a larger system, the ESDMF. The central components of the ESDMF will follow the completion of the meta-information system, because the ESDMF is driven by the metadata. Although the ESDMF is a new system, it is merely a means to centralize the organisation's disparate statistical information systems. Figure 6 below shows the conceptual ESDMF subsystems and how they are placed relative to other organizational subsystems. The metadata subsystem supports the entire statistical cycle.
Figure 6: Conceptual components for the ESDMF in relation to other subsystems
Metadata system(s)
.Costs and Benefits
.Implementation strategy
.IT Architecture
Overview of Stats SA's IT Architecture
Stats SA's IT environment, within which the ESDMF is developed, requires systems to adhere to the following architectural principles:
- Integration
The system must integrate with other organisational systems. APIs will be built for the various applications that need to connect to the ESDMF; however, most of the connection is expected to be at the data level. With the exception of SAS, the organisation uses relational databases, and integration at this level is attained using an ODBC connection. SAS supports ODBC and, in addition, has native support for various databases.
- Interoperability
To ensure interoperability, the ESDMF uses Java as a development standard because of its platform independence. The development of the system as a web application also means that only a web browser is needed to access the application.
- Modularity
The development of all the components of the ESDMF is based on the organisational requirement for building modular systems that allow ease of management and flexibility. The metadata management system is modularised according to the different categories of metadata.
- Scalability
Stats SA's computer applications have to be built such that they can scale up to accommodate the inevitable growth of the organisation. Both the database designs and the storage hardware for all components of the ESDMF are developed to cater for such growth.
- Flexibility
Applications must meet the diverse needs of Stats SA. These needs change with time, and new ones are discovered. The development of flexible applications that may be easily changed or extended is vital. The insistence on the use of object-oriented programming was partly informed by the need for flexibility, as it minimises the "spaghetti programming" associated with large software projects.
IT Infrastructure Specification
The metadata management system is deployed in an IT infrastructure with a set of minimum specifications. These minimum specifications list the hardware items needed to run the system without going into details of the hardware items themselves.
- Operating System(s)
Desktops run Microsoft Windows. The application is deployed on an open-source operating system (Novell SuSE Linux).
- Computer Network
The network architecture is based on open protocols and industry standards. It allows remote access for some employees and supports both local area (LAN) and wide area (WAN) networks.
- Computer Servers
The system is developed as a client-server application. This means that there is a need for powerful computer servers capable of handling intensive processing.
- Storage
Because of the vast volume of data to be generated and/or captured in the system, there is a need for a well-managed storage system. A Storage Area Network (SAN) is the technology used at Stats SA to provide storage management.
A. Development Environment

Function | Make/Model | Operating System/ | Comment |
---|---|---|---|
Application Server | HP BL45p | SuSe Linux Ver. 10 | Make/Model exceeds recommendation |
Database Server | HP BL45p | Oracle 10g or | Make/Model exceeds recommendation |
Build Server | HP DL 320 | SuSe Linux Ver. 10 | Make/Model exceeds recommendation |

B. User Acceptance Test (UAT) Environment

Function | Make/Model | Operating System/ | Comment |
---|---|---|---|
Application Servers | 2 x HP BL45p | SuSe Linux Ver. 10 | Make and model exceeds recommendation |
Database Servers | 2 x HP BL45p | Oracle 10g or | |

C. Production Environment

Function | Make/Model | Operating System/ | Comment |
---|---|---|---|
Application Servers | 2 x HP BL45p | SuSe Linux Ver. 10 | Make and model exceeds recommendation |
Database Servers | 2 x HP BL45p | Oracle 10g or | |
Table 4: Hardware and software specifications for the ESDMF infrastructure
Figure 10: Hardware and software specifications for the ESDMF infrastructure
Components of Metadata Management Application
The application is web-based and developed in Java. Tomcat is used to implement Java Servlet API and HTTP functionality. The following are physical divisions of the application:
- User Interface (UI)
The user interfaces for all the metadata management system applications are web-based. This allows us to deploy the tool quickly to users in the organisation; client workstations only need a web browser to access the server-based applications. The main supported web browsers are Microsoft Internet Explorer and Firefox.
- Database
The application is supported by a relational database management system (RDBMS). Stats SA uses a variety of RDBMS engines. The target RDBMS engine for this project is Sybase 12.5.x, although the project is currently using the open-source RDBMS MySQL.
- Business logic
The business logic controlling the interaction between the UI and the underlying database is coded using Java server side scripting. There is also business logic coded using stored procedures. This mostly performs housekeeping within the database.
- Application/Web Server
The application is served to the client via Tomcat, which processes Java code. Tomcat also handles HTTP calls from the web browser.
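As a rough illustration of the physical divisions listed above, the following Java sketch keeps the business logic separate from the data access layer behind an interface, much as a servlet-based application would. This is a hedged sketch only: all class and method names are hypothetical, and an in-memory map stands in for the real RDBMS (where the data access code would instead issue JDBC calls and invoke stored procedures) so the example is runnable.

```java
// Illustrative layering sketch; not the actual ESDMF code.
import java.util.HashMap;
import java.util.Map;

public class LayeredSketch {
    /** Data-access layer: in the real system this would wrap JDBC calls
     *  (and stored procedures used for database housekeeping). */
    interface SurveyMetadataRepository {
        void save(String surveyId, String description);
        String find(String surveyId);
    }

    /** In-memory stand-in for the RDBMS so the sketch runs without a database. */
    static class InMemoryRepository implements SurveyMetadataRepository {
        private final Map<String, String> rows = new HashMap<>();
        public void save(String surveyId, String description) { rows.put(surveyId, description); }
        public String find(String surveyId) { return rows.get(surveyId); }
    }

    /** Business-logic layer: validates input before touching the database,
     *  as a servlet handling a form POST from the web UI might. */
    static class SurveyMetadataService {
        private final SurveyMetadataRepository repository;

        SurveyMetadataService(SurveyMetadataRepository repository) { this.repository = repository; }

        public void capture(String surveyId, String description) {
            if (surveyId == null || surveyId.isEmpty())
                throw new IllegalArgumentException("survey id is required");
            repository.save(surveyId, description);
        }

        public String describe(String surveyId) { return repository.find(surveyId); }
    }

    public static void main(String[] args) {
        SurveyMetadataService service = new SurveyMetadataService(new InMemoryRepository());
        service.capture("CPI-2007", "Consumer Price Index survey");
        System.out.println(service.describe("CPI-2007"));
    }
}
```

Because the service depends only on the repository interface, the in-memory implementation can later be swapped for a JDBC-backed one without touching the business logic, which is the point of the layered design described above.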
Metadata Management Tools
The developed metadata management application allows Stats SA staff members to perform a number of tasks in the metadata management process. The application groups these tasks into three modules or tools.
- Administration module
This module is used to manage users of the system, make changes to certain categories of captured metadata and other housekeeping activities. The administration module will also be used to administer other categories of metadata.
- Metadata Capturing and Editing module
Survey metadata is continually captured by the originating components whenever an instance of a given survey is required; the metadata captured here is specific to that instance of the survey. This module allows users to capture and edit survey metadata in the system. A special user role, the Approver, has permission to approve all captured survey metadata, at which point it is exposed for use in the organisation.
- Query and Reporting Module
The metadata repository is queryable and can therefore be reported on. A metadata report is used as one way of documenting survey data. This may happen in two situations: in the first, an internal user may want to view captured metadata, and producing a report provides a structured way of viewing it. Another way of viewing metadata is to use the "View Metadata" functionality of the Metadata Capturing and Editing module.
Figure 11: Different modules of the tool
Figure 12: Survey information page with navigation on the right hand side
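The capture-and-approve cycle of the Metadata Capturing and Editing module can be sketched in Java as a small state machine: captured or edited metadata sits in a draft state, invisible to ordinary users, until an Approver approves it, at which point it replaces the single current version. The class, method and survey names here are assumptions for illustration, not the actual ESDMF implementation.

```java
// Hedged sketch of the capture -> approve -> expose workflow described above.
import java.util.HashMap;
import java.util.Map;

public class ApprovalWorkflow {
    enum Status { DRAFT, APPROVED }

    static class SurveyMetadataRecord {
        final String content;
        Status status = Status.DRAFT;
        SurveyMetadataRecord(String content) { this.content = content; }
    }

    // Drafts awaiting approval, keyed by survey identifier.
    private final Map<String, SurveyMetadataRecord> pending = new HashMap<>();
    // The single approved version visible to the organisation.
    private final Map<String, String> current = new HashMap<>();

    /** Capture or edit: creates a draft awaiting approval. */
    public void capture(String surveyId, String content) {
        pending.put(surveyId, new SurveyMetadataRecord(content));
    }

    /** Approval replaces the single current version, exposing it for use. */
    public void approve(String surveyId) {
        SurveyMetadataRecord record = pending.remove(surveyId);
        if (record == null)
            throw new IllegalStateException("nothing to approve for " + surveyId);
        record.status = Status.APPROVED;
        current.put(surveyId, record.content);
    }

    /** Only approved metadata is visible to ordinary users. */
    public String view(String surveyId) { return current.get(surveyId); }

    public static void main(String[] args) {
        ApprovalWorkflow workflow = new ApprovalWorkflow();
        workflow.capture("LFS", "Labour Force Survey description");
        workflow.approve("LFS");
        System.out.println(workflow.view("LFS"));
    }
}
```

Because survey metadata has only a single version, approval simply overwrites the current record; a versioned category would instead append to a history rather than replace it.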
How Meta-Information System Integrates to Other Stats SA Applications
Although this feature has not been implemented yet, the metadata management system, like the rest of the ESDMF, is designed to link with other statistical processing applications and data repositories. In the immediate future, the repository allows access via the following two methods:
- ODBC Connection by SAS
SAS can extract metadata from the repository for use as input to the data processing and analysis activities of statistical production. At this stage, an Open Database Connectivity (ODBC) connection will provide SAS with the ability to access the database. When our database is migrated from MySQL to Sybase, there will be an option to use SAS/ACCESS to Sybase.
- APIs
Application Programming Interfaces (APIs) will be developed for each application that needs to exchange information with the metadata system. At this initial stage of the project no application uses the metadata management system in this way and therefore no API has yet been developed.
Standards and formats
.Version control and revisions
Metadata is expected to change due to revisions of concepts and their definitions, and changes to classifications, business rules and user requirements. Sometimes more than one version of certain metadata used for the same purpose may exist at the same time. In the current Survey Metadata tool, the "Edit" functionality of the application allows for the revision of captured survey metadata. These revisions may only be performed by users with the requisite permissions, and for changes to be effected, revised/edited metadata must be approved by an assigned Approver. Survey metadata can only have a single version, which means that the Edit process serves to update the metadata repository. Version control will be introduced as metadata categories whose metadata can have more than one version are incrementally built into the system. It is important to note that version control will be built into every aspect of the ESDMF.

Outsourcing versus in-house development
The development of all of the ESDMF, including the metadata management system, is outsourced. Two issues influenced the decision to outsource: Stats SA does not have enough skilled resources, and there was a need for views that would not be clouded by prior opinions of a statistical environment. This scenario requires the outsourced resources to invest a lot of time in understanding the organisation and analysing the requirements. It is important to note that we conducted the outsourcing in two stages. In the first stage we outsourced the task of gathering the requirements for the whole of the ESDMF; these requirements contain details of each of the components of the ESDMF, including the metadata management system. The second stage is the development of the system. The two tasks were done by two different organisations. This separation was made in order to maintain the focus on requirements gathering; in this development model, the development team mainly verifies existing requirements.

Sharing software components of tools
.Overview of roles and responsibilities
Roles in metadata/statistical lifecycle management
In order to understand the user requirements, we engaged the survey divisions as pilot groups. We involved them in verifying our understanding of the requirements, which was used to design and implement the system. These pilot groups were also involved during User Acceptance Testing (UAT).
The Survey Metadata Capture Tool can be used by different users depending on the roles they are assigned. For example, a Capturer can capture metadata, but it must be approved by an Approver, who is usually the supervisor or manager. There is also a Viewer role with restricted rights: a Viewer can view metadata but cannot edit, change or approve it.
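Assuming the permissions reduce to a simple role-action matrix (an assumption on our part; the real tool's permission model may be richer), the Capturer, Approver and Viewer roles described above can be sketched as follows. All names are illustrative.

```java
// Hypothetical role-action matrix for the Survey Metadata Capture Tool.
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;

public class RolePermissions {
    enum Role { CAPTURER, APPROVER, VIEWER }
    enum Action { CAPTURE, EDIT, APPROVE, VIEW }

    private static final Map<Role, EnumSet<Action>> MATRIX = new EnumMap<>(Role.class);
    static {
        // Assumed assignments: Capturers capture and edit but cannot approve;
        // Approvers approve; everyone can view.
        MATRIX.put(Role.CAPTURER, EnumSet.of(Action.CAPTURE, Action.EDIT, Action.VIEW));
        MATRIX.put(Role.APPROVER, EnumSet.of(Action.APPROVE, Action.VIEW));
        MATRIX.put(Role.VIEWER,   EnumSet.of(Action.VIEW));
    }

    /** True if the assigned role permits the requested action. */
    public static boolean canPerform(Role role, Action action) {
        return MATRIX.get(role).contains(action);
    }

    public static void main(String[] args) {
        System.out.println("Capturer may approve? "
                + canPerform(Role.CAPTURER, Action.APPROVE));
    }
}
```

Centralising the checks in one matrix makes it straightforward to audit who can do what, and to extend the model with new roles as further metadata categories come online.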
The network infrastructure for both the development and user environments is supported by the IT department. This includes configuring the environments as well as housing the different servers in the organisation's data centre. The databases are also managed by the IT department. The ESDMF is based on the Linux open-source operating system. Because the IT department does not have the skills to service and maintain this environment, we have outsourced these services to a private company. However, this is done in conjunction with the IT department, who are in the process of raising their skill levels in order to be able to support the ESDMF in the Linux environment.
During User Acceptance Testing (UAT) any identified defects were logged on the CA Unicentre system, which is used for IT help desk support. With the help of the IT help desk technicians, we were able to customise the system so that the unique categories of defects for the ESDMF system could be recorded.
The IT procurement group was used to procure all the hardware and software used in the development and deployment of the system.
The development of the ESDMF was not done in isolation of the existing projects within Stats SA. For example, the following projects were ongoing and in parallel with the development of the ESDMF:
- SAS 9 migration
- Re-engineering of other surveys
- Community Survey 2007
- Census 2011
Some members of these other projects were also involved in the development of the requirements and review of the architecture of the ESDMF. The goal is to ensure that we do not do things in isolation so that we can share our knowledge and ease the integration of the new system into existing systems.
Staff from the Methodology and Standards division were seconded to the ESDMF project. Their role was to develop policies, procedures and standards for the system. Our process is that policies are developed and approved first; thereafter, the procedures and standards are developed. For each phase, the policies are used to develop and implement the system deliverables for that phase.
For example, for the first phase, we developed a policy for Data Quality and a policy for Metadata. As a result, Phase One was focused on capturing metadata (Metadata policy) in order to ensure quality of the output product (Data Quality policy). For the Second Phase, we already have approved policies for Concepts and Definitions as well as for Classifications.
Description of the team/individuals involved in development and maintenance of meta-information systems.
System Developers
The deliverables expected from the supplier include a Skills Transfer Plan and Strategy. The goal is that the supplier will train Stats SA system developers in how the system is designed and implemented. At the end of the contract, these Stats SA developers should be knowledgeable enough to maintain, upgrade and/or enhance the system. Thus, we should not be dependent on the supplier for any development beyond the expiry of the contract.
Data Quality Officers and Specialists
The Data Quality Officers and Specialists are trained in how to use the system. They are also trained to be trainers ("train the trainer"): the supplier's deliverables include training Stats SA Data Quality Officers and Specialists so that they, in turn, can train other users.
Methodology and Standards Professionals
The Methodology and Standards staff members provide support by developing Policies and Standards. They are subject matter experts in survey operations. They are also involved during the design phase in order to help explain and clarify the requirements.
Project Managers
The Stats SA project manager works closely with the supplier's project manager. They bridge the gap between the two organisations and make sure that the deliverables are managed properly and delivered on time.
Metadata management team
.Training and knowledge management
Users are required to spend at least a day in a training session that takes them through the functionality of the system as well as how to use it. The Training Manual, which contains complete descriptions of the system, is used during the training sessions, and users can also use this document for reference purposes. The system is designed so that tool tips (online help) appear when the user hovers over certain areas of the user interface. These tool tips explain the features under the mouse pointer, giving the user information directly at the point of need without having to go through the Training Manual.

Partnerships and cooperation
In Latvia, we learned that during the development of their system, their outsourced supplier took a while to understand the business of the statistical organisation. It came as no surprise when we ran into similar problems with our own supplier, much as we were unhappy about it. Latvia's Integrated Statistical Data Management System (ISDMS) uses Bo Sundgren's model of a metadata system, which gave them a firm foundation for the theoretical definition of metadata. We learned the importance of having a solid foundation in the definition of metadata.

In Ireland, we learned about the issues regarding communication between the customer and the supplier. Additionally, they had the same problem as Latvia in that the development of their system took longer than originally planned. This happened even though Ireland provided very detailed documentation on most of the major aspects of the system. Once again, when we ran into similar problems, we were not surprised, much as we did not like it.

In Slovenia, the metadata model is also based on Bo Sundgren's model, with modifications in areas where they believe their own components adequately meet Bo Sundgren's requirements for a metadata system. Their development model is to build the system in-house and outsource once they reach the maintenance phase. They continuously re-skill and train their staff as they bring new technologies aboard.

From New Zealand, we adopted a few practices. For example, we brought the Statistical Value Chain into Stats SA; this is how we view the business of statistical production processes within Stats SA. We also adopted the way they broke metadata down into five categories, namely definitional, operational, system, dataset and procedural/methodological metadata. One of their experts helped us to evaluate the respondents to the tender for the development of the ESDMF.
On our trip to Australia, we learned that in order to have a successful data warehouse project, there is a need to develop policies and standards that define how the system should be designed. When we returned to South Africa from that trip, we restructured the team into two groups: the Policies and Standards team and the Technology team. The Policies and Standards team developed policies and standards which were used by the Technology team in the development and implementation of the ESDMF.

Experts from Sweden occasionally came to Stats SA to advise us on various aspects of metadata and statistical production processes. For example, a few years ago Bo Sundgren, a well-known expert on metadata, came to Stats SA to advise us on how to proceed with the development of a metadata system. Recently, another expert from Statistics Sweden came to conduct a workshop on SCBDOK, the Statistics Sweden metadata template. He also conducted training on quality definition and quality declaration of official statistics. This gave us a better idea of how to develop a data quality template, as well as of how data quality should be reported.

Last year (2006), we met Alice Born (from Statistics Canada) when we attended the METIS conference. We engaged her regarding the development of their metadata system, the Integrated Metadata Data Base (IMDB), and applied that knowledge during the development of our Survey Metadata Capturing Tool. Consultants from Canada help us in other projects within Stats SA, and during their tenure we engage them for advice and other consultation. We used the Corporate Metadata Repository (CMR) model by Dan Gillman, from the US Bureau of Labor Statistics, in developing our understanding of the metadata model, especially with regard to the ISO 11179 specification. We also sent our metadata model to him and other metadata experts for review and critique.

Other issues
Organizational Change Management
Climate and Culture Assessment
Preliminary Organisational Change Management (OCM) initiatives necessitated a review of the operating culture at Stats SA in order to understand the 'lie of the land' into which the system will be introduced. The information contained in the Culture & Climate Assessment was obtained through a number of OCM diagnostic interventions targeted specifically at internal stakeholders. This was done by holding focus groups as well as running an online survey on the Stats SA intranet.
A key challenge for Stats SA is to focus the organisation on the strategic importance of the DMID project: not only in so far as it assists individuals in their immediate job functions, but, more importantly, in how it contributes to the overall wellbeing of South African society at large and to strategic decision making at government level. DMID communication messages need to create a sense of higher purpose to help individuals with long-term strategic thinking.
Change Readiness Assessment
A Change Readiness Assessment was conducted to determine the current capacity of Stats SA to change, and to identify areas of resistance towards DMID requiring Organisation Change Management (OCM) interventions.
The Change Readiness Assessment was conducted via a survey and series of focus groups.
The following 'change readiness dimensions' are integral to enable commitment towards DMID and formed the basis of the Change Readiness Assessment:
- Clear vision
- Effective leadership
- Positive experience with past change initiatives
- Motivation to do the project
- Effective communication
- Adequate project team resources
What is Change Readiness?
OCM is a critical, although often bypassed element in organisations. It focuses on the 'human response to change', helping people understand, accept and commit to a new way of working. One of the key upfront steps in the change process is the Change Readiness Assessment.
The Change Readiness Assessment is a process used to determine the levels of understanding, acceptance and commitment likely to affect the success of the planned change. Change readiness is gauged along an axis known as the Change Commitment Curve, which is depicted below:

Figure 13: Change Commitment Curve
As the DMID project phases roll out, different stakeholders will need to be at specific levels of commitment. The level of commitment required will depend on the role they play in the DMID project and their ability to influence the programme. The Change Commitment Curve provides a framework for understanding and tracking the requisite levels of commitment through which stakeholders need to be facilitated, so that OCM interventions can be developed accordingly.
A Change Readiness Assessment will become an obligatory OCM intervention prior to the rollout of a new phase on the DMID project.
Findings
The following were the findings from the assessments:
- Executive Management does not share a common understanding of the DMID project.
- There is a lack of communication between management and subordinates; this makes it difficult for subordinates to understand the purpose of the project and the impact it has on their working lives.
- A lack of support from Executive Management will result in resistance and jeopardise the success of the project.
- If management does not understand, communicate and promote the project, it will be difficult to deliver the message and get buy-in from staff across the organisation.
Next Steps from the Findings
The findings of the assessments identified where some key staff members belonged on the Change Commitment Curve. In general, most were in the "Setting the Scene" and "Achieving Acceptance" area, bounded in time by "Contact" ("I know something is changing") and "Understanding" ("I know the implications for me"). Obviously, a lot of effort is needed to move from that area to "Achieving Commitment", demonstrated by "Internalisation", wherein staff can claim that "This is the way I do things".
Another outcome of these assessments was to organize a Leadership Alignment workshop. In this workshop, the Executive Committee was given a presentation of the findings and the path forward. The path forward is to ensure that the leadership understands the goals of the project and how they line up with the vision of Stats SA. The leadership was also instructed on how to communicate the same message about the project.
Lessons learned
The supplier had a difficult time understanding the business of Stats SA, which is statistical production. Additionally, the goal of the project is to improve quality, which supports the vision of Stats SA "to be the preferred supplier of quality statistics". Even in the face of this vision, the supplier failed to recognise that quality was a primary business objective. Under pressure to meet the deliverables, the supplier neglected the Skills Transfer Plan, with the result that the Stats SA developers were not involved in the final design and development of the system. For a project of this magnitude (three years), we decided to break the deliverables down into twelve phases, each planned to be three months in duration. Each phase was also planned to be a complete deliverable in its own right, even though the next phase would build on the previous ones. The first phase was delivered late, mainly due to the supplier's lack of understanding. The key lesson is that a clear understanding of the requirements is very important in meeting the deliverables as well as the milestones for those deliverables.