


Sharing experiences on methods includes evaluation, efficiency, and performance studies. The UNECE general framework for evaluating the efficiency of statistical data editing can be used as a basis for such studies. Papers documenting studies should be made available through this page.


Statistical agencies spend large amounts of resources on evaluating and developing applications based on specific systems. Sharing good and bad experiences with specific products would therefore help many users. For any specific product, typical users would be interested in learning from colleagues about:

  • the technical and statistical expertise needed to initiate a first application,
  • the financial and human investment required to set up the hardware/software environment,
  • the installation of the product itself,
  • the number of employees involved in developing the application(s),
  • the training required with the product and/or with any foundation software,
  • the sources of the data being processed,
  • the volume of data being processed,
  • the development of pre-processors and post-processors,
  • the changes that had to be made to other systems to integrate the product into a data stream,
  • the methods that had to be modified because of the available functionality,
  • the turnaround time for the investigation,
  • the difficulties encountered during the development and production phases,
  • the overall comparison with former systems, and any comments related to the evaluation criteria listed on this site.