The most important suggested topics for future work identified by the participants (in order of votes):

  • Mobile devices for data collection (increases response rates, but is it a different mode, and should we adapt the design and/or shorten the questionnaire?)
  • Skills and mind-set needed for our future data collection methods
  • Paradata: how to structure, understand and use it
  • New methods, such as integrating other sources and sampling techniques, to reduce survey burden
  • Design shorter questionnaires (what do we really need; identify key questions; maximise results while minimising size; paradigm change among staff)
  • Use of nudging initiatives in communicating with respondents
  • Measure the impact of personalized feedback (cost-benefit)
  • How to integrate Big Data and other new sources into integrated data management systems
  • Behavioural studies for design and communication strategies
  • Improve partnerships with data providers
  • Measure the effectiveness of incentives and their impact on data quality

Detailed Lessons Learned and Topics for Future Work by session

Lessons Learned

Lessons Learned (Session 1):

  • When using CAWI, it is important that questions are interpreted correctly.
  • Prefilling CAWI forms raises privacy issues.
  • CAWI is usually part of a multi-mode approach.
  • Support for survey respondents outside regular office hours is important, as most respond after hours.
  • Consistency in the look and feel of web questionnaires across devices and browsers is important to guarantee comparable results and data quality.
  • Device impact analysis can uncover issues within the survey.
  • It is important to integrate mode effects into the questionnaire design from the beginning and to allow for multiple languages.
  • Research should be invested in software features for smartphones.
  • The Minimum Viable Product (MVP) approach is an agile and pragmatic way of starting online surveys at an early stage.
  • Moving away from stovepipes leads to increased complexity of processes.
  • Don’t be afraid to try different things.

Lessons Learned (Session 2):

  • We need to gather information on the survey burden without adding to that burden.
  • It is valuable to measure and understand perceived response burden, and the use of paradata is important, for example for monitoring non-response in real time.
  • Streamline the evaluation and review process to reduce the burden.
  • Segmentation of the burden according to the dropout point is useful.
  • The higher value we give back to respondents, the lower the perceived burden.
  • Giving information back to businesses and automating responses or the use of personal data can increase response rates (but raises privacy issues).
  • There are different solutions to dropping response rates, and we need to design different solutions for different devices.
  • Use mixed-mode and mixed-source approaches and integrate them with existing data sources.
  • Before starting a survey, look first at alternative existing data sources and only use a new survey for complementary information.
  • Mixing sources across countries is still difficult or impossible unless harmonized concepts are used.
  • Web scraping is an interesting technique that can both improve data quality and reduce the response burden.
  • It is important to develop in-house skill sets to maximise the value of using new technologies.

Lessons Learned (Session 3):

  • An integrated data approach is valuable in providing insights, but it is also challenging to use before developing a new data collection.
  • It is critical that data collection departments are involved in data integration.
  • Use small building blocks when defining variables and entities, and use them in an integrated approach.
  • Paradata offers great opportunities to more fully validate fieldwork and to improve data quality.
  • It is not always easy to capture useful information out of paradata; we need insight and proper technology to use it.
  • Off-the-shelf solutions should be considered as an alternative to developing in-house applications.
  • One has to develop not just methods and technology, but also foster team building and change the way staff think.
  • Institutional culture and investing in staff are very important, and the change required should not be underestimated.
  • Organisational restructuring should be considered alongside changing the system.

Lessons Learned (Session 4):

  • Nudging approaches are interesting; they make it easy to respond correctly and on time.
  • Electronic data collection needs good nudging and has to be adapted to the mode used.
  • Low response does not necessarily mean unwillingness to respond; there are more factors involved.
  • Training and sensitising interviewers to deal with challenging subgroups is important.
  • Collaborating with interest groups (e.g. trade organisations, charity champions, ethnic groups) to get support improves response rate and quality.
  • Feedback to data providers is valuable in reinforcing the value of statistics, but care is needed to prevent negative effects.
  • Providing personalised data, both to the person who responds and to their managers, is a great way to improve response and to build collaborative relationships with data providers.
  • Communication strategies should be designed in a structured and simple way.


Lessons Learned and Topics for Future Work (main messages from the presentations; no small group discussions were held):

  • It is important to define the value of official statistics and to brand our products to users and stakeholders in order to defend our value proposition.
  • We need a measurement framework and key indicators to measure the value of official statistics.
  • We need customer-focused, innovative, collaborative official statistics designed to international best practice, and we should share good practices.
  • Political interference in official statistics leads to serious reputational and institutional damage and it is difficult to regain trust among staff and data users.
  • Regaining lost values and trust among our users requires an inclusive analytical process aimed at removing symbolic and physical barriers within the institution.
  • How can we communicate something which we do not actually trust?
  • It is important to use findings from behavioural sciences to improve communication with respondents and to increase the response rate and its quality.
  • Communication should be easy to grasp, attractive in appearance, social to build rapport and timely to encourage getting in touch.
  • Interviewers need training in using behavioural science techniques to gain respondents’ cooperation.

Suggestions for Future Work

Future Work (Session 1):

  • Mobile devices for data collection: is this a different mode? How to adapt the design and/or create shorter surveys; use of location data and 24/7 availability. Mobile devices are not ideal, but they can increase response.
  • Design of web questionnaires: get consistent results across devices and browsers; integrate mode effect in design from start and facilitate use of multiple languages.
  • Differentials in response by mode and device, and the impact on survey design.
  • How to make CAWI and CAPI work in household surveys.
  • Change needed in the organizational structure when moving towards (centralised) digital collection.
  • Skills and mind-set needed for our future data collection methods.
  • How to integrate project planning into the survey development.
  • Develop methods to discover the level of unreported crime.
  • Using Minimum Viable Product approach not only for collection, but also for processing and production of statistics.

Future Work (Session 2):

  • Develop standardized methods for response burden measurement: objective measurement and internationally comparable analytics.
  • Designing shorter questionnaires (what do we really need; only key questions; maximum result versus minimum size; paradigm change among staff).
  • Elaborate on new methods and sources to reduce survey burden (integrating other sources; sampling techniques).
  • What is behind declining response rates (why do they drop; why do we respond or why not; incentives).
  • Improve partnerships with data providers, get in touch with them, understand why they do or do not respond, and find incentives for responding.
  • Change management: from stovepipe to other organizational structures and how to manage and promote change within organizations (MVP approach; share experiences).
  • How to improve the release of real time data?
  • Continue to work together and find ways to share ideas and skills.

Future Work (Session 3):

  • Paradata: how to structure, understand and optimize their use (e.g. for real time analysis, integrating them with metadata, geocoding, DDI for paradata).
  • Legal issues, ethics, privacy and data security with integrating sources and how to get public buy-in.
  • The use of machine learning for better quality of data integration.
  • How to integrate metadata and paradata from survey data, Big Data and other new sources into integrated systems.
  • Share data-matching and integration techniques and methods.
  • How to harmonize and unify entities across agencies.
  • How to define potential sources and classify variables within them to assess usability.

Future Work (Session 4):

  • Use of behavioural studies for survey design and communication strategies to improve response.
  • Respondent centred approaches to survey design.
  • Build partnerships with interest groups to reach challenging groups.
  • How to develop understanding of individuals who are not direct respondents but who influence the response and how to communicate with them.
  • Impact of personalized feedback: impact on response rate and quality, what information to provide, and how to deal with feedback that might reflect negatively on the respondent.
  • Effectiveness of providing incentives to improve response rates, and their impact on quality (cost-benefit).
  • Use nudging initiatives to increase response rates and response time. 

Special Event Winners:

Chair's summary: Session 1 & Session 2

In the first session of yesterday's meeting we discussed that national statistical organisations generally adopt a data collection strategy in which they first use registers and secondary sources, treating primary data collection as the last choice. Nowadays, primary data collection is rapidly moving to a paperless environment, with paper forms being replaced by electronic collection. In business surveys, electronic data collection is moving to the web survey mode, which provides a better respondent experience and timely, possibly more accurate, data at a relatively low cost. Another advantage of the web survey mode is that it presents rich paradata to data collectors, such as the start and completion times of a questionnaire, the types of devices and browsers used, or the path chosen to reach the survey. For those who want to switch to the web survey mode rapidly, Statistics New Zealand's Minimum Viable Product approach might be a great idea. And, as the number of web-based surveys in a statistical organisation increases, it may no longer be efficient to manage and maintain data collection through single independent processes without a platform or portal for it.

Another issue related to web surveys is that, as the use of technological devices for data collection increases, especially in electronic self-administered questionnaires, the problem of partly completed forms due to dropouts grows, which may ultimately pose a big risk to data quality. To reduce these dropouts, Statistics Netherlands has very interesting suggestions to follow.

On the side of social statistics, some statistical offices, Destatis among them, have engaged in redesigning their social surveys into mixed-mode designs that include the web. Of course, the main motivation is the reduction of survey costs while maintaining at least the same quality level as before. Mixed-mode designs are feasible but imply a more complex statistical process and monitoring of data collection, as can be seen in the Israel LFS case. They also require a flexible management system and software support. Together with mixed-mode designs, another opportunity discussed by Destatis for modernising social surveys is modularisation, which suggests moving towards a modular architecture of social surveys instead of continuing to design the surveys as stand-alone entities.

Bilal Kurban (Chair)

In the second session of yesterday's meeting we discussed that, when measuring and monitoring actual response burden, focusing only on response time as in the past is not a sufficient solution. As in the cases of Statistics Denmark and ISTAT, statistical organisations should also take the perceived burden into consideration in order to improve the survey process and questionnaire design. While doing this, they must take care not to cause further burden to respondents. One good way to achieve this is to make use of the paradata generated automatically by the data collection software. Another is to add a voluntary or ad hoc user evaluation section to surveys, covering the burden characteristics that are impossible to obtain without asking respondents, such as the actual time spent collecting information, the number of people involved in completing a questionnaire, and the perceived difficulty and burden of the survey design, procedure and instrument.

Survey non-response, on the other hand, is a growing problem, especially in household surveys, so the production of reliable statistics requires monitoring and improving the data collection process. In this regard, Statistics Finland's non-response reports contain very concrete and widely applicable ideas and initiatives that can help improve both response rates and data collection quality. It can also be useful to take advantage of new visualisation techniques for controlling the data collection process, as well as to make use of mobile data collection where applicable.

In order to overcome non-response and response-burden challenges with shrinking budgets, statistical organisations should find and develop innovative methods of data collection, as Statistics Netherlands does. Standard Business Reporting and internet web scrapers are good examples of such innovative methods. Both can provide statistical offices with a greater amount of data to increase data quality without being confined to sample selections. In contrast to traditional SBR applications, in which there is generally a one-way data flow from businesses to NSIs, supplying businesses with the data or tables they request is a really good incentive practice and a good example of a win-win situation.

Using internet data for statistical data collection will probably be very common in the future, because it reduces both data collection costs and the burden on data suppliers. There are many technical, organisational and legal issues that must be solved before the transition from traditional data collection to the use of the internet. But of all the challenges NSIs face, perhaps the behavioural one is the biggest: there must be an environment open to change within statistical organisations.

Finally, as data collection shifts from primary to alternative sources, statistical authorities need a new employee profile combining IT and statistical skills. With the emergence of new data sources, it is now very important for statistical organisations to develop in-house analytical skills and competencies. At this point, training with online courses, as in the CBS case, may be a good way to support data science skills among staff.

Bilal Kurban (Chair)


