Government Statistical Service (GSS) Quality Strategy
Policy details
Metadata item | Details |
---|---|
Publication date: | 12 June 2019 |
Owner: | Data Quality Hub |
Who this is for: | Members of the Government Statistical Service |
Type: | Strategy |
Contact: | DQHub@ons.gov.uk |
Update May 2023
The Government Statistical Service (GSS) Quality Strategy was a two-year strategy that ran from 2019 to 2021. The current UK Statistics Authority strategy for the statistical system is Statistics for the public good.
For any queries on the quality of your statistics, contact the Government Data Quality Hub on DQHub@ons.gov.uk.
Foreword
Quality is fundamental to building trust in government statistics. The responsibility for producing high quality statistics lies with everyone working in the statistical system. Data and statistics are rapidly changing with the introduction of new methods and complex data sets. As our data increases in complexity it becomes harder to assess quality, but producing high quality statistics is more important than ever.
This strategy aims to improve statistical quality across the Government Statistical Service (GSS) to produce statistics that serve the public good. It was developed through consultations with statisticians from across government to identify the key priorities and challenges faced by the GSS. It sets out what we should be doing to improve the quality of our statistics and manage the processes surrounding their production. It will ensure that we comply with the quality requirements of the Code of Practice for Statistics. High quality statistics and sound knowledge about quality will lead to better decisions, helping us to achieve the aims of the Better Statistics, Better Decisions strategy.
I encourage you all to use this strategy to ensure that we are delivering high quality statistics that are trustworthy and inform sound policy decisions.
John Pullinger
National Statistician
Introduction
This is a two-year strategy that sets out realistic actions to address key quality challenges and improve quality across the GSS. While there are pockets of good practice, the strategy identifies areas in which we can improve the quality of statistics. For example, to reduce inefficiencies and improve the reproducibility of statistics, it recommends implementing Reproducible Analytical Pipelines and automated processes.
According to the Code of Practice for Statistics, quality means that statistics fit their intended uses, are based on appropriate data and methods, and are not materially misleading (quality is further defined in Appendix B). This strategy focuses on statistical quality, but many of the principles also apply to quality in other disciplines such as modelling or management information. The strategy aligns with the core objective of the National Data Strategy: to foster a cross-government approach to better data use. It is beneficial across all analytical professions in the GSS and aligns with the 2018 to 2020 Analysis Function Strategy.
The strategy has been developed by the GSS quality centre (now replaced by the Government Data Quality Hub) through collaboration with the GSS. The team, situated in the Best Practice and Impact (BPI) Division (note that BPI has since been split into several smaller teams), supports the GSS in meeting its requirements to maintain, improve and report on quality under the Code of Practice for Statistics. Further information and contact details for the team are provided in Appendix A.
Our aim
As stated in the Better Statistics, Better Decisions strategy, we want to see our statistics enabling sound policy decisions and providing a firm evidence base for decision making, in and out of government. Improving the quality of statistics across the GSS will increase trust in the statistics and in us as producers, thereby ensuring our statistics serve the public good.
Therefore, the aim of this strategy is to improve statistical quality across the GSS.
The GSS will achieve this through four goals:
- We will all understand the importance of our role in producing high quality statistics.
- We will ensure our data are of sufficient quality and communicate the quality implications to users.
- We will anticipate emerging trends and changes and prepare for them using innovative methods.
- We will implement automated processes to make our analysis reproducible.
This strategy supports the GSS People Plan which captures the work happening across the GSS to build capability.
Each goal is underpinned by a set of deliverables that build towards achieving that goal. The strategy’s deliverables are split between BPI (note that BPI has since been split into several smaller teams) and the GSS and we are all responsible for its success. The GSS Heads of Profession and quality champions are responsible for delivering on behalf of the GSS. How we will deliver the strategy and monitor progress is detailed in the delivering the strategy section. We are all accountable for delivering against the goals in the strategy but recognise the environment we work in will change. We welcome your ongoing input into the strategy and how we achieve this together.
Our goals
To improve statistical quality across the GSS we have set out four goals (as described in our aim). The goals are in no particular order. They are equally important and rather than being completed sequentially, we should aim to work towards them collectively.
Goal 1: We will all understand the importance of our role in producing high quality statistics
We are all responsible for the quality of the statistics we produce. To produce high quality statistics, quality needs to be built into the whole process and kept at the forefront of analysts’ minds.
Clear quality management, structures and practices should be implemented in each department to monitor, measure and improve statistical quality, and to ensure all those involved in producing statistics understand how their role directly impacts on quality. Roles and responsibilities for quality differ by level, and it is important to distinguish between them.
Quality management: management level
Effective quality management should ensure the right checks and processes are in place so that the product is fit for purpose and we understand its impact on the European Statistical System’s (ESS) quality dimensions.
Some examples of processes to support quality management are:
- process maps: a map of the processes for each statistical release and the Quality Assurance (QA) required at each stage
- decision flow chart: a flow chart outlining the processes for who is responsible and accountable for what decision in different scenarios
- quality assurance documentation: accurate and up-to-date records of what QA has taken place through the process
- user consultation: consultation with users to ensure statistics are relevant and fit for purpose
Quality structures: team level
For everyone in the GSS to understand the role they play in ensuring quality in government statistics, quality should be embedded into the culture of the department.
This can be done using:
- internal guidance documents: relevant guidance on quality tailored to your department
- quality discussion group: an internal group or committee set up to provide a platform to discuss quality and lessons learned, and to share best practice (see Welsh Government case study)
- peer reviews: review systems among colleagues
- curiosity panels: used at the Office for National Statistics (ONS) and chaired by a Deputy Director, these panels are an opportunity for the whole team to sense-check the results, agree the main messages and place the findings in context
Quality practices: desk level
Analysts play a crucial role in maintaining and improving quality. Their responsibilities are the following (a sketch of how such checks might be automated follows this list):
- input checks: understand and check the quality of the source data (we build on this in goal two)
- process checks: checks when transforming the source data into statistical figures and tables – use different approaches to see if the results are the same, and perform aggregation checks and checks against the source data
- output checks: checks on the final published outputs: sense-check results, check for consistency within the final statistical bulletin, final check of figures and presentation
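As an illustration only (not part of the original strategy), here is a minimal sketch of how some of these desk-level checks might be automated in R; the column names ("region", "value") and the tolerance are hypothetical:

```r
# Minimal illustrative sketch of automated desk-level checks.
check_release <- function(source_data, published_totals, tolerance = 1e-8) {
  # Input check: the source data should contain no missing values
  stopifnot(!anyNA(source_data))

  # Process check: totals recomputed from the source data should match
  # the figures prepared for publication
  recomputed <- aggregate(value ~ region, data = source_data, FUN = sum)
  merged <- merge(recomputed, published_totals, by = "region",
                  suffixes = c("_recomputed", "_published"))
  stopifnot(nrow(merged) == nrow(published_totals),
            all(abs(merged$value_recomputed - merged$value_published) < tolerance))

  # Output check: published figures should be complete and non-negative
  stopifnot(!anyNA(published_totals$value), all(published_totals$value >= 0))

  invisible(TRUE)
}
```

Checks like these do not replace human sense-checking of the final bulletin, but they make the routine numerical checks repeatable and auditable.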
Deliverables
Deliverable | Details | Owned by |
---|---|---|
1.1 Facilitate the quality champions network | The quality champions network aims to improve the quality of statistics produced by the GSS. The quality centre will facilitate this network through organising quarterly meetings, creating agendas and providing support to quality champions. | Quality Centre |
1.2 Create a platform for sharing examples of quality guidance across departments | Many departments have already produced their own internal quality guidance documents. It would be beneficial to have a platform for sharing these across the quality champions network, to support the creation and review of quality guidance documents. | Quality Centre |
1.3 Produce new quality guidance for the GSS | The existing quality guidance on the GSS website is currently out of date. The quality centre are undertaking a review of the current guidance and will evaluate and update our guidance offer to the GSS to reflect the updated Code of Practice for Statistics. We will work with international partners such as Eurostat and the UN to develop this guidance. | Quality Centre |
1.4 Identify and engage with wider networks focusing on quality | Understanding and improving the quality of data, analysis and models is a cross-government priority, and is not confined to official statistics. The GSS and quality centre should identify and engage with networks both within and across departments, and internationally. These include the United Nations Expert Group on National Quality Assurance Frameworks, and the Working group on quality assurance of government models. | Quality Centre and GSS |
1.5 Produce and document a clear structure for quality management | Some departments have a structure in place for quality management. Departments should aim to have clear governance and accountability for the quality of their statistics. It should cover the whole process from data collection to the statistical release. This should be documented and promoted within departments so that everyone understands the importance of quality and their role in producing high quality statistics. | GSS |
1.6 Participate in quality champions network | For the quality champion network to be effective, quality champions should participate in the activities of the network. This includes attending meetings, sharing best practice, being a point of contact on quality matters for their department and helping implement this strategy. If they have not done so already, departments should nominate a quality champion to represent them at the quality champion network. | Quality champions |
Case study: Welsh Government
Welsh Government have clear processes and procedures in place when it comes to their quality management. This approach is set out in their Statistical Quality Management Strategy.
One aspect of their quality management is their Statistical Quality Committee, which meets quarterly and is chaired by the Head of Profession. The terms of reference for this committee set out its purpose and role, including reviewing Quality Incident reports, providing updates on training, sharing best practice and covering any other quality-related business. Having this framework in place not only provides a structure for quality management but is also a method for getting quality on the agenda and bringing it to the forefront of colleagues’ minds.
Goal 2: We will ensure our data are of sufficient quality and communicate the quality implications to users
The quality of a statistical product is underpinned by the quality of the data itself. High quality data are not sufficient to ensure high quality statistics, but they are a fundamental pillar.
Data quality should be managed through data management processes and analysts being curious about the data they are working with.
Curiosity about data
It is important for statistical producers to be curious about data and not take it at face-value. If there are values that look inaccurate they should be investigated and verified. Producers should understand the full data journey and be able to identify steps that are vulnerable and could introduce errors. A robust quality management system as described in goal 1 fosters an environment that supports this curiosity.
As set out in the Administrative Data Quality Assurance Toolkit, statistics producers should investigate the way in which the data are produced, manage relationships with data collectors and suppliers, and communicate effectively with them (these principles apply to both administrative and survey data).
Investigate
Examine the types of checks carried out by data collectors and suppliers, operational circumstances, coverage issues and potential sources of bias
Manage
Establish clear processes for data provision and managing change, maintain regular quality assurance checks of the data and use other data sources to corroborate findings where possible. Document what you find, and the decisions taken
Communicate
Work closely with data collectors, data suppliers and other statistical producers to ensure a common understanding of any quality issues and the reasons for any decisions made
Page 10 of the Administrative Data Quality Assurance Toolkit includes five top tips that are helpful to think about when assessing the quality of data.
Communicating the quality of statistics to our users is a key part of Quality Assurance of Administrative Data (QAAD) and important for ensuring our statistics are not misinterpreted or misused.
Data management
Data quality is dependent on the data management processes of the organisation: data that are managed properly are more likely to be of high quality. These data management processes include metadata management, data standardisation and data principles. In particular, using metadata, understanding the strengths and limitations of the data and communicating these to users are important for quality. The Code of Practice for Statistics states that we should apply best practice in the management of data and data services. These standards should apply both to the department and to any organisations collecting data on its behalf.
In the Office for National Statistics (ONS), data management processes are mapped out in the ONS Data Strategy and are governed through structures already existing in ONS, along with a few new structures covering the gaps. The strategy is underpinned by a principles-based data management framework, comprising a set of data and security principles and a set of data standards. While other departments may not have the same resources to dedicate to a full data framework, the ONS example provides the key messages that should be included for effective data management.
Deliverables
Deliverable | Details | Owned by |
---|---|---|
2.1 Deliver training courses on quality | The Quality Centre deliver the QAAD workshop, Quality statistics in government course and the Communicating quality, uncertainty and change course. The principles taught in these courses communicate and reinforce the key points made above. | Quality Centre (now replaced by Government Data Quality Hub) |
2.2 Foster relationships with data collectors and suppliers | Build and manage relationships with data collectors and suppliers through regular communication with them. If communication is managed by a separate team, terms should be set for what this communication entails. Use these relationships to establish an understanding of the quality assurance checks required and undertaken by the supplier, thus improving the quality of our data. | GSS |
2.3 Identify if your department could benefit from undertaking BPI training courses | The training courses are free and available to everyone in the GSS. They can also be tailored to each team depending on what the key priorities and challenges are. | GSS |
2.4 Communicate the quality of data to users | The Code of Practice emphasises the importance of communicating the quality of statistics to users. This should be done by following the guidance produced by the quality centre on how to communicate quality, uncertainty and change. There is also a training course on communicating quality, uncertainty and change. | GSS |
2.5 Participate in the cross-government data architecture community | The data architecture team in ONS facilitates this community to share best practice in data architecture and data management. Participation in this community fosters collaboration on best practice across the GSS. For information on how to get involved, please contact data.architecture@ons.gov.uk. | GSS |
Case study: Department for Digital, Culture, Media and Sport
The Department for Digital, Culture, Media and Sport (DCMS) produce statistics on National Museum visitor numbers.
DCMS have been working to improve their relationships with each museum. This has included investigating the museums’ methods for collecting the data, introducing regular communication with them and logging the quality assurance procedures. This was done not only for the upcoming release: DCMS also went back several years, which revealed several inaccuracies that have now been resolved. While there is still some work to be done, this investigation has revealed where the Quality Assurance (QA) processes need to be at their most robust, and the relationships built have helped to implement improvements.
Goal 3: We will anticipate emerging trends and changes and prepare for them using innovative methods
At the rate that the world of data and technology is changing, there will be new and unfamiliar opportunities and challenges emerging for analysts. The detail, volume and frequency of data collected are rapidly increasing, as are the requirements for innovative methods, tools and techniques. The GSS needs to be prepared in this changeable environment to efficiently produce high quality statistics that reflect user needs.
National Statistician’s Quality Review (NSQR)
National Statistician’s Quality Reviews, produced by the quality centre on behalf of and for the GSS, cover thematic topics of national importance. These are future-facing reviews that ensure the methods used by the GSS keep pace with changing data sources and technologies. They complement existing quality assurance practices, providing an additional tool to make sure methods are – and remain – fit for purpose and among the best in the world. They provide an opportunity for experts outside the GSS to contribute to the continued improvement of the methods, support the GSS in identifying what good practice looks like for these methodologies, and identify opportunities for further development and investment.
The Quality Centre produced an NSQR on Privacy and Data Confidentiality Methods. New legislation, namely the General Data Protection Regulation (GDPR) (the Data Protection Act in UK law) and the Digital Economy Act (DEA) 2017, has brought about major changes in the way organisations process and share personal data across organisational boundaries. These developments presented an opportunity to innovate with data. This NSQR brought together world-leading experts from across academia, the private sector, the GSS and leading National Statistics Institutes (NSIs) to prepare the GSS for the future and identify opportunities to improve and innovate privacy and confidentiality methods.
The next NSQR will focus on data linking methods, with work commencing in June 2019.
How is this being supported?
ONS has established the Admin Data Methods Research Programme to address some of the key challenges of administrative and transaction data as set out by Professor David Hand in his 2018 paper: ‘Statistical challenges of administrative and transaction data‘.
A work plan has been developed to take this work forward, working collaboratively across ONS, the GSS, academia and private research organisations. This programme of work will develop a statistical framework for advancing the use of administrative and transactional data.
The Office for Statistics Regulation (OSR) produce systemic reviews to examine cross-cutting statistical issues or improve the public value of a set of statistics. OSR examines these issues across the system to influence how the statistical system responds collectively to maximise quality and public value. They can be used to highlight good practice and innovation in elements of public value, with the goal to share lessons across the GSS.
Deliverables
Deliverable | Details | Owned by |
---|---|---|
3.1 Produce National Statistician's Quality Reviews (NSQRs) | The quality centre is responsible for producing NSQRs once a year. As such, we should be horizon scanning for any emerging topics. Proposed topics are agreed by the National Statistics Executive Group (NSEG) and Heads of Profession to ensure that they are topical and sufficiently important for the GSS. The NSQR is then produced on the agreed topic. | Quality centre |
3.2 Facilitate Methodology Advisory Committee (MAC) and Methodology Advisory Service (MAS) | To provide support for quality in methodology, the quality centre will facilitate both MAC and MAS. MAC offers free methodological advice with access to a pool of experts spanning academia, the private sector, the GSS and National Statistics Institutes (NSIs). MAS is a free service providing methodological advice and guidance. | Quality centre |
3.3 Engage with the development of NSQRs and take on board recommendations | The NSQRs outline next steps for the GSS, and understanding new developments is crucial. Outcomes from the reviews inform further developments in the methods used and identify where to build capability across the GSS. Task and finish groups are set up for some NSQRs to help implement next steps. For more information on these, email gsshelp@statistics.gov.uk | GSS |
3.4 Ensure there is space within departments to horizon scan for upcoming issues or opportunities | It is tempting to get caught up in the routine of producing statistics with little time to look at the wider picture. The implementation of Reproducible Analytical Pipelines (RAPs) (in goal 4) may free up statistical producers’ time to innovate. An example of this can be seen in the HPI case study. | GSS |
3.5 Regularly review publications | The Code of Practice is explicit about the need to regularly review statistics to identify whether they should be maintained, changed or dropped (practice V1.6). Departments should review the strengths and limitations of their statistics (practice Q3.5) through consultation with users, and review who their users are. | GSS |
Case study: United Kingdom (UK) House Price Index
Each month, the UK House Price Index (HPI) presents a first estimate of average house prices in the UK based on the available sales transactions data for the latest reference period. The first estimate is then updated in subsequent months as more sales transaction data become available. In March 2017, there was a large increase in the number of revisions between first and subsequent estimates. This negatively affected some users’ confidence in UK HPI.
After investigating, ONS established that the revisions were being driven by volatility in new build property prices, compounded by an operational backlog in Her Majesty’s Land Registry registering new build sales transactions. Steps were taken to improve the methods by changing the calculation for the first estimate to reduce its sensitivity to the impact of new build transactions. The approach was developed by GSS methodologists, and several options were tested before a final one was chosen.
As a result, the scale of revisions to the first estimate of UK HPI annual change to average house prices has reduced and is more stable over time. This is an example of where an external change called for innovative methods to be developed to improve the quality of the statistics. Further information on this case study can be found in the Code of Practice for Statistics: Q2 case study.
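As a simple illustration of the underlying idea (this is not the UK HPI methodology, and the figures are invented), the scale of revisions can be quantified by comparing first estimates with the latest published values:

```r
# Toy example: measuring the scale of revisions between first and latest
# published estimates of annual house price growth (values are invented).
first_estimate  <- c(2.1, 2.4, 2.0, 1.8)  # % change at first publication
latest_estimate <- c(2.3, 2.2, 2.1, 1.7)  # % change after later revisions

revision <- latest_estimate - first_estimate
mean(abs(revision))  # mean absolute revision: smaller means more reliable
```

Tracking a measure like this over time is one way to monitor the reliability dimension of quality described in the appendix.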
Goal 4: We will implement automated processes to make our analysis reproducible
We should be implementing Reproducible Analytical Pipelines (RAPs) or other automated processes in our statistics to reduce risk, improve auditability and take steps towards making our analysis reproducible.
Reproducible Analytical Pipelines
Producing statistics in an accurate and timely manner can be a meticulous, time consuming process. With open source software becoming more widely used, there is a range of tools and techniques that can be used to reduce errors and production time, while maintaining and even improving the quality of publications. A Reproducible Analytical Pipeline is a recreation of part of the statistical production process so that it can be easily reproduced, tested and audited. The key feature of RAPs is that they are reproducible; in the future we should be able to look back at anyone’s work and accurately reproduce every step in the process.
The potential time savings for analysts are substantial: greater automation of statistical production frees up their time to focus on the interpretation of the results. Another benefit of implementing RAP is building a process that is completely transparent, auditable and verifiable. Overall, this eases the quality assurance process, reduces risk and improves quality. More detailed information on the benefits of RAP can be found in the Reproducible Analytical Pipelines blog post on the Data in government blog.
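To make this concrete, here is a minimal sketch of what a RAP-style production script might look like in R; it is not taken from any department's pipeline, and the file paths and column names are hypothetical:

```r
# Illustrative RAP-style script: each step is a small function that can be
# unit tested, and the published output can be regenerated from the raw
# input by re-running this one script.
library(readr)
library(dplyr)

read_raw <- function(path) read_csv(path, show_col_types = FALSE)

remove_missing <- function(df) filter(df, !is.na(value))

summarise_by_region <- function(df) {
  df |>
    group_by(region) |>
    summarise(total = sum(value), .groups = "drop")
}

main <- function() {
  raw     <- read_raw("data/raw_returns.csv")
  results <- summarise_by_region(remove_missing(raw))
  write_csv(results, "output/regional_totals.csv")
}

main()
```

Because each step is a function, it can be covered by automated tests (for example, with the testthat package), which is what makes the pipeline auditable as well as reproducible.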
Finding the balance
Implementing even a few RAP techniques can benefit the timeliness, auditability and quality of statistics. However, these approaches should not act as a replacement for an analyst overseeing the full end-to-end process. Automated processes should not be implemented and forgotten; once they are in place they will need to be consistently reviewed and quality assured. Automation is about people: we still need analysts who are knowledgeable about both coding and statistics for it to be effective. The human aspect is equally important, as an analyst is able to identify issues that the code may not pick up. RAP should not be used as a method to produce minimal commentary in releases; this is another aspect where the human element is so important.
Finding this balance is the key to implementing RAP successfully. This balance is likely to differ between teams depending on the level of interaction and interpretation needed for the data.
Deliverables
Deliverable | Details | Owned by |
---|---|---|
4.1 Facilitate the RAP champion network | The RAP champion network has been set up to provide expert advice and monitoring for implementing RAP across the GSS. It looks to build relationships, share best practice and lessons learnt. | Analysis Standards and Pipelines Team |
4.2 Provide training and support in RAP implementation and coding | To implement RAP, we need to have the appropriate training in place. There is free training available that has been produced by the GDS; this goes hand in hand with the RAP companion. With a base knowledge of R and the free training course, producers should be able to start implementing RAP themselves. Further support is available through the RAP champions network. We will also provide technical support for RAP implementation through BPI's data science offer. For further information, contact gsshelp@statistics.gov.uk. | Analysis Standards and Pipelines Team |
4.3 Have more RAP or automated processes in place than are currently implemented | In the consultation phase of developing this strategy it became clear that departments are at different stages in implementing RAP. Departments should make an effort to increase the level of automation, where possible and helpful, through the lifetime of the strategy. In particular, departments should identify areas of vulnerability or risk (for example, where there is a reliance on manual processes) and look to implement automation in these areas. | GSS |
4.4 Nominate a RAP champion | To stay up to date on the latest methods and establish a cross-GSS support system for implementing RAP. If they have not done so already, departments should nominate a RAP champion to represent them at the RAP champion network. | GSS |
Case study: Department for Transport (DfT)
DfT highly encourage the implementation of RAP in their processes. They have set a target that by 2020 each team will have implemented RAP in at least some part of its processes. Statisticians build this capability through training courses, the RAP champion network and a coffee and coding club. If they can identify a piece of work or a project that would benefit from the use of automated processes, they are supported to get the relevant training. The coffee and coding club is an informal setting in which colleagues can discuss coding and any challenges they are having. These sessions were originally run monthly but were so popular that they now run weekly.
The data for search and rescue helicopter statistics are produced by aggregating monthly spreadsheets from the Maritime and Coastguard Agency. These were previously checked by hand to identify any discrepancies. These checks have recently been automated using R, with a report produced by R Markdown including maps, counts and tabulations. This makes it quicker and easier for the person carrying out QA to spot discrepancies. Further information on this example is available in the Code of Practice for Statistics: V4 case study.
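As an illustration of this kind of automation (not DfT's actual code), a check like the following might flag large month-on-month changes in the aggregated returns for human review; the column names and threshold are hypothetical:

```r
# Illustrative sketch: flag month-on-month changes in incident counts that
# exceed a chosen threshold, so the analyst can investigate them.
library(dplyr)

flag_discrepancies <- function(monthly_returns, threshold = 0.5) {
  monthly_returns |>
    arrange(base, month) |>
    group_by(base) |>
    mutate(change = (count - lag(count)) / lag(count)) |>
    ungroup() |>
    filter(!is.na(change), abs(change) > threshold)
}
```

A report of the flagged rows (for example, rendered with R Markdown) then gives the person carrying out QA a short list of discrepancies to investigate, rather than a spreadsheet to check by hand.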
Delivering the strategy
We have outlined the goals we have set to improve statistical quality across the GSS. An integral part of achieving this is how we put this into practice. Both the GSS and the Quality Centre will monitor progress of the strategy.
Monitoring the strategy
Each department should draw up an action plan outlining what steps it will take through the lifetime of the strategy to achieve the GSS deliverables across the four goals. These should be produced by the quality champion and Head of Profession in consultation with all GSS members in their department. A template and an example action plan are available on this page.
Quality champions will work with their Head of Profession to provide biannual updates on the agreed actions to the Quality Centre. This will enable progress of the strategy to be measured.
The quality centre will produce biannual updates for the Statistical Policy and Standards Committee (SPSC) on overall progress on implementing the strategy.
After the two-year lifetime of the strategy, the quality centre will review the deliverables and update the strategy. We will be flexible on delivery as the environment changes. We welcome input and feedback on the strategy; please email: DQHub@ons.gov.uk.
Appendices
The GSS Quality Centre
Who we are
The GSS Quality Centre (now replaced by the Government Data Quality Hub) supports the GSS in meeting its requirements to maintain, improve and report on quality under the Code of Practice for Statistics. The team provides mentoring, expert advice, consultancy, training and guidance on the quality of official statistics. In addition, the quality centre provides strategic direction on quality across the GSS. We work closely across our division to deliver a quality service to the GSS.
The GSS Quality Centre supports the whole GSS community, but its support is not restricted to the GSS: we can and do support other government professionals. We are happy to meet with other statistics producers to discuss the services and support we offer and the best ways we can meet their needs. To get in touch please email: DQHub@ons.gov.uk.
Our vision: Where do we want to be?
- Take a leading role in identifying and addressing emerging quality challenges faced by the GSS.
- Become widely known across the GSS and wider as a centre of expertise.
- Have a measurable impact on the quality of government statistics.
- Build knowledge and capability in the team and across the GSS.
- Work flexibly and collaboratively across the team and celebrate success.
Our vision ties into the collective strategy for official statistics set out in Better Statistics, Better Decisions strategy and the Code of Practice for Statistics.
Best Practice and Impact Division (BPI)
BPI has been created to support the GSS to improve government statistics.
We provide a range of services to all those working in statistics across government. The division works towards this through seven themes: providing strategic direction, sharing best practice, consultancy, building capability, tools and standards, assessment and monitoring, and one GSS voice. Under these themes, BPI supports the GSS through advice, consultancy and training.
Please note that BPI has since been replaced by several smaller support teams.
The Government Statistical Service (GSS)
The GSS is a cross-government network, spread across a whole range of public bodies, including components of the devolved administrations and UK government departments. Led by the National Statistician, it includes statisticians, researchers, economists, analysts, operational delivery staff, IT specialists and other supporting roles.
The GSS community works together to provide the statistical evidence base required by decision-makers and support democratic debate, publishing around 2,000 sets of statistics each year, and providing professional advice and analysis to decision-makers.
The Statistical Policy and Standards Committee (SPSC)
The Statistical Policy and Standards Committee (SPSC) assists the National Statistician in promoting and safeguarding the quality of official statistics. It develops and promotes statistical policy and drives improvement in statistical methodologies, standards and classifications. SPSC reports to the National Statistics Executive Group (NSEG), providing biannual updates and agreeing objectives and priorities for statistical policies and standards.
Quality definition
There are several definitions of quality; it is most usefully defined in terms of how well outputs meet user needs, or whether they are ‘fit for purpose’. The Code of Practice states that quality means that statistics fit their intended uses, are based on appropriate data and methods, and are not materially misleading. This definition is a relative one, allowing for various perspectives on what constitutes quality depending on the intended use.
Quality dimensions
To determine whether outputs meet user needs, we measure quality in terms of the five quality dimensions of the European Statistical System (ESS):
Relevance
The degree to which statistics meet current and potential user needs in both coverage and content.
Accuracy and reliability
Accuracy is the closeness between an estimated result and the (unknown) true value.
Reliability is the closeness of early estimates to subsequent estimated values.
Timeliness and punctuality
Timeliness is the time gap between the publication and the reference period of the estimate.
Punctuality is the gap between planned and actual publication dates.
Accessibility and clarity
Accessibility is the ease with which users can access the data.
Clarity is the quality and sufficiency of the metadata, illustrations and accompanying advice.
Coherence and comparability
Coherence is the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar.
Comparability is the degree to which the data can be compared over time and domain.
Action plan and reporting template
The quality centre have created an action plan and reporting template, which includes an example action plan to aid departments in filling out their own. If you would like a copy, please email DQHub@ons.gov.uk.
Contact
Email: DQHub@ons.gov.uk