How to refer to numerical information
As part of statistical production, government bodies:
- collect numerical information
- undertake research
- make estimates of various kinds
They will often wish to make these numbers public. These data can sometimes be described as ‘official statistics’ when they are made public. The Code of Practice for Statistics applies to these releases. We refer to the Code of Practice for Statistics as ‘the Code’ throughout this guidance.
There are several terms that can be used to describe different types of numerical information.
What we mean by ‘administrative data’
Administrative data refers to information that is not collected specifically for statistics or research. These data are collected by government departments and other organisations for uses such as registration, transactions, and record-keeping. The data are usually collected as a by-product of providing a service. Administrative data are often used for operational purposes and their statistical use is usually secondary.
What we mean by ‘management information’
Management information describes information that has been combined and collated. It is used in the normal course of business to inform:
- operational delivery
- policy development
- the management of organisational performance
Management information is usually based on administrative data, but it can also be a product of survey data. The terms ‘administrative data’ and ‘management information’ are sometimes used interchangeably.
What we mean by ‘official statistics’
These are statistics published by a Crown body, or a body listed within an Official Statistics Order. They are sometimes based on administrative data but can also be based on survey data. Official statistics follow the standards of the Code.
What we mean by ‘accredited official statistics’
Accredited official statistics are called National Statistics in the Statistics and Registration Service Act 2007. They are official statistics that have been independently reviewed by the Office for Statistics Regulation and found to comply with the standards of trustworthiness, quality and value in the Code of Practice for Statistics.
Management information or official statistics
Administrative data and management information are sometimes used to support official statistics. Sometimes management information is made public separately, and may or may not contribute to official statistics. Crime statistics are a good example of this.
This raises the question of which sets of published aggregate management information should, in the public interest, be treated as official statistics, and which may reasonably continue to be produced without full compliance with all aspects of the Code. You should speak to the Head of Profession for Statistics (HoP) about whether a particular set of data should be treated as official statistics. This decision should also be informed by the ‘Labelling Official Statistics’ GSS guidance document.
Not all cases will be straightforward. There may be some cases in which the UK Statistics Authority (UKSA) will decide that data should be published as official statistics in future, even if they have not been published as official statistics before. This would mean that the data must fully comply with the Code to help maintain public confidence in statistics.
Sometimes the UKSA may have a different opinion from the organisation that produced the statistics. The Authority may decide that the statistics should be categorised as official statistics, in line with the recommendations of the Bean review.
Official statistics are also subject to pre-release access rules set out in secondary legislation. Pre-release access is the practice of making official statistics in their final form available before publication to specific individuals not involved in their production. This concept is easier to explain in cases where official statistics are produced from surveys controlled by statisticians, or when statistical techniques are used to combine data sources.
‘Final form’ statistics are statistics and commentary which have been quality assured and have been signed off by the responsible statistician. They are ready for publication.
In cases where official statistics are derived directly from management information, the management information could be the same as final statistics, or very similar to them. There may be concerns that sharing the management information before publication breaks pre-release access rules. It is appropriate to continue using management information in the normal way, including sharing data between government departments where necessary. The important constraint is to ensure that there is no public use of unpublished management information, which could undermine the official statistics and so breach the Code.
This guidance does not replace the specific rules governing pre-release access to statistics in their final form.
Understanding the difference between management information and official statistics
Decisions on whether data should be treated as official statistics should be taken in consultation with the Head of Profession for Statistics (HoP) and informed by the provisions set out in this guidance.
General principles for publishing management information
There are five main principles to consider when publishing management information.
Principle 1: Providing maximum value
Maximum value should be obtained from information held by the public sector to:
- inform policy and operational decision making
- help direct economic and commercial activities
- inform wider public debate
As the availability of near real-time management information has increased, so has interest in using it to inform decision making.
Government statisticians are there to help organisations make full use of management information for policy, operational, and managerial purposes. This includes management information that will form the basis of official statistics before they are released, if:
- conditions of use are in place to preclude the public use of unpublished management information
- the statistics are not in their final form for release
You should use the best available data when making important decisions. Sometimes this will be the quality assured official statistics, but other times management information can provide additional insights. You should be careful when interpreting management information as it is often incomplete. It is not always quality assured and not necessarily fully representative of a topic. Government statisticians can help in this situation by working to improve the quality of the data that comes from administrative systems, providing expert analysis and helping to interpret meaning and insight from the data.
Organisations should use the skills of professional statisticians to improve the information that comes from administrative systems and to achieve maximum value at all stages of data use.
Principle 2: Using proportionate safeguards
Where unpublished management information contributes to official statistics, it should be handled with care. A range of safeguards can be used to reduce the risk of a breach of the Code of Practice. Stricter safeguards are needed when data are very close, or identical, to final official statistics.
Principle 3: Equality of access
Wherever possible, there should be equality of access to the data on which any public statements are based.
Public statements should use the latest published official statistics. They should not be based on unpublished management information that contributes to official statistics. Where there is a good reason to use more up-to-date management information in a public statement, that data should be published before, or at the same time as, the public statement.
Sometimes it may be necessary to make a public statement based on unpublished management information that covers similar ground to a future official statistics release. In cases like these, seek advice from the HoP to mitigate the risk of pre-empting or compromising official statistics. The HoP will need to be satisfied that such a use of management information is justified, and that it will not undermine the planned official statistics release, or broader public trust in official statistics.
Sometimes it may be necessary to publish management information ahead of the official statistics release it contributes to. In cases like this, the HoP should review the publication schedule to see whether the official statistics might be released earlier. If this is not feasible, an ad hoc statistical release should be announced and published. If exemptions to the Code are needed, the UKSA’s regulatory function will need to be informed. The Authority is always available to provide advice in cases of doubt.
The term ‘ad hoc’ relates to statistical analyses produced and released where there is a pressing need for official statistics in the public interest.
Public statements can damage public trust in statistics if they reveal unpublished data before related official statistics are published. Depending on the content of the statement, appropriate responses might include publishing a statement about the circumstances or publishing an ad hoc statistical release.
Public statements should not be made based on unpublished management information that contributes to official statistics. If this happens, ask for advice from the HoP or from the UKSA straight away. We cannot maintain public trust in statistics if we only release favourable data.
Principle 4: Transparency
Transparency is an important part of building trust in statistical information and should guide decisions about the use and release of data and statistics.
More departments are making management information publicly available in line with the principles of Open Data. Publishing data in an orderly way is one of the best methods of ensuring equality of access and providing value for the public. You should:
- consult the HoP about whether to treat the data as official statistics
- use the skills of professional statisticians to improve the information derived from management and administrative systems
- follow your organisation’s processes for protecting information — this includes following the Data Protection Act and consulting the relevant Information Asset Owner, if necessary
- apply statistical disclosure control methods as needed
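As an illustration only, one common disclosure control method is primary suppression of small counts before publication, so that small groups of individuals cannot be identified. This is a minimal sketch, not any departmental standard: the threshold of 5, the ‘c’ marker, and the example table are all hypothetical.

```python
# Minimal sketch of primary suppression for statistical disclosure control.
# Counts below a chosen threshold are replaced with a marker before release.
# The threshold (5), the marker ('c'), and the table are hypothetical examples.

SUPPRESSION_THRESHOLD = 5

def suppress_small_counts(table, threshold=SUPPRESSION_THRESHOLD):
    """Return a copy of the table with counts below the threshold
    replaced by 'c', a conventional 'confidential' marker."""
    return {
        category: (count if count >= threshold else "c")
        for category, count in table.items()
    }

# Hypothetical aggregate management information
raw_counts = {"Region A": 120, "Region B": 3, "Region C": 47}
print(suppress_small_counts(raw_counts))
# Region B's count of 3 falls below the threshold, so it is suppressed
```

In practice primary suppression alone is rarely enough: secondary suppression is usually also needed so that a masked value cannot be recovered from row or column totals. Statisticians and the HoP can advise on appropriate methods for a given dataset.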
If the management information does not contribute to official statistics, the Code of Practice does not formally apply. But voluntary adoption of the Code is still advised, and the HoP can advise on which elements are the most relevant to the situation. For example, trust in the data would be enhanced if supported by information about methods and quality. The HoP should also consider whether the need to publish the data might be an indicator of its importance and therefore whether it should be published as official statistics.
The Office for Statistics Regulation has published a draft guide to voluntary compliance with the Code of Practice. This explains how organisations that are not formally bound by the Code can still use its principles and practices to:
- promote trustworthiness
- ensure quality
- improve the value of information for the public
We recommend that departments use this guidance when publishing information that is not covered formally by the Code.
The more data an organisation can publish from its administrative or management sources as official statistics, the less likely it is that the organisation will have to make an unplanned release under the Freedom of Information Act.
Principle 5: Integrity of Official Statistics
Not all situations will be straightforward. Sometimes these principles may conflict with each other. The HoP should decide how to balance these principles to ensure that no action is taken that might undermine confidence in the independence of related official statistics when they are released.
Internal use of management information
Management information systems are rarely owned or controlled by statisticians. This means you will need to work with colleagues in other parts of your organisation to ensure best practice is followed.
Unpublished management information should be handled carefully where it contributes to official statistics. A range of safeguards can be used to reduce the risk of a breach of the Code of Practice. You should work with the HoP to make risk-based judgements about the appropriate level of safeguards needed. Stricter safeguards are needed when data are similar, or identical, to the final official statistics.
Safeguards might include:
- limiting access to colleagues who need to see the data — this should include only people who have a legitimate need to use the data for policy, managerial, operational, or other appropriate decision-making purposes in advance of the official statistics publication
- clearly marking data as ‘sensitive’
- ensuring people with access understand their responsibilities under the Code and know they must not put the data in the public domain
- keeping records of who has access to the data — if the data are particularly sensitive it should be a requirement to keep a list of the names of people who have access to the data, but if the data are less sensitive you could record details of groups of people with access
- applying clear ‘conditions of use’ for access to the data
Users of unpublished management information that contributes to official statistics must:
- follow the ‘conditions of use’ attached to the data — this also applies to anyone outside the owner organisation, such as staff in other organisations who may be collaborating on an initiative that spans several departments
- avoid ad hoc or selective comments on, or reporting of, unpublished data
- avoid making any public statement that prejudges or pre-empts the contents of any future statistical release
When to publish management information
Any public statement should ideally use official statistics if they are available. But if someone who has access to internal management information wants to use it in a speech or statement, an assessment needs to be made about whether it is in the public interest for this information to be used before the scheduled publication of official statistics. You should consult the HoP who can advise on options, and the data should be published using the methods set out in this guidance.
We must avoid the selective publication of favourable data, or any action that might encourage such a perception. Departments should aim to publish official statistics according to a timetable which achieves a balance between:
- the timeliness of release
- user needs
- resource availability
This will help to avoid the need to publish unscheduled data.
You should consider a routine official statistics publication if there are repeated ad hoc releases of similar content.
Public statements can sometimes unintentionally include references to unpublished management information that contributes to official statistics, despite best efforts to avoid this. Such statements can damage public trust in statistics. Do not try to hide it if this happens. Ask for advice from the HoP, or from the UKSA straight away.
Appropriate responses will depend on the content of the statement, and may include publishing a note confirming the source of the data and any relevant context, or publishing an ad hoc release.
It might be appropriate for an audit or investigation to identify whether any additional safeguards should be put in place. It may also be appropriate to invite statisticians to review the text of speeches or statements to check any references to data are both accurate and appropriate for the public domain. In some departments statisticians are asked to check briefing packs for Select Committee appearances. Unpublished management information can still be revealed by mistake. If a significant piece of unpublished numerical information is revealed during a hearing, statisticians should be ready to respond appropriately.
How to publish management information
The core principles of the Code should be followed when any data are released, even if they are not considered official statistics. This means that:
- the data should be accessible
- data quality and limitations should be explained
- there should be clear separation between the published data and any policy or political message
There are four main ways that management information can be published. It can be published as:
- an ad hoc management information release
- a regular release of management information
- part of a planned official statistics release, where the release schedule is changed to include relevant management information
- an ad hoc official statistics release
It is important that releases are clearly labelled as ‘management information’, ‘official statistics’, or ‘accredited official statistics’, as appropriate.
Scenarios and case studies
The different processes and systems in place across government make it difficult to create a comprehensive set of rules for all situations. Data owners, analysts and users must often make quick judgements about the correct course of action. To help make these judgements, the principles described in this guidance are expanded in a series of scenarios, demonstrated by real-life case studies.
Police recorded crime statistics
Crime records begin at the local level as administrative data. They are then collated as management information and used by the police, ministers, and officials to inform operations. The records are published regularly to the police.uk website as management information.
The management information is also aggregated and quality assured to create official statistics. These sources have different purposes. For example:
- the public, the media and policy makers use the official statistics to understand trends in crime
- ministers and senior officials use the monthly management information to identify emerging crime threats that need an immediate policy response
- the public use the police.uk website to map crime and identify hotspots
- academics use the official statistics and management information to develop models of crime
Data published to the police.uk website does not cover all crimes and is not in the quality assured form that is published as official statistics.
Apprenticeship statistics
Official statistics about people starting apprenticeships are produced by the Department for Business, Innovation and Skills (BIS). The source of the data is a BIS administrative system designed to support funding and operational functions. Many people in the Skills Funding Agency (SFA) and BIS have operational access to the dataset. Because the apprenticeship programme is high profile, it is subject to a range of formal boards and groups in government. BIS support these by managing the cascade of information, analysis, performance measures, and dashboards.
BIS supply data about people starting apprenticeships to the Earn or Learn Taskforce, which is chaired by the Minister for the Cabinet Office. These meetings are treated as operational, and BIS ensure they have the best data to make decisions. In line with the guidance, BIS share management information containing unpublished data that are ‘incomplete and not quality assured’, or ‘not fully representative’. These data may give an indication of trends and may sometimes be very close to the final official statistics.
The monthly management information for January was the same as the number statisticians expected the ‘final form’ official statistics to show. But the National Statistician’s office advised that, while the data was close to the ‘final form’ number, it was not the final form official statistics release, which would need to comply with the process for formal pre-release access.
Public sector finances
A market sensitive official statistics publication on Public Sector Finances is produced jointly by the Office for National Statistics (ONS) and HM Treasury. This brings together a wide variety of administrative data sources from across the public sector, including HM Revenue and Customs (HMRC) tax data and HM Treasury data from cash monitoring systems. Approximately 50 people have access to the administrative datasets within HMRC and HM Treasury. Approximately 30 people have access to compiled management information circulated by the Treasury, including Treasury ministers, senior management, and people from other departments.
There are three main uses of the data.
Tax teams and senior managers monitor progress on tax receipts, sometimes on a daily basis or on days that affect forecast judgements. The Treasury and Debt Management Office (DMO) use the data to ensure best value for money decisions are made around daily financing of activities.
Treasury senior ministers and staff from the Office for Budget Responsibility (OBR) and DMO monitor progress against fiscal targets. The data provide a real-time indicator of the economy and the government’s cash position.
Media, commentators, and the government monitor the general fiscal position and the gilt market. They do this to assess future supply and demand for gilts.
Some of the data is in draft form and can be very different from the final published statistics. Later in the monthly cycle the data is closer to the final statistics, but this can still change quite a lot once it is compiled by ONS.
As the source data and management information underpin market sensitive releases, there is an urgent need to protect the data, and to ensure no public statements are made based on it.
The main mitigations are to:
- ensure that conditions of use are signed by all recipients of the management information, including ministers and people from external organisations
- ensure that everyone is aware of the restrictions on their use of the data
- tightly control circulation to an expected schedule so that ministers and senior stakeholders know when to expect news and do not seek it from other sources
NHS Scotland waiting times
Operational data on waiting times is collected in the administrative systems of hospitals across NHS Scotland. It is used for management purposes by hospital managers. NHS Scotland collates this information and sends weekly aggregated data to the Scottish Government on a defined set of indicators. These data are used for discussions with ministers, and for management discussions between the Scottish Government and NHS Scotland Boards and Hospitals. Weekly and monthly official statistics are published by NHS National Services Scotland (NSS) based on the same data source.
It is important that data is exchanged between NHS Scotland and the Scottish Government to ensure that the Scottish health system can be managed and controlled effectively. People who receive the data are therefore generally aware of what the official statistics are likely to say. So, to ensure the official statistics are not undermined, there is an agreement that ministers and officials will only proactively comment in public on data presented in official statistics. They can however respond to management information that has been published by others, such as individual Health Boards.
Ofsted short inspections
Soon after Ofsted moved to a new framework for inspecting schools, which involved the introduction of short inspections, there was public concern about the small number of inspections that had taken place and about how the new framework was working.
Ofsted publishes statistics every school term about inspections of maintained schools. It also publishes some monthly management information. But His Majesty’s Chief Inspector wanted to make a public statement about short inspections and refer to more up-to-date information on the number of completed short inspections. The latest published management information showed 207 inspections, whereas over 300 had taken place by the time the Chief Inspector asked for the data. Details of all these inspections had already been put in the public domain in the individual inspection reports, but the published data had not caught up with the latest position.
Before the Chief Inspector made his speech, a special release of management information was published. This ensured equality of access for everyone. While there was already information in the public domain on over 300 inspections, the combined total was not readily available.
Prison reform speech
In February 2016, Number 10 wanted to use unpublished management information in a speech the Prime Minister (PM) was making on prison reform. If the PM had used unpublished data, there was a risk it would not have been verified and would not have followed correct release practices. It was decided to publish a single ad hoc release so that the data would be in the public domain at the time of the speech.
One difficulty was ensuring the quality of the data as there was not much time to produce the ad hoc release before the speech. Because of the short deadline to validate the data, some available data were not included to avoid quality concerns.
Statisticians worked with Private Office officials to understand more about what the PM wanted to say. By doing this they were able to find a compromise that addressed the issue the PM wanted to speak about while ensuring the speech accurately reflected the data.
There was significant public interest in the weeks after the release and speech. This has led to discussions about whether the management information should be published regularly.
Select committee evidence
A senior civil servant who was giving evidence to a parliamentary select committee unintentionally made a statement based on unpublished internal management information they had been briefed on.
The Chair of the select committee then asked a parliamentary question to get some context for these data. The relevant Minister confirmed that the information had come from management information, which would be included in the next quarterly publication. The Chair of the select committee then wrote to the Chair of the UKSA, complaining that it was difficult for the select committee to hold the Government to account if unpublished data was used in evidence.
The Chair of the UKSA replied publicly after the matter had been investigated. They said that in this case the relevant HoP had not been involved in producing or checking the briefing given to the senior civil servant. This was against the department’s usual practice, where the relevant analyst was usually asked to check and sign off the accuracy of the statistical and management information being used in briefings for select committee hearings. The Chair expressed regret that the department’s usual practice had not been followed. They suggested that the corresponding official statistics could be published on a more frequent basis to help avoid the situation of officials quoting unpublished data.
Legal aid Exceptional Case Funding
A Legal Aid Agency (LAA) team of around 10 caseworkers process and determine applications for Exceptional Case Funding (ECF) for cases beyond the normal scope of the legal aid system. Managers and legal advisers keep a close watch on the management information because it relates to a new scheme which is still being tested and challenged in the courts.
There is a quarterly statistical release, which is usually published three months after the end of the quarterly period that the statistics relate to. This allows two months for the application to be processed and decided, and three weeks for the statistical bulletin to be compiled.
It was necessary to provide more up-to-date information for a court case. After taking advice from the National Statistician’s office, the HoP decided to preannounce and publish an ad hoc release of early data. This was clearly labelled as management information, with caveats about differences from the official statistics. This ensured equality of access, while still complying with the deadline set by the court order.
Ofsted inspection judgements
The Office for Standards in Education, Children’s Services and Skills (Ofsted) publishes official statistics each school term about the outcomes of school inspections, or ‘inspection judgements’. It also publishes each inspection report. In the past, the official statistics were the only published aggregated figures on inspection outcomes. However, advances in technology mean that it is now possible to scrape the individual inspection judgements from Ofsted’s website and aggregate the data. This is done by other organisations and at least one non-government organisation publishes their own aggregated data and infographics for school inspection judgements, updating them twice each working day. The statistics they present are different from the official statistics in both coverage and methodology. Nevertheless, some users, including the media, have referred to this data rather than the official statistics, possibly because more up-to-date information was perceived to be better.
This indicated a user need for more timely data. The HoP considered whether this need could be met by publishing frequent management information, or whether this would risk pre-empting the official statistics, as both rely on the same data set.
In this case, timeliness is a user-driven quality measure that takes some priority over accuracy. Ofsted now publishes management information every month. The data has some bias, so they give clear warnings to users about the issues with data quality. Management information is not concealed, even if it gives an indication about the official statistics.
Ofsted is also planning to update its Data View infographic tool each month to meet user needs. Ofsted will consider the possibility of publishing daily updates to the tool with clear warnings about the quality of the data.
Student loan applications
Information about the processing of student loan applications in England is published in an annual accredited official statistics release. In 2009 there were significant problems with the processing of applications, which attracted a lot of media interest.
The Student Loans Company were getting a lot of enquiries about the problem and asked for advice from the HoP at the parent department, which was the Department for Business, Innovation and Skills (BIS). The organisations decided to introduce a series of interim official statistics publications on processing of student loan applications in England. The official statistics would be published every two weeks. By increasing the frequency of publications, the press office could direct people to the website to provide consistent, up-to-date information, and a transparent response.
The series started in October 2009 and finished in February 2010. Similar arrangements were also made for the next three academic years. After this, it was decided that there was no demand for a fortnightly or monthly release because the applications processing was working efficiently, and there was much less public interest.
Personal Independence Payment statistics
The Department for Work and Pensions brought forward official statistics on Personal Independence Payment cases before a Work and Pensions Select Committee meeting. If they had not published the statistics before the meeting, they would have been discussing official figures that were two months old and presented a different picture of performance. The departmental representatives may also have been at risk of accidentally mentioning unpublished data.
The lead analyst for that area wrote to the HoP to ask for permission to release the figures early in an ad hoc release to avoid prejudicing the debate. Only data that was considered relevant to the committee session was published. The HoP agreed and the information was released four days before the Select Committee hearing.
Public disorder in August 2011
From 6 August 2011 to 8 August 2011 there were outbreaks of public disorder which began in London and spread to other cities in England. During this period and the aftermath, the Ministry of Justice (MoJ) received daily information from the courts that were dealing with defendants identified as being involved in the public disorder. The information was taken from court registers.
The courts data met the needs of His Majesty’s Courts and Tribunals Service (HMCTS), the National Offender Management Service (NOMS) and other criminal justice agencies for demand planning purposes. This information was linked to the prison population data to help ensure there was enough space in prisons to deal with the people involved in the public disorder.
There was significant public interest in timely data on the processing of defendants. The need for the data to be released quickly outweighed the benefit of waiting for the additional accuracy and commentary from the accredited official statistics release.
MoJ worked quickly to publish twice weekly releases of limited analysis of the outcome of the initial court hearings. More comprehensive bulletins were released later covering:
- sentencing outcomes
- prison population
- previous offending histories of those involved
The accredited official statistics on court outcomes are based on completed cases. But in this situation, it was important to publish information on initial outcomes at the magistrates’ courts, even if the case was still ongoing. Waiting until all the cases had completed would have resulted in a delay of several months before any information was released to the public. This would have meant losing the opportunity to show how the criminal justice system was responding to the public disorder.