This item is archived. Information presented here may be out of date.

Harmonising socio-economic background across the Government Statistical Service (GSS)

Natasha Bance

Harmonised standards provide the best way to gather data on a particular topic. By having consistency in data collection methods, analysts can:

  • get more meaning out of data
  • successfully compare data sets more widely

Consistency of data is achieved by having clear, consistent definitions and question sets that have been worded to collect the data in the best way possible. These question sets are designed to produce clearly defined outputs that can be easily compared.

The Harmonisation team in the Government Statistical Service (GSS) are working to improve the comparability, coherence and consistency of statistics in the UK by implementing harmonised standards.

The need for a socio-economic background harmonised standard

The Social Mobility Commission (SMC) defines socio-economic background (SEB) as:

“…the particular set of social and economic circumstances that an individual has come from. It permits objective discussion of the influence of these circumstances on individuals’ educational and career trajectories; and it can be objectively measured by capturing information on parental occupation and level of education.”

Establishing a harmonised method of measuring SEB is an important part of the work to improve social mobility in the UK. Many organisations collect data about SEB, but there is not currently a harmonised standard on this topic.

By creating a new harmonised standard for SEB we can:

  • help organisations better understand the composition of their workforce — once they understand their current workforce, organisations can work to boost their socio-economic diversity and promote inclusion
  • help government departments capture high quality and coherent SEB data, which can support policy recommendations

Work over the last few years has shown that several Civil Service departments are interested in measuring SEB. Several publications have emphasised the need for an accurate measurement of SEB and highlighted significant data gaps.

After the publication of these reports it was recommended that SEB harmonised standards should be developed, reviewed and updated regularly to reflect changing social norms and needs.

We made sure we included this work in our Harmonisation Workplan, which was published in February 2022. An update on this work was also included in the Winter 2022 harmonisation update.

Our approach and work

In the Harmonisation team, we use an iterative agile question development process along with respondent-centred design techniques to create harmonised standards. This means we can make sure we’re capturing user needs throughout the development process, ensuring our harmonised standards are of the highest possible quality.

The harmonisation process began with desk research to establish user needs and understand the questions that are currently being asked to measure SEB. We started to redesign the harmonised question according to question design best practice and accessibility guidance. We then discussed this work with our topic group. The topic group includes members from various government departments and Devolved Administrations. This means we can ensure our standard can be used across the GSS and that it will increase comparability of data across the UK.

After the first topic group session, we ran a round of cognitive interviewing and a quantitative survey. This work took place in the Summer and Autumn of 2022. These findings informed the redesign of the harmonised question by improving usability.

We recently completed a second round of cognitive interviewing to test how the redesigned question performs. We are currently analysing the findings from this round of testing and will publish the results in late Spring 2023.

Our findings so far

Overall, we have found that the question set works well and only minor adjustments were needed. Most participants understood the terminology we used, such as “free school meals”, “employed”, “self-employed”, and “occupation”.

Our findings suggest that small wording changes and additional guidance may be useful to help participants think about their answers to the question set. For example, we changed the guidance in the supervisor question from “day-to-day” to “regular basis” because we found that some participants only had supervisory duties on some days. This meant that the guidance went against their understanding, or “mental model”, of the question, so they were unsure how to answer.

One of the main findings from the first round of cognitive testing was that the parental occupation question was difficult for some participants to answer in its original form. For example, one participant said “Okay, there’s a lot of options here. So, I really need to focus and read up slowly and take in all the information”. This showed that some participants found the question too long to answer easily. This prompted us to redesign this question in several ways. We reduced the number of categories from 8 to 5 to reduce participant burden, whilst still providing distinct and informative responses. This did not affect the data outputs, as some of the original 8 options were aggregated into the new categories.
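The aggregation approach described above can be sketched in a few lines. The labels below are illustrative only, loosely based on the answer options discussed in this post rather than the exact published question set; the point is that when detailed response options map onto broader output categories, reducing the number of on-form options need not change the published outputs.

```python
# Illustrative mapping from detailed answer options to aggregated
# output categories. These labels are hypothetical stand-ins, not
# the actual harmonised question set.
OUTPUT_CATEGORY = {
    "Modern professional occupations": "Professional",
    "Traditional professional occupations": "Professional",
    "Senior managers or administrators": "Professional",
    "Clerical and intermediate occupations": "Intermediate",
    "Technical and craft occupations": "Working class",
    "Routine and semi-routine occupations": "Working class",
}

def to_output_category(response: str) -> str:
    """Map a detailed answer option to its aggregated output category."""
    return OUTPUT_CATEGORY[response]
```

Because, for example, the two “professional” options already aggregate to the same output category, combining them on the form leaves the published data unchanged.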

We also changed the guidance that appears with the answer options based on findings from the quantitative testing. The changes were designed to help participants select the right answer to describe their occupation. For example, previously there were options for both “modern professional” and “traditional professional”, but we found that participants did not understand the difference between them. We checked the methodology used for this question and found that both options gave the same output. We then combined the two options, which has worked well in later testing.

We also investigated a potential wording change in the free school meals question through quantitative testing. This research explored changing the question stem to ask whether free school meals were “received”, rather than whether the participant was “eligible” for them. But we found that the change could affect data quality and fail to capture a relevant group of respondents. This is a good example of when our testing has shown that a change was not necessary.

We have worked to ensure our questions meet best practice and accessibility guidelines. For example, we assessed the readability of our question set before testing to make sure it is suitable for the average reading age in the UK. Our question testing has helped us assess whether most users are able to understand the language we use in our questions.
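A readability check of the kind mentioned above can be approximated with a standard formula such as Flesch reading ease. This is a rough sketch for illustration, not the tool or method the team actually used; the syllable counter is a crude vowel-group heuristic.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch reading ease: higher scores mean easier text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Short, plain-worded question text scores markedly higher than long, polysyllabic phrasing, which is one way to sanity-check questions against a target reading age before cognitive testing.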

Upcoming work

Our upcoming plans are to:

  • complete our analysis of the second round of cognitive testing
  • hold a third topic group meeting to discuss the findings from the cognitive interviews
  • publish the new harmonised standard on socio-economic background in late Spring 2023

The new standard will be published for self-completion methods, including online and paper modes. We will also give guidance for interviewer-led modes, including both face-to-face and telephone modes. The standard will be available to use in surveys and relevant administrative data collection, which will enable comparability of SEB across government.


We have made substantial progress on the creation of a new harmonised standard for SEB. As we continue our analysis of the second round of cognitive testing, the findings will help us further refine the standard and ensure it is comprehensive.

If you would like more information on this work, or would like to share your data needs with us, please email

Evie Webb
Natasha Bance
Evie is a Research Officer in the Harmonisation team, working on the Socio-Economic Background workstream.