Social capital harmonised standard

This harmonised standard is under development. We have published it at this stage so that users may benefit from the research that has been completed. These questions have been through one round of cognitive testing; however, users may want to test them further, or ask the GSS Harmonisation Team to complete further testing.

If you wish to produce statistics based on this harmonised standard or would like to provide feedback, please email

Policy details

Publication date: 13 July 2021
Owner: GSS Harmonisation Team
Who this is for: Users and producers of statistics
Type: Harmonisation standards and guidance

What is harmonisation?

Harmonisation is the process of making statistics and data more comparable, consistent, and coherent. Harmonised standards set out how to collect and report statistics to ensure comparability across different data collections in the Government Statistical Service (GSS). Harmonisation produces more useful statistics that give users a greater level of understanding.

This harmonised standard is under development. This page sets out our work to date and is updated when there are significant developments on the topic.

What do we mean by social capital?

Social capital is a topic of interest across government. Definitions of social capital vary, so for this standard we have used the Organisation for Economic Co-operation and Development (OECD) definition, as this allows for international comparisons. The OECD defines social capital as “networks together with shared norms, values and understandings that facilitate cooperation within or among groups”. This can be seen in four main aspects:

  1. personal relationships
  2. social network support
  3. civic engagement
  4. trust and cooperative norms

Research shows that higher levels of social capital are beneficial and can be associated with better outcomes in health, education, employment and civic engagement. This has been shown in various OECD reports, which discuss how human and social capital are important to well-being and economic growth.

Questions and response options (inputs)

The harmonised questions in this standard are designed to collect basic information, for use in the majority of surveys. They are not designed to replace questions used in specialist surveys where more detailed analysis is required.

Topic: Generalised trust
Question stem: On a scale of 0-10, where 0 is not at all, and 10 is completely, in general how much do you trust most people?
Response options: [0-10 scale]

Topic: Neighbourhood belonging
Question stem: To what extent do you agree or disagree with the statement ‘I feel like I belong to this neighbourhood’?
Response options: Strongly agree, Agree, Neither agree nor disagree, Disagree, Strongly disagree

Topic: Rely on
Question stem: To what extent do you agree or disagree with the statement ‘I can rely on the people in my life if I have a serious problem’?
Response options: Strongly agree, Agree, Neither agree nor disagree, Disagree, Strongly disagree

Topic: Civic engagement
Question stem: To what extent do you agree or disagree with the statement ‘I don’t have any say in what the government does’?
Response options: Strongly agree, Agree, Neither agree nor disagree, Disagree, Strongly disagree

Topic: Volunteering
Question stem: Over the last 12 months, have you given any unpaid help to clubs, groups, charities or organisations?
Response options: Yes, No

Topic: Volunteering
Question stem: Thinking about all the unpaid help you provide to groups, clubs or organisations, how often have you done this over the last 12 months?
Response options: At least once a week, Less than once a week but more than once a month, Less often than once a month
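For survey implementers, the question block above can be held as plain data so that questions are rendered and responses validated consistently. The sketch below is purely illustrative; the identifiers and the validation helper are our own and are not part of the standard.

```python
# Illustrative sketch only: the harmonised question block as plain data
# structures. All identifiers here are our own, not part of the standard.

AGREEMENT_SCALE = [
    "Strongly agree",
    "Agree",
    "Neither agree nor disagree",
    "Disagree",
    "Strongly disagree",
]

QUESTIONS = {
    "generalised_trust": {
        "stem": "On a scale of 0-10, where 0 is not at all, and 10 is "
                "completely, in general how much do you trust most people?",
        "options": list(range(11)),  # 0-10 write-in scale (integers)
    },
    "neighbourhood_belonging": {
        "stem": "To what extent do you agree or disagree with the statement "
                "'I feel like I belong to this neighbourhood'?",
        "options": AGREEMENT_SCALE,
    },
    "rely_on": {
        "stem": "To what extent do you agree or disagree with the statement "
                "'I can rely on the people in my life if I have a serious "
                "problem'?",
        "options": AGREEMENT_SCALE,
    },
    "civic_engagement": {
        "stem": "To what extent do you agree or disagree with the statement "
                "'I don't have any say in what the government does'?",
        "options": AGREEMENT_SCALE,
    },
    "volunteering_any": {
        "stem": "Over the last 12 months, have you given any unpaid help to "
                "clubs, groups, charities or organisations?",
        "options": ["Yes", "No"],
    },
    "volunteering_frequency": {
        "stem": "Thinking about all the unpaid help you provide to groups, "
                "clubs or organisations, how often have you done this over "
                "the last 12 months?",
        "options": [
            "At least once a week",
            "Less than once a week but more than once a month",
            "Less often than once a month",
        ],
    },
}

def is_valid(question_id: str, answer) -> bool:
    """True if the answer is one of the allowed response options."""
    return answer in QUESTIONS[question_id]["options"]
```

A renderer could loop over `QUESTIONS`, display each stem, and reject any answer for which `is_valid` returns False.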

Using this standard

Guidance for data collection

Please email if you are interested in using the questions in the standard. This is so that we can discuss with you the work that has previously been conducted and support you in implementing the questions while the standard is under development. This may include exploring with you what research would provide you with the appropriate level of confidence to do so.

Types of data collection this principle is suitable for

These questions were tested in online mode and have not been tested in paper or interviewer-led modes. We will update this page if we are made aware of the questions being used in other modes.

We suggest presenting the first question in the question block with a write-in box for respondents to enter a figure between 0 and 10. This is explained further under the ‘Development of this standard’ section.
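Where the write-in box is used, responses arrive as free text and need checking before analysis. As a minimal sketch (the function name and rules are our own, not part of the standard), a figure between 0 and 10 could be validated like this:

```python
# Illustrative sketch (our own, not part of the standard): validating a
# free-text response to the 0-10 generalised trust question.

def parse_trust_response(raw: str):
    """Return the integer 0-10 the respondent typed, or None if invalid."""
    text = raw.strip()
    if not text.isdigit():        # rejects '', '-1', '3.5', 'five'
        return None
    value = int(text)
    return value if 0 <= value <= 10 else None
```

Both `parse_trust_response("11")` and `parse_trust_response("3.5")` return `None`, so out-of-range and non-integer entries can be routed back to the respondent for correction.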

Using this question in the Welsh language

This harmonised standard was designed in the English language. At present we do not provide a Welsh language translation, as user demand for this standard is UK wide and Welsh language testing has not been completed to ensure a translation is comparable and appropriate.

Harmonised standards based on Census research have been tested in the Welsh language, which is why we are able to provide Welsh versions of them. If you are interested in using a Welsh language version of a harmonised standard that has not been translated, please email us at

Examples of when this standard has been used

Surveys that used this principle

As the standard is still under development, it is not currently widely used by other surveys; however, the questions have been designed based on similar questions used in other surveys. Please email if you would like further information on which surveys use similar or comparable questions to those in the standard.

Presenting and reporting the data (outputs)

As this standard is still under development, we have not yet developed a harmonised way to report the data. If you have used this standard and would like to discuss how you should output the data, please email


As discussed in ‘What do we mean by social capital?’, we are using the OECD definition of social capital. Other surveys may define social capital differently, which may affect comparability. The principal component analysis conducted by the Office for National Statistics (ONS) should increase comparability; however, if you would like to discuss this further, please email

Development of this standard

In 2017, the Harmonisation Team notified the National Statistics Harmonisation Group (NSHG) that they had held discussions with the Quality of Life Team at the ONS, looking to update the standards as they had not been reviewed since their inception in 2003. As can be seen in the archived reports, while similar, the areas were not consistent with those identified by the OECD in 2013 as part of an international framework.

The standard was developed through principal component analysis and discussions with the topic group. This topic group was made up of data users and producers who had an interest in the subject of social capital, from different central government departments, devolved governments, and other organisations.
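The principal component analysis itself is not reproduced here, but the idea can be illustrated with a toy example: for two standardised survey items with correlation r, the 2x2 correlation matrix has eigenvalues 1 + |r| and 1 - |r|, so the first principal component carries (1 + |r|) / 2 of the total variance. A high share suggests the items tap a single underlying dimension, which is the kind of evidence used when grouping candidate questions. A minimal sketch (our own illustration, not the ONS analysis):

```python
# Toy illustration of how principal component analysis (PCA) indicates
# whether two survey items measure one underlying dimension.
# For standardised items with correlation r, the correlation matrix
# [[1, r], [r, 1]] has eigenvalues 1 + |r| and 1 - |r|.

def first_pc_share(r: float) -> float:
    """Share of total variance explained by the first principal
    component of two standardised items with correlation r."""
    if not -1.0 <= r <= 1.0:
        raise ValueError("correlation must lie in [-1, 1]")
    return (1 + abs(r)) / 2

# Two trust items correlated at r = 0.6: the first component explains
# 80% of the total variance, evidence that they tap one dimension.
```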

These questions were cognitively tested in February 2021.

Cognitive testing

The method used for testing was cognitive interviewing with retrospective probing. This was conducted remotely due to the coronavirus (COVID-19) pandemic. For further information, please email. With the exception of the civic engagement question (which is discussed in the findings for that question), the questions referred to in the findings are identical to the wording in the question table on this page.

General findings

When discussing the question block as a whole, participants generally said they found the questions easy to answer and that they were comfortable, which is beneficial as it will minimise respondent burden. However, it should be noted that these questions still have the potential to be emotive.

This can particularly be the case where the questions might not be expected, with participants also noting that some context would be appreciated. Therefore, some consideration is needed when deciding to include these questions, including what information should be provided both alongside them and beforehand.

Neighbourhood belonging

This question was based on an existing question asked in Understanding Society. It was successful in capturing neighbourhood belonging as a factor of social capital, predominantly due to participants talking about communities and relationships, and feeling confident when answering.

The main themes behind why participants answered as they did were the length of time they had been living in their neighbourhood, the relationships they had within the neighbourhood (and subsequently whether they felt a part of it), and ideas around community. Participants would also discuss feelings of safety, security and comfort. These three aspects also fed into their discussion of what ‘belong’ meant to them.

Participants who had lived in the neighbourhood for a significant period of time were able to answer very quickly and confidently, and would answer positively. However, those who had recently moved referenced how long they had lived there and how well they knew the surroundings and neighbours. This was particularly seen in participants who had moved at the start of the pandemic.

With regard to how ‘neighbourhood’ was understood, participants assumed that it meant the area in which they lived and answered the question accordingly, which is how we wanted them to answer. Despite answering in relation to the neighbourhood where they currently resided, some participants were unsure whether this was correct, questioning whether area of residence was indeed the target concept. This shows the potential for inconsistent answering, as well as increased burden on respondents. This may be because, to reduce bias, the participants didn’t know what questions they were going to be asked, and this was the first question that they saw.

When talking about the term neighbourhood more generally, it was seen as a walkable area which can consist of residential housing and streets, and sometimes include businesses like local shops. It is worth noting that for some, their mental model of a neighbourhood didn’t fit where they lived; this did not affect their ability to answer the question in line with what we want to measure (community cohesion as well as physical area). There was no agreed size of a neighbourhood, with many participants thinking it would depend on the area in which you lived; in an area like a town or city, it could be a section of that area rather than necessarily the whole area.

While there was not an agreed size to neighbourhoods, when asked how the term ‘local area’ related to ‘neighbourhood’, participants stated that a local area is larger than a neighbourhood. ‘Local area’ also shared many characteristics with neighbourhoods, such as still being an area that you knew well. Additionally, ‘local area’ may be seen as more official, with an example of it being tied to the local lockdowns that took place.

It is important to note that the team explicitly sampled participants from both urban and rural areas to see if there would be differences in how the term ‘local area’ was understood. From this, we found there to be little difference.

Rely on

This question was developed through conversations with our topic group and based on similar questions in surveys such as the Community Life Survey. Where participants had a good support network, they were able to answer this question quickly and confidently, even if the support network was small. Following this, participants generally had neutral to positive feelings towards the question and answering it. Where this differed was when participants had recently experienced the need to rely on others. In these cases, participants reflected on what had happened to them; if it had been a potentially negative experience, this could be uncomfortable for them. If it had been an experience where the problem was resolved, they expressed feeling neutral.

When discussing what ‘rely on’ meant to them, participants talked about being able to depend on somebody, whether just to talk to for advice, through to physically helping with a situation. This would be without hesitation and without expecting anything in return. Serious problems were seen to be issues large enough that you were unable to fix them yourself and that could potentially affect your life in a significant way. Finances, mental and physical health, break-ups, crime and career problems were discussed as examples of serious problems. Participants did, however, also note that they could be short-term, temporary problems, such as a flat tyre, which, while not necessarily serious, may mean you need to rely on someone to, for example, come and pick you up.

Generally speaking, those close to the participants, such as partners, friends and family, were discussed as part of their social networks. Who they would ask for help in practice depends on the situation; for example, if family lived further away, they may speak to friends who live closer first. This might not be a specific thought that they have, or a specific group that they think about, but an instinct that they know they could speak to somebody if needed.

It should be noted that when the findings from this question were discussed with the topic group, it was felt that this was looking at whether you could rely on people in your social support network, as opposed to measuring companionship. While companionship and reliance on specific close friends are both elements of the same domain, if you wish to measure companionship specifically, there is a question within the Loneliness harmonised standard.

Generalised trust

Participants understood what the question was asking them, and some mentioned they didn’t have difficulty answering. However, there were instances where, even if they discussed feeling they had understood the question, they were thinking about family and friends rather than the concept set out in the question, which is ‘most people’.

From this, we have recommended putting this question at the start of the question set, as it may then be less burdensome on respondents: they will be going from a broad concept (most people) to more specific ones (neighbourhood, people in my life). It should, however, be noted that this ordering has not yet been tested (in the testing round, this question was in the middle of the block) and therefore may need to be tested before being used within a survey. Please email for more information.

The question proposed for general trust comes from the OECD. The response scale here is ‘0-10’, which can be difficult to display in an online data collection. To understand the most appropriate way to display the response option, we presented respondents either with a list from 0 to 10 presented vertically down the page, or a free text box where they could type a numeral between 0 and 10. Pictures of these two presentations are included for further explanation.

Option one – list from 0 to 10:

[Image: radio button survey prototype]
Option two – free text box:

[Image: free text box survey prototype]

The images show the two options that were tested for the generalised trust question. Option one shows a vertical list from 0 (with the label ‘not at all’) through to 10 (with the label ‘completely’). Option two shows a write-in box, with the instruction ‘enter a figure between 0 and 10’.

With regard to both prototypes, there was evidence of participants having to re-read the question multiple times because they felt it was difficult to read, and with option one specifically, due to the long list, we observed a lot of scrolling. Participants’ understanding of the numbers on the scale could be summarised with this quote: “below 5 I don’t trust at all and above 5 I trust, I tend to trust most people, that’s why I put 5 it was in the middle”.

The term ‘in general’ was the most well-understood term in the question stem. Participants understood it to mean on average, or the majority of the time, with the question not wanting to focus on a specific situation. However, the terms ‘trust’ and ‘most people’ were felt to be quite vague, which increased the difficulty of answering. This was predominantly due to trust being an abstract concept, combined with participants trusting different groups of people in their lives differently.

When asked what ‘most people’ meant to them, participants stated that it meant the majority of people, however there was some contention over whether it would include people you just met. As for ‘trust’, participants defined it as having the ability to ask somebody for help, to be open and honest with someone and believe that they will keep your secrets and not intentionally hurt or upset you.

Civic engagement

This question was based on similar questions that are asked in surveys such as Understanding Society. While participants understood the question and were able to answer confidently, they discussed feeling quite negative emotions at this question. This wasn’t necessarily about the question design specifically; rather, it brought up emotions felt by participants around politics in general.

At the time of testing, the question asked respondents to rate levels of agreement with the phrase ‘people like me don’t have a say in what the government does’. Participants expressed feeling that the phrase ‘people like me’ was very negative and that it was making an assumption before they were able to answer the question.

This negativity in some instances then made the question more difficult for participants to answer. In terms of what ‘people like me’ meant, participants were split between those with similar characteristics to them, and the vast majority of people, specifically those outside of government.

This separation between themselves and the government led to participants using negative language, discussing how they were without status and importance. Because of this, we have decided to take the phrase ‘people like me’ out of the stem and use ‘I don’t have a say in what the government does’ instead.

When answering the question, participants would use voting as an example of having a say, or for those who were unable to vote, this was the reason why they agreed with the statement. They also considered their influence over policy after voting. When asked what they thought ‘the government’ meant in this context, participants discussed thinking primarily about the UK government, making references to well-known politicians and things such as policies, Brexit and so on.

Devolved governments were also discussed by participants; however, there was still a focus on the UK government (Westminster), as participants expressed feeling that while devolved nations have some powers, ultimately power came from Westminster. When asked, participants stated that their answers wouldn’t change if they were asked about devolved governments specifically.

There was also mention of local governments; however, similar to devolved governments, they were predominantly mentioned alongside Westminster. Unlike devolved governments, though, participants were split on whether their answers would change if they were asked specifically about local governments. This was down to whether they felt they had more access to ‘local MPs’. Because of this difference, individual surveys may want to ask follow-up questions if they wish to gather more data in this area.

Volunteering question one

This question, as well as the second volunteering question, was based on the volunteering questions asked in other social surveys, for example the Scottish Household Survey. For those who answered yes at this question, there was a wide range of activities, including fundraising for charities, providing free services, corporate responsibility days, calling vulnerable people during the week, and donating clothes, food and furniture. While they were clear that giving time would be included here, there was less consistent agreement about whether donations (mainly financial) counted.

For some, if their tasks did not fit their mental models of what unpaid help (or volunteering) was, then they didn’t include them when answering the question. This could be because it was part of their everyday work, or because they just thought it was them being kind or a good neighbour. It is worth noting that if they had included these aspects of help, for some their answers to the follow-up question would have been different. Work experience, overtime and other examples were also discussed as possible examples of unpaid help.

Unpaid help was seen to be a very broad term, where you would give your time and do things while not expecting financial reward or payment. When discussing unpaid help, participants frequently referenced volunteering, using the term when we asked them what the question was asking in their own words.

Participants subsequently expressed feeling that if they had volunteered they could answer yes; however, it was discussed that volunteering could be seen as more formal, and is often linked to clubs, groups, charities and organisations and being ‘on the books’. This may have influenced participants not to include the less formal examples of unpaid help.

Clubs and groups often tended to be looked at together, with participants using examples of sports clubs or parental groups, focussing on extracurricular activities. As a result, clubs and groups were seen to carry little meaning in the context of the question, with participants discussing that they wouldn’t think of them as being in need of help, as opposed to charities, which were discussed and focussed on more.

Charities were the most collectively understood term, with participants referencing well-known charities as well as charity shops. Lastly, organisations were seen as an umbrella term, often discussed alongside charities but with more of a focus on business. Participants discussed how it would be the club, group, charity or organisation that would organise the help, or may request it.

Volunteering question two

If participants had clear times when they had given unpaid help, they found this question easy to answer. However, if their unpaid help was not regular, or had changed throughout the year, participants could have a harder time and had to think about their time on average.

Participants expressed feeling that while the first and third options were easy to understand, the middle option could be tricky to read. Participants also discussed what response options could be included instead of the middle option, often suggesting more options than were originally there, such as various other time increments.

If you would like to discuss these findings further, please email

Further information


We are always interested in hearing from users so we can develop our work. If you use or produce statistics based on this topic, please email
