The future of GSS is in the hands of humans

Jade Lester

I was honoured to be invited to deliver the closing keynote at the UK Government Statistical Service (GSS) Conference in Manchester in November 2025. GSS statisticians occupy a critically important space at the intersection of evidence, public trust and national decision-making. Statistics are being reshaped on an unprecedented scale by digital technologies, including AI-assisted modelling, automated quality checks, data-linkage infrastructures, and collaborative platforms that connect disparate datasets. These systems require statisticians to work alongside technology in ways that change how they work, and that may affect professional identity.

Often when we think about challenges related to data, we consider technical issues or people-and-skills challenges, overlooking deeper cultural problems. For the GSS, thinking about cultural issues is important, since digital transformation raises questions about what it means to be a statistician and about how statisticians will work alongside technology.

Deciding how we leverage AI

Much of the discourse around AI, digital technology and work focuses on jobs being replaced. In reality, many changes associated with using AI involve the automation of tasks. These changes are different from previous forms of automation in subtle ways, because some of the digital systems professionals work with do more than support work tasks: AI helps make decisions about what humans will do.

For example:

  • business analysts now work on “priority insights” surfaced through dashboards
  • gig economy workers receive gig tasks from algorithmic managers
  • chemists investigating drug synthesis are offered candidate molecular structures by robotic systems in mechanised laboratories
  • astronomers have their data gathered and cleaned by automated telescope systems

These changes move beyond AI simply realising efficiency gains and involve a subtle shift of agency from humans to machines.

When technology defines workflows and decisions, it also begins to reshape professional identity and judgement, two characteristics at the heart of statistical practice. In the future, AI systems may determine:

  • what counts as high-quality data
  • what constitutes an outlier worth investigating
  • what uncertainty needs to be communicated to Ministers
  • what caveats are essential for helping the public understand AI

Digital tools can be used to enhance professional judgement, but they can also obscure it. That is why forms of inter-working between humans and technology need to be negotiated by professionals, not adopted automatically. When professionals have the opportunity to negotiate how they work, they are more likely to use their agency in positive, proactive ways rather than to resist change.

Agency

Studies of how professionals learn highlight that agency – the response of professionals to changes in the workplace – is an important part of digital transformation. Yet, in many public-sector contexts, digital transformation tends to be framed as a technical upgrade: a shift to cloud, a new coding standard, a centralised platform. These technical changes matter, but without considering how professionals agentically respond to change, there is a risk that they feel their values and their ability to shape their work are constrained.

Through our research (for example, the FAIR Data Accelerator and Inclusive Futures projects), we have talked with a number of professionals working in scientific and data-intensive fields. They describe subtle forms of resistance when they feel new digital systems are imposed on them in ways that undervalue their expertise. Rather than being viewed as acts of obstruction, these forms of resistance can be read as a signal that professional identity and new ways of working are in tension.

Tensions arise, for example, when:

  • automation reduces discretion over methodological choices
  • collaborative platforms make it harder to demonstrate individual contributions
  • established forms of practice are replaced
  • automation appears to overshadow the careful reasoning that was previously required

These tensions can leave professionals feeling vulnerable.

Vulnerability

Our work has identified a number of vulnerabilities experienced by individuals as they work with new, large-scale digital infrastructures. For example, our work with astronomers highlighted:

  • Concerns about changing practice – With new forms of practice, some felt less like traditional astronomers and more like technicians processing data.
  • Loss of control – As work was automated, there were worries about losing autonomy over how data are collected or used.
  • Recognition anxiety – Adjusting to working in distributed networks with remote colleagues from diverse disciplines (engineers, data scientists) led to early-career scientists worrying about gaining appropriate credit in large teams.
  • Erosion of trust – Distributed, cross-disciplinary work made it difficult to establish trust when collaboration was indirect or asynchronous.
  • Communication challenges – Multidisciplinary settings brought differences in terminology and priorities across teams.
  • Inequality of resources – Unequal access to data and tools led to frustration and perceived unfairness.

A second study we carried out, exploring how a wider group of scientists use FAIR data, revealed similar concerns, along with additional issues: automation engendered a fear of becoming “data technicians” rather than scientific thinkers, and collaboration across multiple disciplines introduced new communication problems that were not evenly shared. These vulnerabilities sometimes went unrecognised by leadership teams, who tended to focus on digital transformation and on professional learning through courses and workshops, overlooking the need to consider human identity and professional values.

These vulnerabilities are not personal shortcomings, but they are evidence that something important is at stake. Digital transformation that ignores them risks losing the collaboration of the very professionals needed to make transformation work.

The way forward: digital transformation as a negotiated journey

What might successful digital transformation look like for the GSS? Clearly, implementing the right technical infrastructure and making sure training is available are important components. However, the cultural dimensions should not be overlooked. Our work suggests that five important factors should be considered:

  1. Statisticians should be involved in shaping the use of digital tools, with professional judgement about data quality, uncertainty, and context guiding how systems are built, rather than digital systems making these decisions.
  2. Digital transformation should be a dialogue spanning leadership, technologists, and statisticians, enabled through forms of co-creation rather than a communication plan.
  3. Professional identity should be seen as an asset, not an obstacle, since the identity of a statistician carries with it norms of rigour, independence, and ethics that are vital.
  4. Learning must be multi-directional, with statisticians having opportunities to gain new knowledge and skills, and the sector learning the value of good statistical practice from statisticians.
  5. Space for negotiation must be designed into workflows, with team leaders involving teams in decision making.

In summary, statisticians are at the heart of ensuring the GSS is future-facing. However, digital transformation is a complex cultural phenomenon and cannot succeed when it is driven only from the top down. For the GSS, it requires a negotiated, bottom-up process in which statisticians themselves shape the meaning, practice, and direction of their profession.

Without this negotiation, technological change risks undermining the cultural qualities that allow the UK Government to produce trustworthy, independent, and meaningful statistics for the public good. The future of GSS is in the hands of humans, not AI.

Allison Littlejohn, Pro-Vice Provost Grand Challenge Data Empowered Societies and Professor of Learning and Technology at University College London
Professor Allison Littlejohn is Pro-Vice Provost for University College London (UCL)’s Grand Challenge Data Empowered Societies and Professor of Learning and Technology at UCL. Her work aims to expand our understanding of the socio-technical effects of digital transformation at work and to accelerate UCL’s positive impact on the world through data and information technologies that empower humans. Allison was a keynote speaker at the 2025 GSS Conference - Ahead of the curve: preparing for the future of statistics, which was sponsored by UCL Public Policy.