Learning outcomes from Analysis in Government (AiG) Month 2025 live events

Week 1: Commencing 28 April 2025

In this session, Dr Ine Steenmans, Associate Professor in Futures, Analysis and Policy at UCL, challenged the traditional focus on predictive accuracy in policy modelling. Drawing on recent studies, she argued that a model’s greatest impact often comes from how it transforms our thinking, not what it predicts. By valuing exploratory power alongside predictive power, even “wrong” models can generate valuable evidence for policy. Below are some of the supporting resources from the session:

In this session Chris Paterson and actuaries from the Government Actuary’s Department (GAD) discussed how climate considerations could affect the public sector. Below are some of the supporting resources from the session:

1. Learn about the services of the Government Actuary’s Department

2. Read related Government Actuary’s Department news stories:

3. Check out related guidance:

4. Take a look at the Actuaries in Government blog, or focus more specifically on their climate change blogs

In this session Professor Sir Ian Diamond, Head of the Analysis Function, provided an overview of different aspects of ensuring impact: working better with our customers, having a clearly defined question to answer, and communicating results effectively. He also covered some of the tools which support impact, including data sharing and linkage. Below are just some of the supporting resources available:

 

Week 2: Commencing 5 May 2025

In this session, Dr Berkeley Zych, Senior Data Policy Analyst and Operational Researcher in the Department for Science, Innovation and Technology (DSIT), discussed advanced econometric analysis, approaching complex research problems and engaging effectively with academics. Below are some of the supporting resources from the session:

In this session, Paul Matthews, Head of Profession for Statistics in the Scottish Government, discussed making an impact by maximising the tasks that add the most value and minimising those that add the least. Below are some of the supporting resources from the session:

In this session, Department for Work and Pensions (DWP) Library staff shared the impact of two elements of their service: literature searches and online resources. The team shared examples of how the Library’s literature searches of external evidence have been used to inform decision-making within DWP’s Analytical Community. This was followed by a demonstration of how the Library’s introduction of O’Reilly Learning has supported DWP analysts in developing their skills with tools like Python and Power BI. Below are some of the supporting resources from the session:

Unfortunately this event was cancelled. Keep an eye on our Eventbrite page for updates on rescheduling.

In this session Franca MacLeod and Chris Martin from Children and Families Analysis in the Scottish Government discussed the evaluation of the Whole Family Wellbeing Funding, a £500 million multi-year commitment in Scotland to support local, transformational system change in holistic family support. Below are some of the supporting resources from the session:

In this session we joined Professor Jackie Carter, Professor of Statistical Literacy at the University of Manchester (FAcSS, NTF), author of Work Placements, Internships & Applied Social Research and one of the ‘1-in-20 Women in Data’, for a ‘fireside chat’. Jackie discussed how diversity positively impacts the analytical community and how she has supported diverse talent pipelines. She shared some insights into her work as an educator teaching social research methods and as a pioneer of the Data Fellows programme. Below are some of the supporting resources from the session:

In this session Alfie Dennen, Digital Senior Product Manager in AI Enablement for the Department for Business and Trade (DBT), discussed how Redbox is already helping Civil Servants extract, summarise and synthesise information on specialised subjects, and how we can increase the impact of Redbox further in the future. Below are some of the supporting resources from the session:

In this session, analysts from the Office for National Statistics (ONS) presented four ways in which ONS staff have had a positive impact through their Science, Technology, Engineering and Maths (STEM) ambassador work. Attendees were taken through some of the most successful STEM activities, with the aim of inspiring a new wave of STEM ambassadors. Below are some of the supporting resources from the session:

In this session Sam Mold, an analyst working in the Department for Work & Pensions (DWP), discussed the concepts of demography, demographic challenges, and related Government roles. Below are some of the supporting resources from the session:

  • Attendees were informed that the Demography Centre of Expertise in the Department for Work and Pensions aims to:
  1. Ensure that DWP has access to the best possible demographic data and research to support modelling and policy development
  2. Ensure that DWP uses demographic data in an effective, accurate and efficient way
  3. Develop and lead on key briefing for demographic trends
  • Key products of the team are a ‘Demographics Overview’ slidepack summarising key demographic trends and a ‘Spliced Spreadsheet’ of historic and projected demographic data for use by modellers and forecasters. These and other useful demographic data are stored in a ‘Demography Repository’ on SharePoint for use across DWP

 

Week 3: Commencing 12 May 2025

In this session Dr Karina Williams and analysts from the Office for National Statistics (ONS) discussed three tools that have been developed to assist analysts across the Government Statistical Service (GSS) to assess administrative data quality. Below are some of the supporting resources from the session:

  • The Administrative Data Quality Framework covers what aspects of quality to consider when using administrative data for research and statistical purposes. The framework aims to help you assess the quality of administrative data for use in the production of statistics
  • The Administrative Data Quality Question Bank provides questions for analysts to consider as a quality assessment tool, or to ask data suppliers, to understand quality when conducting analysis or research using administrative data
  • The publication ‘Cataloguing errors in administrative data’ outlines the different types of errors, and methods to consider, when using administrative data for statistical analysis or research. It gives a theoretical overview, providing error classifications and outlining possible ways to deal with errors when using administrative data for statistical purposes

Unfortunately this event was cancelled. Keep an eye on our Eventbrite page for updates on rescheduling.

In this interactive session Dr Francesca Bryden, Head of Data Engineering in the Department for Transport (DfT), discussed all things data engineering. This included how to get the right data to the analysts who need it, how pipelines deliver tangible real-world impacts across DfT, and how DfT are innovating to ensure that analysts can continue to deliver impactful results in a data-centric world. Below are some of the supporting resources from the session:

  • Understand the data engineering role, and how it contributes to both foundational infrastructure and cutting-edge innovation
  • Explore the range of APIs currently produced by the public sector, which wouldn’t be possible without data engineers. Take a look at the API Catalogue
  • See DfT’s future ambition in the area of Digital Twins (cutting-edge digital products which rely on structured, scalable and reliable data flows) with the TRIB Roadmap
  • Learn about geospatial digital twins, including what they are and who uses them
  • For more information on data engineers you can contact data.engineering@dft.gov.uk

In this session Levin Wheller, Evaluation Lead in the Cabinet Office, along with speakers from the Cabinet Office Evaluation Task Force (ETF) discussed Randomised Controlled Trials (RCTs). The session explored key learning from the delivery of two ‘wait-list RCTs’ of interventions which aim to support children and families affected by domestic abuse. Below are some of the supporting resources from the session:

In this session Dr Caroline Wright, an analyst and researcher in the Office for Health Improvement and Disparities (OHID), discussed multidisciplinary work in the Local Knowledge & Intelligence Service (LKIS) and writing a good impact story. The session provided an introduction to measuring impact, why it is important, and some great examples of the impact LKIS has had across the system. Below are some of the supporting resources from the session:

In this session Simon Marlow (Deputy Director and Lead Analyst for the joint DWP and DHSC Work and Health Directorate), with the help of his analysis team, discussed the impact of Contracted Employment Support. The session examined the innovative analysis that, over the past six years, has transformed the evidence base for employment programmes. Below are some of the supporting resources from the session:

In this session Amy Woodget, a Senior Data Scientist for Natural England, discussed the ‘Living England’ habitat mapping project, including how this innovative project is using earth observation imagery, extensive field survey and an AI workflow to map broad habitat types across England. Below are some of the supporting resources from the session:

In this session Aris Xylouris, Head of the Data Policy Analysis Team at the Department for Science, Innovation and Technology (DSIT), discussed using behavioural science to give consumers greater control. Aris detailed the ‘privacy paradox’ in cookie pop-ups, where users prioritise speed and ease by opting for automatic acceptance while at the same time expressing a desire to safeguard their data. When Ministers considered changing the approach to online browser cookies (GDPR cookie pop-ups), analysts from DSIT worked with policy colleagues to produce credible options that they could test in a randomised controlled trial (RCT). Aris discussed how this work produced highly unexpected results which led Ministers to reconsider their course of action. Below are some of the supporting resources from the session:
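The session did not go into the underlying statistics, but the core comparison in an RCT of cookie pop-up designs can be sketched as a simple two-proportion z-test. Everything below is illustrative: the figures are invented and this is not DSIT’s actual analysis.

```python
# Illustrative sketch (invented figures, not DSIT's analysis): compare the
# opt-out rate between a control cookie banner and a redesigned one.
from math import sqrt, erf

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Return the z statistic and two-sided p-value (normal approximation)
    for H0: the two groups have the same underlying proportion."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented example: 120/1000 users opted out under the control banner,
# 180/1000 under the redesigned banner.
z, p = two_proportion_ztest(120, 1000, 180, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

In a real trial the analysis plan, sample sizes and any corrections for multiple comparisons would be specified before the data were seen.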

In this session representatives from the Office for Standards in Education, Children’s Services and Skills (Ofsted) discussed their multiple-award-winning Childcare deserts and oases project. We learned how this project has transformed disparate cross-government and commercial data into novel real-world insights on the everyday experiences of parents seeking nurseries/childminders. Below are some of the supporting resources from the session:

 

Week 4: Commencing 19 May 2025

In this session Alec Waterhouse, operational researcher, data scientist, and Head of Energy, Infrastructure and Markets Analysis in the Department for Energy Security & Net Zero (DESNZ), discussed the Aqua Book. The Aqua Book provides guidance on producing quality analysis, and is the first place to go to understand what high quality analysis looks like. Whilst it is particularly relevant to modelling, the underlying principles are relevant to all analysis.

The Aqua Book is currently being updated, and this session provided essential information about its content and how it will help you to ensure your analysis is of the highest quality. Below are some of the additional resources and learning points from the session:

  • Take a look at the current Aqua Book, and the full rainbow of books on offer:
  1. The Aqua Book: A good practice guide for those working with analysis and analytical models
  2. The Duck Book: Aims to provide a checklist for quality assurance of analytical projects in government
  3. The Green Book: How to appraise policies, programmes and projects, plus guidance on the design and use of monitoring and evaluation before, during and after implementation
  4. The Magenta Book: HM Treasury guidance on what to consider when designing an evaluation
  5. The Teal Book: A comprehensive guide released by the UK government to transform project delivery within government
  6. The Orange Book: Guidance which establishes the concept of risk management
  7. The Rose Book: Guidance on the management of knowledge assets, such as intellectual property, research and development, and data, in government

In this session analysts from the Evaluation Task Force (ETF) presented findings from the biggest review of the coverage and quality of evaluation taking place across Government. Molly Scott and James Collis discussed why evaluation matters, what good evaluation looks like in the complex world of major projects, how well major projects are currently being evaluated and what we can do to improve. Below are some of the additional resources and learning points from the session:

In this session Dr Francesco Arzilli, Artificial Intelligence and Innovation Analysis Lead in the DWP’s Digital Group, discussed a study evaluating the effectiveness and impact of Microsoft’s Large Language Model (LLM) AI, Copilot. This evaluation used a mixed-methods approach, including an impact evaluation to compare outcomes between staff with and without Copilot licences, and qualitative research through in-depth interviews with licensed users. DWP’s study aims to estimate Copilot’s impact on task efficiency, job satisfaction and work quality, and to examine staff experiences and perceptions of its performance and reliability. Below are some of the additional resources and learning points from the session:
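As a purely illustrative sketch of the quantitative side of such an impact evaluation (this is not DWP’s methodology, and the figures are invented), a minimal difference-in-means comparison between licensed and unlicensed staff might look like this:

```python
# Hypothetical sketch: compare a task-efficiency metric between staff with
# and without a Copilot licence. All figures are invented for illustration.
from statistics import mean, stdev
from math import sqrt

def difference_in_means(treated, control):
    """Point estimate and rough standard error for the mean difference."""
    diff = mean(treated) - mean(control)
    se = sqrt(stdev(treated) ** 2 / len(treated)
              + stdev(control) ** 2 / len(control))
    return diff, se

# Invented minutes-per-task figures.
licensed = [32, 28, 35, 30, 27, 33]
unlicensed = [38, 41, 36, 39, 44, 37]
diff, se = difference_in_means(licensed, unlicensed)
print(f"estimated effect: {diff:.1f} minutes (SE {se:.1f})")
```

A real evaluation would need a credible identification strategy (for example, matched comparison groups or random assignment of licences) rather than a raw comparison of means, which is exactly why the study pairs the impact evaluation with qualitative research.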

In this session representatives from Cabinet Office, University College London (UCL) and Durham University discussed how analytical choices affect conclusions of quantitative analyses. This session explored how “many-teams” analyses – where multiple analysts tackle the same research question – can reveal how different analytical choices can shape results. Drawing on insights from the ARGIE (Analysing the Reliability of Government Impact Evaluations) project, they discussed how analytical transparency can improve decision-making and trust in government evidence.

Anouk Rigterink (Durham University) conducted a live exercise to demonstrate making analytical choices and seeing how these impact findings. The session highlighted practical steps for fostering transparency and reproducibility in government analysis, helping analysts produce more reliable and actionable insights. Below are some of the additional resources and learning points from the session:

  1. Encourage participants to reflect on the many analysis decisions involved in a government impact evaluation.

Such analysis decisions include definition of the comparison group, identification strategy, how to deal with missing data, outcome measure, timeframe that the intervention is expected to take effect, variable re-coding, specification (e.g. inclusion of control variables, what standard errors to use, which matching variables to use). Often, there are multiple defensible choices for each analysis decision.

  2. Familiarise participants with the concept of analysis-dependent results.

Analysis-dependent results: many combinations of analysis decisions result in many different analysis paths, each with a different estimation of an intervention’s impact. Typically, we only see one or a small selection of these possible impact estimates.

Links to show that this is a problem in practice:

  3. Introduce participants to different solutions to mitigate analysis-dependence.

These include pre-registration, reproducible code, synthetic data, multiverse analyses, specification curves, crowdsourced data analysis, code peer review, adversarial replication and outcome-blinding.

On pre-registration:

On specification curves:
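To make the idea concrete, here is a minimal invented sketch of a specification curve: every combination of defensible analysis choices (summary statistic, outlier handling, log transformation) is one specification, and each yields a different effect estimate.

```python
# Minimal specification-curve sketch. The data and decision rules are
# invented for illustration; real specifications would include choices such
# as comparison-group definition, control variables and standard errors.
from itertools import product
from statistics import mean, median
from math import log

treated = [5.1, 6.0, 4.8, 5.5, 19.0]   # invented outcomes, one extreme value
control = [4.0, 4.4, 3.9, 4.6, 4.1]

def one_path(summary, drop_outliers, log_scale):
    """Estimate the 'effect' under one combination of analysis choices."""
    t, c = treated[:], control[:]
    if drop_outliers:  # crude illustrative rule: drop values above 2x the group median
        t = [x for x in t if x <= 2 * median(treated)]
        c = [x for x in c if x <= 2 * median(control)]
    if log_scale:
        t, c = [log(x) for x in t], [log(x) for x in c]
    return summary(t) - summary(c)

# Every combination of choices is one specification; sorting the resulting
# estimates gives the specification curve.
specs = list(product([mean, median], [False, True], [False, True]))
curve = sorted(one_path(*s) for s in specs)
print([round(e, 2) for e in curve])
```

Even with only three binary or two-way choices, the eight estimates span a wide range, which is the analysis-dependence problem the session describes.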

  4. Introduce participants to the ARGIE project (Analysing the Reliability of Government Impact Evaluations), and offer them the opportunity to participate.

If you would like to take part in the many-teams analysis, either by writing a script (approx. 0.5 day time commitment) or by answering a survey (approx. 30 minute time commitment), please register your interest here free of any obligation:

If you have suggestions for impact evaluations to include, or would like to speak to us about ARGIE, please email jack.blumenau@cabinetoffice.gov.uk.

In this session James Taylor and Karen Walker from the Defence Science and Technology Laboratory (Dstl) discussed the ‘Discovery project’. The project has automated parts of the horizon scanning process, tackling the conceptually and technically challenging problem of filtering vast daily outputs of academic papers and scientific news.

Using innovative techniques like topic modelling and large language models (LLMs), the team developed a pipeline to automatically sift hundreds of thousands of articles monthly. This transition from an entirely manual process now provides analysts with a richer, more relevant set of documents for triage, enabling high productivity despite reduced resources. Below are some of the additional resources and learning points from the session:
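As a hedged illustration of the sifting idea (this is not Dstl’s actual Discovery pipeline, and a production system would use topic models or LLM embeddings rather than hand-picked keywords), a minimal keyword-based relevance sift might look like:

```python
# Illustrative relevance sift: score article abstracts against invented
# topic keyword sets and keep only documents worth an analyst's time.
from collections import Counter

# Invented topic keyword sets, for illustration only.
TOPICS = {
    "quantum": {"qubit", "quantum", "entanglement"},
    "materials": {"alloy", "composite", "lattice"},
}

def best_topic_score(text):
    """Score a text against each topic; return the best (topic, score)."""
    words = Counter(text.lower().split())
    return max(
        ((topic, sum(words[w] for w in kws)) for topic, kws in TOPICS.items()),
        key=lambda pair: pair[1],
    )

def sift(articles, threshold=1):
    """Keep only articles relevant enough for analyst triage."""
    kept = []
    for title, abstract in articles:
        topic, score = best_topic_score(abstract)
        if score >= threshold:
            kept.append((title, topic, score))
    return kept

# Invented articles: one relevant, one not.
articles = [
    ("Qubit error correction", "We study qubit entanglement in noisy systems."),
    ("Football results", "A review of last season's league standings."),
]
print(sift(articles))
```

The same shape scales up: replace the keyword scorer with topic-model or embedding similarity, and the threshold becomes a tunable trade-off between recall and analyst workload.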

In this session Emily Stirling & Ewa Zabicka from the Advisory, Conciliation and Arbitration Service (Acas) discussed deploying an innovative online community to evaluate a new e-learning package. The session showed how the community works, along with key findings. Emily and Ewa talked through how this community solved some common issues for researchers, such as scheduling participants, quick and accurate transcription, and looping comments back to earlier participants so that everyone can build on what others say.

They also talked through some of the challenges of integrating a novel methodology, including getting stakeholders on board; the pros and cons; and some tips for anyone interested in doing something similar. Below are some of the additional resources and learning points from the session:

In this session, Sian-Elin Wyatt, Explore Subnational Statistics Lead Analyst for the Office for National Statistics (ONS), discussed the new Explore Local Statistics (ELS) service. This service enables people to find, visualise, compare, and download subnational data, accessibly presented to both the public and local policymakers. Visualisations show how areas compare with other local authorities across topics including health, education, and the economy. Below are some of the additional resources and learning points from the session:

In this session Michael Dale (Principal Research Officer) and Sophie Rowson (Research Officer) from the Department for Education (DfE) discussed ‘impact’ in analysis and research. In the session they led an interactive discussion exploring what ‘consequential analysis and research’ means and looks like in practice across different analytical professions and government departments. Below are some of the additional resources and learning points from the session:

In this session Andrew Etherington, Regional Senior Analyst in the Office for National Statistics (ONS), gave an insight into ONS Local, an analytical advisory service for local leaders with dedicated analysts based across the UK.

This session covered how this service is helping to improve access to subnational data, statistics, and analysis across the UK and in the devolved administrations to inform evidence-based decisions locally. Below are some of the additional resources and learning points from the session:

 

Week 5: Commencing 26 May 2025

Oops! Looks like you are a little early. We haven’t run these events yet, so keep an eye on our Eventbrite page.
