My career in analysis: Joanna Lee
This blog is part of our Analysis in Government (AiG) Month 2022 series called ‘My career in analysis’. Throughout the month we will be sharing blogs from colleagues about their career journeys. Each contributor will also share their best pieces of advice for aspiring analysts.
As the world, and government, move toward being more data- and evidence-driven, we as analysts increasingly face an important question: when is imperfect evidence better than none, and when is it not?
Before joining the Civil Service, I completed a PhD in computational biochemistry. This training set my expectations of hypothesis-driven analysis and well-defined problems to explore. These principles are still the basis of all my analytical work. However, I have developed nuance and flexibility in my approach since joining the Civil Service.
I began my Civil Service career as an Executive Officer (EO) data scientist, joining in the Civil Service's first ever external recruitment for data scientists. It was a new, exciting role with lots of freedom in choosing which research methods to apply. Since then, I have learnt what a data scientist can do for government, taking on data science roles across several Whitehall departments. I have worked in operational departments, in foreign affairs, and in regulation. I now work in 10DS, a central government data science team, working closely with policy and delivery teams.
When I started, I was continually assessing my work against a perfect, academic standard. Whilst that standard is important to keep in mind, I have found it is nearly impossible to achieve. What stayed with me were the missed opportunities to use imperfect data to improve, even if marginally, our understanding of how policies and public services affect people's lives. Most of the administrative data we work with has greater limitations than data collected through academic studies: missing cases and methodological changes are frequent, because the information is gathered in real operational and service settings like schools, hospitals, and prisons, rather than in a more controlled academic environment.
Around a year ago, one of my colleagues commented that analysts often damage their own effectiveness by letting perfection be the enemy of the good. This thought rang true, and has stayed with me. I'm especially aware of it when I'm laying out evidence in a compelling format that allows my policy colleagues to make informed decisions or challenge their pre-existing biases.
Of course, decision making cannot come to a halt in the absence of data. Historically, expert policy makers have made decisions based on experience and instinct, and that still happens in a wide range of cases. But we miss an opportunity to improve outcomes if we do not offer evidence to support decision making wherever we can.
Showing an incomplete picture and being clear about its limitations can be incredibly powerful; it often makes the evidence we do have more compelling. Most of our tools and models are built on imperfect data and a number of assumptions. We should not hide behind this uncertainty, but instead accept what these tools and models are: clever approximations of our world that allow us to explore how we can improve our reality. Let's face it: nothing in life is perfect. Often what we can achieve is something meaningfully better, or at least less biased, than individual intuition.
As soon as I applied this mentality to my work, I found it freeing. It became easier to discuss the limitations of my models and to champion the parts I knew were based on good evidence. It also gave me the space to think about how my stakeholders will use the evidence, so I can tailor how I present it to make their decision-making process easier.
We, as analysts, can be the people who spot an opportunity. We might uncover an important piece of information that stands in the way of the outcome we expect, and our policy colleagues may be able to use it to innovate, explore new opportunities, and improve public service delivery. Ultimately, that's why I am here: I want to improve services for the public by providing the best evidence I can on what works. By using this evidence, we can help to improve outcomes for the public.