The hidden heroes of data innovation: why data engineers deserve more love

Jason Stolborg-Price

Let’s be honest: in the fast-paced world of data, analysts and data scientists get most of the limelight. They bring innovative data approaches to policy decisions, present dynamic and interactive data tools, and unveil the insights that make headlines. But in the slightly paraphrased words of the Eurythmics: behind every great analyst, there has to be a great data engineer.

Data engineers are the unsung heroes of the data revolution. As an analyst-turned-engineer and former under-appreciator of data engineering myself, I found it all too easy to focus on the cool outputs I was producing without ever wondering how they worked behind the scenes. But data engineers are the solid foundation that underpins tech like AI, APIs and real-time visualisation, and without that solid foundation, data usage within government would start showing the cracks very quickly.

What exactly is a data engineer?

If you’ve not encountered a data engineer before, you’re not alone! Data engineering is a relatively new profession, a product of the cloud computing revolution where organisations started to appreciate the power and value of big data for the first time. Essentially, we work behind the scenes to build the infrastructure that lets other people work with data.

This includes building data pipelines: bringing your data into the organisation in a safe, secure and efficient way, even if that data runs to many terabytes in size. We also manage modern, large-scale data storage solutions, using tools like data warehouses and lakes to allow storage and processing of data at pace. And we maintain the platforms where analysts run their code, literally maintaining the foundations of reproducible analysis in R or Python, and help them to share that data through APIs.

Beyond that, we maintain the quality and value of our data, carrying out checks and tests, storing metadata, and quietly spotting issues in systems before they ever reach the end user.
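To make that concrete, here is a minimal sketch of the kind of automated quality check a data engineer might run before a batch of data ever reaches an analyst. The record structure, field names and the 5% missing-value threshold are purely illustrative assumptions, not a real departmental schema.

```python
def check_records(records, required_fields, max_null_rate=0.05):
    """Return a list of quality issues found in a batch of records.

    Illustrative only: real pipelines would also check types, ranges,
    freshness and duplicates, and log results against stored metadata.
    """
    issues = []
    if not records:
        issues.append("batch is empty")
        return issues
    for field in required_fields:
        # Count records where this required field is absent or null.
        missing = sum(1 for r in records if r.get(field) is None)
        rate = missing / len(records)
        if rate > max_null_rate:
            issues.append(f"{field}: {rate:.0%} of values missing")
    return issues


batch = [
    {"sensor_id": "A1", "count": 120},
    {"sensor_id": "A2", "count": None},
    {"sensor_id": None, "count": 95},
]
print(check_records(batch, ["sensor_id", "count"]))
```

Checks like this run quietly on every batch, which is how issues get spotted in systems before they ever reach the end user.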

We don’t just “clean data.” We engineer trust into the whole data process, beginning to end.

It’s not magic. It’s engineering.

When data tools and systems run flawlessly, it’s very easy to gloss over what’s happening behind the scenes. But imagine a tool which provides a filterable, real-time view of footfall at large public events. At the front end, this lets operational teams manage crowds, keeping people moving safely and without unnecessary delays. However, behind the scenes a data engineer has achieved incredible things. They will likely have brought in data from multiple APIs, ensuring that data is streaming live and with low latency. Data will have been cleaned, tidied, modelled and stored in easy-to-access data warehousing. Checks will have been performed to ensure data is present, correct, of the expected quality, and available where and when it’s needed. And the whole product must be resilient, scalable and fast; there’s no room for error with such critical decision making.
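The stages described above can be sketched in miniature: ingest from a feed, clean and model the records, then validate before storage. This is a toy illustration under assumed field names (`loc`, `n`); a real footfall system would use streaming tools and a proper warehouse rather than in-memory lists.

```python
def ingest(raw_feed):
    """Simulate pulling records from an API feed, dropping dead entries."""
    return [r for r in raw_feed if r is not None]


def clean(records):
    """Standardise field names and drop records with no count recorded."""
    return [
        {"location": r["loc"].strip().title(), "footfall": int(r["n"])}
        for r in records
        if r.get("n") is not None
    ]


def validate(records):
    """Basic sanity checks before records reach the warehouse."""
    assert all(r["footfall"] >= 0 for r in records), "negative footfall"
    return records


raw = [{"loc": " main gate ", "n": "240"}, None, {"loc": "hall b", "n": None}]
print(validate(clean(ingest(raw))))
# → [{'location': 'Main Gate', 'footfall': 240}]
```

Each stage here is a few lines; at production scale, making each one resilient, observable and fast is where the engineering effort actually goes.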

The skills required for these tasks are wide-ranging and interconnected. You can expect your local data engineer to be a coding wizard, a cloud legend and a networking fiend, on top of their comprehensive knowledge of data storage, management, standards and versioning.

Big data == Big challenges

The need for data engineering is only growing as the pace of change in the data sphere accelerates, and the expectations placed on data products are higher than ever. Data volumes, delivery speeds, and the pace of producing outputs all continue to grow, and the resulting scale and complexity demand specialist knowledge and tools.

Especially in the transport sector, innovations like Digital Twins could fundamentally change the transport experience for the end user. However, they rely on processing massive amounts of real-time data to model and visualise complex systems. That kind of scale isn’t just difficult to manage; it’s impossible without engineering expertise.

As data systems become more ambitious, collaboration also becomes more critical. Analysts bring deep knowledge of the data’s context, quirks, and audiences. Engineers bring the skills to tame complexity at scale. Neither side can deliver real innovation alone, but together, we can build powerful tools that are both technically robust and analytically meaningful.

Toss a coin to your data engineer

If you’ve come away from this with a new-found appreciation of data engineers, take a second to return the favour:

  • Loop us in early on your data projects
  • Approach your pipelines with an open mind; we might have the perfect tool for the job that you’ve never considered
  • Understand our constraints, and tell us yours

And the next time you’re analysing a beautifully clean dataset or presenting insights to stakeholders, take a moment to appreciate the silent pipeline of data behind the scenes that made it all possible.

Because data engineers aren’t just part of the data innovation revolution: we power it.
Francesca presented an engaging session at Analysis in Government (AiG) Month 2025 detailing how data engineers are powering the data innovation revolution. You can read the learning outcomes from this session by visiting our Learning outcomes from Analysis in Government (AiG) Month 2025 live events page.
Francesca Bryden
Francesca Bryden is the Head of Data Engineering at the Department for Transport (DfT). Her team are currently involved in a wide range of projects to enhance data across the department, including building data APIs, enhancing metadata storage and searchability, and digital transformation of data storage and ingest.