Analytics Engineer (GCP / dbt)
Sector:
Telecoms
Location:
London or Reading (2 days in office)
Pay Rate:
£467.50 - £552.50 (Inside IR35)
Tech Stack:
Google Cloud Platform (BigQuery), dbt (Cloud/Core), SQL
No sponsorship available.
The Role: The Bridge Between Data & Decisions
In the world of Telecoms, the data is vast, but the insights need to be surgical. We are looking for an
Analytics Engineer
who lives exactly where Data Engineering meets Data Analysis.
You won't just be moving data; you'll be the architect of the "Single Source of Truth." Your mission is to take raw, complex telecoms data and transform it into clean, scalable, high-quality models that power our Finance, Planning, and Commercial teams. If you're tired of "black box" engineering and want to see the direct business impact of your code, this is the seat for you.
What You'll Be Doing
Architecting the Truth:
Design and implement scalable business-layer models, marts, and OBTs ("One Big Table" models). You'll decide whether a dimensional model or a flat OBT is the right tool for the specific business problem.
Owning the Lifecycle:
You take business requirements and turn them into technical reality. You'll develop, test, document, and deploy via CI/CD, treating analytics code with the same rigour as software engineering.
Enforcing Excellence:
You'll be the gatekeeper for SQL and dbt standards. You'll lead code reviews that focus on logic, consistency, and future maintainability.
Security & Integrity:
Data quality isn't an afterthought; a data quality failure is treated as a critical defect. You'll implement generic and custom tests while ensuring PII is handled safely (masking, hashing, and access control).
Deep-Dive Problem Solving:
When a pipeline breaks or a number looks "off," you'll trace the lineage from source to BI layer, debugging failures independently until they're resolved.
Performance \& Cost Ops:
You'll hunt for refactoring opportunities in older models and use BigQuery tools to optimize query performance and cloud costs.
Stakeholder Storytelling:
You'll translate the technical "why" into plain business language. You make sure downstream teams know exactly how a model change will affect their dashboards before you hit deploy.
Your Profile
The Must-Haves:
Expert SQL:
You write complex, performant queries and understand the "under the hood" mechanics of cloud data warehouses (specifically
GCP BigQuery
).
dbt Specialist:
Strong hands-on experience with
dbt Cloud or Core
is essential.
Modelling Pro:
You understand dimensional modelling and how to structure data for varied business use cases.
CI/CD Native:
You're comfortable with branching, merging, rebasing, and resolving conflicts in a professional development environment.
Analytical Detective:
You have the "lineage mindset": the ability to trace data flows backward to find and fix the root cause of an issue.
The Nice-to-Haves:
Experience in
Agile
environments and a deep understanding of the data development lifecycle.
Familiarity with BI tools (like
Tableau
) so you can see the data through the eyes of the end-user.
A background in
Performance Tuning
and cost optimization within GCP.
A "Community" mindset, whether that's mentoring others or contributing to internal forums and meetups.