Data Engineer

Data Idols • 🌐 In Person

Posted 1 week ago

Job Description

Data Engineer

Contract: £450 - £550 per day | Location: London, 2 days per week

We are currently looking for a Data Engineer to join our fast-paced, data-driven tech team within a global digital media environment. You'll play a crucial role in shaping how the business collects, models, and activates data across multiple commercial and editorial functions. In this hands-on role, you will architect and build scalable data pipelines, optimise infrastructure, and deliver high-value insight tools that empower decision makers. The Data Engineer will work closely with commercial, product, and analytics teams, making this position vital to the organisation's long-term data strategy.

The Opportunity

As a Data Engineer, you'll combine engineering, analytics, and product thinking to create a reliable, high-performing data ecosystem. You'll have the opportunity to work with modern cloud technologies, large-scale datasets, and a business that truly values data as a strategic asset.

Key responsibilities include:

* Building, operating, and optimising end-to-end ETL/ELT data pipelines using APIs, SFTP, and containerised orchestration tools.

* Developing scalable and well-structured data models that support commercial, programmatic, and affiliate revenue functions.

* Managing and improving complex data infrastructure that processes high-volume, multi-source Big Data.

* Creating, maintaining, and enhancing interactive dashboards that drive KPI-focused decision-making.

* Owning data quality, ensuring accuracy, consistency, and reliability across all core datasets.

* Analysing campaign, monetisation, and platform performance and providing actionable insights.

* Collaborating with Operations, Sales, Marketing, Finance, and Senior Analytics teams.

* Supporting strategic projects with advanced data modelling and insight generation.

This role stands out because it sits at the intersection of engineering and commercial impact: your work directly supports revenue optimisation, global reporting, and critical business decisions.

Skills and Experience

* Strong Python and/or PySpark skills.

* Experience with cloud technologies such as GCP (BigQuery, Compute Engine, Kubernetes) and AWS (Redshift, EC2).

* Experience building ETL/ELT pipelines and working with APIs or SFTP integrations.

* Understanding of data modelling, warehousing, and Big Data environments.

* Strong analytical and creative problem-solving skills.

* Ability to manage projects and collaborate effectively in a team.

* Experience creating utility packages in Python.

If you would like to be considered for the role and feel you would be an ideal fit with our team then please send your CV to us by clicking on the Apply button below.
