
Data Engineer

South Pole • Remote

Job Description

South Pole is an energetic, global company offering comprehensive sustainability solutions and services. With offices on every continent, we strive to create a sustainable society and economy that positively impacts our climate, ecosystems and developing communities. Through our solutions, we inspire and enable our customers to create value from sustainability-related activities.

Job summary:

We're looking for a Data Engineer to own end-to-end data pipelines — from ingesting data from external sources to building clean, reliable data marts for analysts and business teams. You'll work across our ingestion platform (Airflow) and transformation layer (dbt), ensuring data flows reliably from source systems into BigQuery where it powers dashboards, reports, and business decisions.

This is a hands-on role. You'll build and maintain DAGs, write dbt models, implement data quality checks, and collaborate closely with analysts and stakeholders across the business. You'll also have exposure to Elementary for dbt observability and Dagster as we evolve our orchestration layer.
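
To give a concrete feel for the day-to-day work, here is a minimal sketch of the kind of ingestion DAG you would own, using Airflow's TaskFlow API and the BigQuery Python client. The endpoint, schedule, and dataset/table names are hypothetical placeholders, not our actual systems:

```python
# Minimal sketch only: the API endpoint and table names are hypothetical.
from datetime import datetime

import requests
from airflow.decorators import dag, task
from google.cloud import bigquery


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def ingest_orders():
    @task
    def extract() -> list[dict]:
        # Pull records from a (hypothetical) source API.
        resp = requests.get("https://api.example.com/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load(rows: list[dict]) -> None:
        # Append raw records to a BigQuery staging table; dbt models
        # downstream transform this into tested, documented data marts.
        client = bigquery.Client()
        errors = client.insert_rows_json("raw_ingest.orders", rows)
        if errors:
            raise RuntimeError(f"BigQuery insert failed: {errors}")

    load(extract())


ingest_orders()
```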

Main tasks & responsibilities:

Build and maintain Airflow DAGs to ingest data from APIs, databases, SFTP, and SaaS platforms

Develop dbt models to transform raw data into structured, tested data marts

Implement data quality checks using Soda Core and dbt tests (see the Soda Core sketch after this list)

Optimize BigQuery performance and manage data warehouse costs (see the dry-run sketch after this list)

Support CI/CD pipelines and infrastructure using Terraform and GitHub Actions

Collaborate with analysts to understand data needs and deliver reliable datasets
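
On the data quality side, checks are typically declared in SodaCL and run as part of the pipeline. Here is a minimal sketch using Soda Core's programmatic scan API; the data source name, table, and thresholds are hypothetical:

```python
# Sketch of a programmatic Soda Core scan; the data source name,
# table, and check thresholds are hypothetical.
from soda.scan import Scan

checks = """
checks for raw_ingest.orders:
  - row_count > 0
  - missing_count(order_id) = 0
"""

scan = Scan()
scan.set_data_source_name("bigquery_prod")     # connection configured elsewhere
scan.add_configuration_yaml_file("configuration.yml")
scan.add_sodacl_yaml_str(checks)
scan.execute()
scan.assert_no_checks_fail()                   # raise if any check fails
```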
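
On the cost side, one standard BigQuery habit is to dry-run a query before shipping it: the dry run is free and reports how many bytes the query would scan. A minimal sketch against a hypothetical table:

```python
# Estimate what a query would cost before running it; a dry run
# processes no data and is free. The table name is hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT country, SUM(revenue) FROM `prod.marts.orders` GROUP BY country",
    job_config=config,
)
print(f"Query would scan {job.total_bytes_processed / 1e9:.2f} GB")
```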

Requirements:

Essential -

2-5 years of experience in data engineering or a related field

Strong SQL skills - you think in CTEs and know when to optimize

Experience with Python for data pipelines (not just scripting)

Hands-on experience with a modern data warehouse (BigQuery preferred, Snowflake/Redshift acceptable)

Experience with dbt or similar transformation frameworks

Architectural understanding of workflow orchestration tools (Airflow, Dagster, Prefect, or similar)

Strong grasp of data modeling concepts (star schema, slowly changing dimensions)

Hands-on experience with Git, CI/CD pipelines, and collaborative development workflows

Desirable -

Experience with Google Cloud Platform (BigQuery, Cloud Functions, Cloud Run, GCS)

Exposure to Terraform or infrastructure-as-code

Familiarity with data quality/observability tools (Soda Core, Elementary)

Experience integrating diverse data sources (APIs, databases, SFTP, SaaS platforms)

Background in carbon markets, sustainability, climate tech, or trading/financial systems

What we offer:

At South Pole, we care about our employees as much as we care about the planet. South Pole is not just an employer; we are a team. We do not just offer you a job; we offer you a career. By joining our team, you will find strong purpose and deep meaning in everything you do. You will have the chance to make a real difference for our clients and for the planet, working alongside a passionate team of like-minded colleagues, while building your knowledge and skills and developing your career in a fun, dynamic, international and fast-growing organisation.

We’re a planet of 7.5 billion unique people. We all have a contribution to make, and South Pole is proud to be an Equal Opportunity Employer. We do not discriminate on the basis of race, religion, colour, sex, gender identity, sexual orientation, age, national origin, marital status or disability. Our recruitment decisions are based on qualifications, merit and business need.
