Data Engineer – Cloud Platforms (Engineering Data)
We are building a modern data and AI foundation that supports millions of customers every day. As a Data Engineer – Cloud Platforms, you will design and build scalable, cloud-based data products with real impact. You’ll work hands-on with modern technology in a team that values ownership, curiosity, and people over buzzwords.
What you will do
Building data & AI that truly serves people
We are on a mission to become a highly customer-driven organization.
That ambition is powered by data — but only when data and technology genuinely serve people.
We have ambitious plans to further shape our data & AI strategy. As a Data Engineer – Cloud Platforms, you will play a hands-on role in strengthening engineering capabilities across data platforms and AI-driven products.
You will join a multidisciplinary Engineering Data team, consisting of engineers from diverse backgrounds and nationalities. The team combines expertise in data engineering, software engineering, and data infrastructure, and works closely together to build products that are actually used.
This is a hands-on engineering role requiring 3–4 years of core experience (as a data engineer, software engineer, or data infrastructure specialist). You will own data products end-to-end: designing, building, and maintaining scalable data and application solutions that power analytics, automation, and machine-learning use cases.
You will work across the full data lifecycle — from ingestion and transformation to deployment and monitoring — always with a focus on reliability, performance, security, and cost efficiency in cloud environments. You collaborate closely with cloud engineers, data scientists, and business stakeholders to turn data into actionable outcomes.
Who you are
You are genuinely pleasant and fun to work with.
You are curious, adaptable, and eager to learn new tools and approaches.
You take ownership of code and platforms, including design, scalability, extensibility, security, and observability.
You collaborate well across disciplines and communicate technical concepts clearly to non-technical audiences.
Key responsibilities
Own data pipelines and backend applications: design, build, scale, and secure them using Python and SQL.
Manage data infrastructure (storage, compute, orchestration) using Infrastructure as Code (e.g. Terraform, CloudFormation).
Apply best practices for CI/CD, testing, observability, and data governance.
Work closely with stakeholders to translate requirements into reliable, production-ready solutions aligned with business goals.
What we believe
We believe in people over processes and take a human-centric approach to technology.
Every application is reviewed personally — no keyword filtering or automated screening.
We ask for a short, honest motivation explaining why this role fits your experience and ambition. Keep it real and practical — written by an engineer, for engineers.
What we offer
Competitive salary with variable bonus
Hybrid working model
Strong pension scheme
Generous vacation policy (30 days full-time + additional holidays)
Flexible holiday redemption options
Subscription benefits
Clear growth and development opportunities
Annual personal learning budget and access to 200+ digital courses
Workshops, learning weeks, team outings, and social events
You bring this
Some qualities can’t be taught — you bring them naturally:
Curious, structured, people-focused, and honest
Calm under complexity
Enjoy improving how things work, not just making them work
These traits are valued across all roles and levels.
Must-have skills
3–4 years' experience as a Data Engineer, Software Engineer, or Data Infrastructure Specialist in a complex environment
Strong Python for backend/application development
Solid SQL for data transformation and querying
Cloud experience (AWS preferred; Azure or GCP experience welcome if you are open to switching)
Experience with Infrastructure as Code (Terraform, CDK, or similar)
FinOps-aware cloud design mindset
Experience with CI/CD, containerization, and orchestration (e.g. GitHub Actions, GitLab CI, Jenkins, Docker, ECS, Kubernetes)
Strong communication skills with technical and non-technical stakeholders
Nice-to-have
Experience with data orchestration tools (Airflow, Step Functions)
Knowledge of data quality and observability tooling
Familiarity with ML workflows (MLflow, feature stores)
API development experience (FastAPI, Flask)