Principal Data Engineer – Core Data Platform
You'll join a fast-growing, Series-B product company that builds a developer-first platform for managing and securing software artifacts: a single place where teams store, scan and deliver packages, dependencies and AI models with enterprise-grade security and policies. The platform supports dozens of formats, scans for vulnerabilities, enforces policy, and runs at web scale for engineering teams worldwide.
You'll own the data platform that powers customer insights, policy decisions and the AI features that sit on top of all that telemetry. It's heavy, meaningful work: event-driven ingestion, scalable analytics back-ends, and pipelines that turn huge volumes of build and usage data into actionable product features. You'll help shape a data stack that directly affects how thousands of engineering teams ship safely.
Cloud-native AWS stack (Python, S3, Kinesis/Flink, Lambda, DBT, ClickHouse, Postgres).
Design and run scalable, secure data pipelines (streaming + batch), analytics stores and the infra that powers them.
Work closely with product, infra and engineering teams. You'll ship, operate and own end-to-end.
Focus on observability, cost/latency trade-offs and delivering features customers actually use.
Who we're looking for
Deep data engineering experience on AWS, production pipelines, streaming and OLAP/columnar stores.
Strong Python + SQL; experience with DBT and streaming tools (Kinesis/Kafka/Flink or equivalents).
Comfortable with IaC, CI/CD and platform tooling (Terraform/CDK, Datadog/CloudWatch, testing).
Product mindset and low ego. You prioritise customer impact and can explain trade-offs clearly.
Competitive pay, equity, health benefits, and a development budget. If you care about product-market fit and want to help shape this platform, apply here.