
Senior Data Engineer

Kargo • 🌐 Remote


Job Description

Who We Are

Kargo creates powerful moments of connection between brands and consumers to build businesses. Every day, our 600+ employees work to radically raise the bar on what agentic AI, CTV, eCommerce, social, and mobile can do to deliver unique ad experiences across the world's most premium platforms. Taking a creative science approach to all we do, we continuously innovate solutions that outperform industry benchmarks and client expectations. Now 20+ years strong, Kargo has offices in NYC, Chicago, LA, Dallas, Sydney, Auckland, London and Waterford, Ireland.

Who We Hire

Techies who want to build the future. Creatives who want to design it better.

Communicators to win business. Collaborators to build it. Data pros who turn numbers into insights. Product builders who turn ideas into innovations. Anyone eager to be on a team that doesn't stop to ask what's next, because they're already building it.

Our Laurels

AdAge Best Places to Work

ThinkLA Partner of the Year

Built In Best Places to Work

Cynopsis 2025 Top Women in Media - Jeannine Shao Collins

Martech Breakthrough Awards - Best Overall Adtech Company

Digiday Media Awards Best Event

Cynopsis Media Impact Awards - Best CTV Platform

Martech Breakthrough Awards - CTV Innovation

Adweek Media Plan of the Year Awards - Best Use of Insights

Title:

Senior Data Engineer

Job Type:

Permanent, Remote

Job Location:

Dublin, Ireland

The Opportunity

At Kargo, we are rapidly evolving our data infrastructure and capabilities to address challenges of data scale, new methodologies for onboarding and targeting, and rigorous privacy standards. We're looking for an experienced Senior Data Engineer to join our team, focusing on hands-on implementation, creative problem-solving, and exploring new technical approaches. You'll work collaboratively with our technical leads and peers, actively enhancing and scaling the data processes that drive powerful targeting systems.

The Daily To-Do

Independently implement, optimize, and maintain robust ETL/ELT pipelines using Python, Airflow, Spark, Iceberg, Snowflake, Aerospike, Docker, Kubernetes (EKS), AWS, and real-time streaming technologies like Kafka and Flink (a brief illustrative Airflow sketch follows this list).

Engage proactively in collaborative design and brainstorming sessions, contributing technical insights and innovative ideas for solving complex data engineering challenges.

Support the definition and implementation of robust testing strategies, and guide the team in adopting disciplined CI/CD practices using ArgoCD to enable efficient and reliable deployments.

Monitor and optimize data systems and infrastructure to ensure operational reliability, performance efficiency, and cost-effectiveness.

Actively contribute to onboarding new datasets, enhancing targeting capabilities, and exploring modern privacy-compliant methodologies.

Maintain thorough documentation of technical implementations, operational procedures, and best practices for effective knowledge sharing and onboarding.
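
For illustration only (not part of the posting), here is a minimal sketch of the kind of ETL/ELT workflow described above, using Airflow's TaskFlow API (Airflow 2.4+ for the schedule argument). The DAG name, schedule, and transformation logic are hypothetical placeholders.

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def example_events_etl():
    @task
    def extract():
        # Placeholder for pulling raw records from a source system.
        return [{"event": "click", "count": 3}, {"event": "view", "count": 7}]

    @task
    def transform(records):
        # Placeholder transformation; a real pipeline would reshape records for the warehouse.
        return [{**r, "count": r["count"] * 2} for r in records]

    @task
    def load(records):
        # Placeholder load step; a real pipeline would write to Snowflake or an Iceberg table.
        print(f"Would load {len(records)} records")

    load(transform(extract()))


example_events_etl()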

Qualifications:

Strong expertise in implementing, maintaining, and optimizing large-scale data systems with minimal oversight.

Deep proficiency in Python, Spark, and Iceberg, with a clear understanding of data structuring for efficiency and performance (a brief illustrative Spark/Iceberg sketch follows this list).

Experience with Airflow for building robust data workflows is strongly preferred.

Familiarity with analytical warehouses such as Snowflake or ClickHouse, including writing and optimizing SQL queries and understanding Snowflake's performance and cost dynamics.

Comfort with Agile methodologies, including regular use of Jira and Confluence for task management and documentation.

Proven ability to independently drive implementation and problem-solving, turning ambiguity into clearly defined actions.

Excellent communication skills to effectively engage in discussions with technical teams and stakeholders.

Familiarity with identity, privacy, and targeting methodologies in AdTech is required.

Nice to have: Extensive DevOps experience, particularly with AWS (including EKS), Docker, Kubernetes, CI/CD automation using ArgoCD, and monitoring via Prometheus.
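
Purely as an illustrative sketch of the Spark/Iceberg data-structuring mentioned above (not part of the posting), assuming a Spark 3.x session with an Iceberg catalog registered under a hypothetical name "demo", and hypothetical table and column names:

from pyspark.sql import SparkSession, functions as F

# Assumes spark.sql.catalog.demo is configured as an Iceberg catalog (Spark 3.x);
# the catalog, table, and columns below are hypothetical.
spark = SparkSession.builder.appName("iceberg-partitioning-sketch").getOrCreate()

events = spark.createDataFrame(
    [("2024-01-01", "click", 3), ("2024-01-02", "view", 7)],
    ["event_date", "event_type", "cnt"],
)

# Partitioning the table by event_date lets date-filtered queries prune untouched
# partitions, which is the main lever for keeping scan time and cost down.
(
    events.writeTo("demo.analytics.events")
    .partitionedBy(F.col("event_date"))
    .createOrReplace()
)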

Follow Our Lead

Big Picture: kargo.com

The Latest: Instagram (@kargomobile) and LinkedIn (Kargo)
