👨🏻‍💻 postech.work

Senior Data Engineer

Crate and Barrel • 🌐 Remote

Posted 2 days, 21 hours ago

Job Description

We are seeking a Senior Data Engineer to serve as a technical lead within our Google Cloud-native data ecosystem. In this role, you will be responsible for the end-to-end engineering of scalable, high-performance data pipelines that power our national retail data platform using a Medallion architecture (Bronze through Platinum). Our Engineering team serves as the organization's core technical provider. You will collaborate closely with separate Governance, Data Science, Data Modeling, and Data Insight teams to turn their requirements and models into robust, production-grade automated pipelines. Your focus is on the "how"—building the high-integrity plumbing that enables these specialized teams to perform their functions while managing complex interdependencies with other technical teams through enterprise orchestration.

This position is fully remote.

This role is an Individual Contributor position.

A day in the life as a Senior Data Engineer...

Lead the technical construction of complex data pipelines. You will be responsible for the engineering implementation of SCD (Slowly Changing Dimension) methodologies and data movement across retail domains as defined by the Data Modeling team
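For candidates less familiar with the term, the SCD methodologies mentioned above typically include SCD Type 2, where history is preserved by expiring the current row and inserting a new version. The following is a minimal, language-agnostic sketch of that pattern in pure Python; all field and function names are illustrative, not part of this role's actual codebase.

```python
from datetime import date

# Minimal SCD Type 2 sketch: when a tracked attribute changes, expire the
# current row (set is_current=False, stamp valid_to) and append a new version.
def apply_scd2(dimension, key, incoming, today, tracked=("address",)):
    """dimension: list of dicts carrying 'is_current', 'valid_from', 'valid_to'."""
    current = next(
        (r for r in dimension if r[key] == incoming[key] and r["is_current"]),
        None,
    )
    if current and all(current[c] == incoming[c] for c in tracked):
        return dimension  # tracked attributes unchanged: nothing to do
    if current:
        current["is_current"] = False  # expire the old version
        current["valid_to"] = today
    dimension.append({
        **incoming,
        "is_current": True,
        "valid_from": today,
        "valid_to": None,
    })
    return dimension

rows = [{"customer_id": 1, "address": "old", "is_current": True,
         "valid_from": date(2023, 1, 1), "valid_to": None}]
apply_scd2(rows, "customer_id",
           {"customer_id": 1, "address": "new"}, date(2024, 6, 1))
```

In BigQuery this same logic would normally be expressed as a `MERGE` statement rather than row-by-row Python, but the versioning semantics are identical.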

Manage and maintain enterprise-level orchestration outside of the Google Cloud Platform (e.g., Tidal Automation). You will ensure that data pipelines are synchronized with broader organizational workflows, managing critical interdependencies with other technical teams and legacy systems

Act as the primary engineering point of contact for external teams. You will build and optimize the pipelines required by Data Science for model training/deployment and by Data Insight for reporting, ensuring their technical requirements are met with high availability

Build and maintain high-velocity streaming and near-real-time pathways using Google Dataflow and Confluent Kafka to support real-time operational needs

Apply mastery of Dataform (SQLX) to build orchestrated, environment-agnostic transformations. You will ensure all models leverage JavaScript configuration and ref() functions to maintain strict code portability across our 16-project grid
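As a rough illustration of the portability requirement above, a Dataform SQLX model keeps environment-specific details out of the SQL: the config block declares the model, and `${ref()}` resolves upstream tables to the correct project and dataset at compile time. Table and tag names below are hypothetical.

```sqlx
config {
  type: "table",
  schema: "silver",            -- logical layer; physical project resolved per environment
  tags: ["retail", "daily"]
}

SELECT
  order_id,
  customer_id,
  order_total
-- ${ref()} keeps the model portable: no hard-coded project or dataset names
FROM ${ref("bronze_orders")}
```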

Partner with the Governance team to implement "Privacy by Design" technical controls, including CCPA/GDPR compliance workflows, encrypted hashing, and integration with centralized Google Dataplex registries
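One common technique behind the "encrypted hashing" control mentioned above is keyed pseudonymization, so raw PII never lands in the warehouse while joins on the token remain possible. A minimal sketch using Python's standard library, assuming a key held in a secret manager rather than in code:

```python
import hashlib
import hmac

# Hypothetical key for illustration only; in production this would come
# from a secret manager, never be hard-coded.
SECRET_KEY = b"example-key"

def pseudonymize(value: str) -> str:
    """Return a stable HMAC-SHA256 token for a normalized PII value."""
    return hmac.new(SECRET_KEY, value.lower().encode(), hashlib.sha256).hexdigest()

token = pseudonymize("Customer@Example.com")
```

Normalizing (lowercasing) before hashing keeps tokens stable across source systems that capture the same value with different casing.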

Enforce mandatory resource labeling (env, domain, layer) across all BigQuery jobs and GCS buckets to ensure transparent cost attribution for the various teams we support
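Label enforcement like the above is often implemented as a small guard in pipeline code that rejects jobs missing the mandatory keys. A hypothetical sketch (the helper name is illustrative, not an existing API):

```python
# Mandatory label keys named in the posting: env, domain, layer.
REQUIRED_LABEL_KEYS = {"env", "domain", "layer"}

def validate_labels(labels: dict) -> dict:
    """Return the labels dict if all mandatory keys are present, else raise."""
    missing = REQUIRED_LABEL_KEYS - labels.keys()
    if missing:
        raise ValueError(f"missing mandatory labels: {sorted(missing)}")
    return labels

labels = validate_labels({"env": "prod", "domain": "loyalty", "layer": "silver"})
```

With the `google-cloud-bigquery` client, a dict like this can then be attached to a job via `QueryJobConfig(labels=...)`, which is what makes per-team cost attribution queryable in billing exports.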

Lead the team’s Git-flow process via Bitbucket. Manage code through feature branches and UAT release branches for deployment into our 16-project production environment

Perform high-quality code reviews and mentor junior engineers, ensuring the team’s output meets the rigorous engineering standards required to support our downstream partners

What you'll bring to the table...

5+ years of dedicated Data Engineering experience, with at least 2 years in a lead or senior capacity

Deep proficiency in BigQuery, Google Dataflow, Google Cloud Storage, and Dataform

Expert-level Python and SQL are mandatory. Python is our primary language for all custom processing and orchestration

Proficiency with Bitbucket (Git-flow), Confluent Kafka, Dataplex, and enterprise orchestration tools (e.g., Tidal)

Understanding of PII and how to protect both customer and company data

Bachelor’s degree in Computer Science, Engineering, or a related technical field

We'd love to hear from you if you have...

Experience in a "Data Engineering as a Service" environment, building pipelines to satisfy requirements from Data Science or Analytics stakeholders

Experience working with high-volume retail data (e.g., e-commerce, supply chain, or loyalty)

Familiarity with Java for specific edge-case data processing tasks is a plus

Strong understanding of cloud cost optimization and resource attribution
