Build, Scale & Operate Leading DTC Brands alongside A-Players
===================================================================
Maneuver Marketing
Our Vision, Mission & Success are fuelled by our commitment to be a driving force for positive change in the health of everyday consumers, providing conscious, high-quality & innovative supplement products.
In just 5 years, we launched our own DTC Health & Wellness brand from scratch and scaled it to US$100M+ in annual sales, serving more than 3,000,000 customers worldwide with an average of 4,000 daily orders across 9 SKUs.
These results caught the attention of the Financial Times, which ranked us among APAC's top high-growth companies. We were also awarded 2nd place at the E50 Awards, jointly organised by The Business Times and KPMG in Singapore.
This is just the beginning of our journey, and you could be part of the next stage of our growth!
The Role
============
We are seeking a part-time Data Engineer to provide ongoing operational support for our data warehouse infrastructure. This role is focused on data reliability, proactive monitoring, incident response, and continuous platform improvement, ensuring business teams can confidently rely on data for decision-making.
Time Commitment & Availability
===================================
Expected commitment: 15–20 hours per week (flexible schedule)
Preferred availability: Singapore business hours (9:00 AM – 6:00 PM SGT) for real-time collaboration
Response time expectations:
- P0 (Critical): Acknowledgement within 2 hours on business days
- P1 (High): Acknowledgement within 4 hours on business days
- P2 (Standard): Acknowledgement within 24 hours
Technology Stack
====================
Data Warehouse: Google BigQuery
ETL / Data Movement: Daton (primary), custom pipelines, dbt
BI / Activation: Looker, Qlik, Segment (in progress)
Source Systems: 30+ platforms, including Shopify, GA4, Meta Ads, Google Ads, Klaviyo, Loop Subscriptions, Recart, Postscript, and PayPal
Monitoring & Alerting: Slack alerts, custom monitoring framework, email notifications
Core Responsibilities
=========================
Data Reliability & Pipeline Monitoring
Ensure data pipelines run reliably and data is fresh, accurate, and available as expected.
- Build, monitor, and respond to Daton pipeline notifications and alerts
- Track data latency, freshness, and completeness across all source systems (a freshness-check sketch follows this list)
- Design, build, and maintain QC processors for all source data and custom reports
- Monitor job execution, investigate failures, and perform root cause analysis at:
  - Pipeline level
  - QC / validation level
  - API / source system level
- Create and enhance data pipelines, onboard new platform integrations, and implement logic changes to existing pipelines
- Coordinate with source system owners and vendors when issues originate upstream
- Monitor alerts from source systems and custom reports
- Ensure overall data reliability through proactive monitoring and validation
- Optimize query performance and warehouse costs
- Maintain documentation for all logic, schema, and pipeline changes, with a continuously updated change log
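For context, here is a minimal sketch of the kind of freshness check this role maintains, written in Python against BigQuery. The table names, timestamp columns, and staleness thresholds are illustrative assumptions, not our actual schema:

```python
"""Hypothetical freshness check: flags tables whose newest row is older
than an expected threshold. Table names, timestamp columns, and the
thresholds below are illustrative assumptions."""
from google.cloud import bigquery

# (table, timestamp column, max allowed staleness in hours) -- assumed values
FRESHNESS_RULES = [
    ("shop.shopify_orders", "updated_at", 6),
    ("ads.meta_ads_insights", "date_start", 30),
]

def check_freshness(client: bigquery.Client) -> list[str]:
    """Return a human-readable alert line for each stale table."""
    alerts = []
    for table, ts_col, max_hours in FRESHNESS_RULES:
        sql = f"""
            SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(),
                                  MAX(CAST({ts_col} AS TIMESTAMP)),
                                  HOUR) AS lag_hours
            FROM `{table}`
        """
        lag = next(iter(client.query(sql).result())).lag_hours
        if lag is None or lag > max_hours:
            alerts.append(f"{table}: last row {lag} h old (limit {max_hours} h)")
    return alerts

if __name__ == "__main__":
    for line in check_freshness(bigquery.Client()):
        print(f"STALE  {line}")  # in production this would be routed to Slack
```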
Data Quality & Validation
Implement and maintain automated data quality checks (source + reports) to build trust and confidence in data across the organization.
Key Activities:
- Monitor and respond to data quality and test failures
- Implement automated validation checks (a sketch follows this list), including:
  - Null checks
  - Duplicate detection
  - Range and boundary checks
  - Valid value checks
  - Referential integrity checks
- Implement business-logic validations for key KPIs
- Perform daily validation of critical metrics against source UIs (Shopify, GA4, Meta, Klaviyo, Google Ads, Loop, etc.)
- Ensure KPI consistency across raw, transformed, and reporting layers
- Implement anomaly detection for key tables and metrics
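As a rough illustration, checks like those above can be expressed as SQL assertions that return zero rows when the data is healthy. The table and column names below are assumptions made for the sake of the example:

```python
"""Hypothetical automated validation checks, expressed as SQL assertions
that return zero rows when the data is healthy. Table and column names
are assumptions, not the real warehouse schema."""
from google.cloud import bigquery

# Each check is (name -> SQL that returns the offending rows).
CHECKS = {
    "orders_null_customer": """
        SELECT order_id FROM `shop.shopify_orders`
        WHERE customer_id IS NULL               -- null check
    """,
    "orders_duplicate_id": """
        SELECT order_id, COUNT(*) AS n FROM `shop.shopify_orders`
        GROUP BY order_id HAVING n > 1          -- duplicate detection
    """,
    "orders_negative_total": """
        SELECT order_id FROM `shop.shopify_orders`
        WHERE total_price < 0                   -- range / boundary check
    """,
    "orders_invalid_status": """
        SELECT order_id FROM `shop.shopify_orders`
        WHERE financial_status NOT IN
              ('paid', 'pending', 'refunded', 'voided')  -- valid value check
    """,
}

def run_checks(client: bigquery.Client) -> dict[str, int]:
    """Run each assertion and return the number of offending rows."""
    return {name: sum(1 for _ in client.query(sql).result())
            for name, sql in CHECKS.items()}

if __name__ == "__main__":
    for name, bad_rows in run_checks(bigquery.Client()).items():
        status = "PASS" if bad_rows == 0 else f"FAIL ({bad_rows} rows)"
        print(f"{name}: {status}")
```

In practice these assertions would typically live in dbt tests or the custom QC framework; the standalone script above is only meant to show the shape of the checks.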
Cost Optimization
Optimize warehouse performance and manage costs proactively to ensure sustainable data operations.
Key Activities:
Monitor and respond to billing alerts for BigQuery, dbt, and ETL tools
Maintain cost monitoring dashboards
Implement and optimize table partitioning and clustering
Optimize incremental loads and expensive queries
Proactively flag high-cost queries via Slack
Query performance optimization (where applicable)
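As an example of what the Slack flagging might look like, the sketch below pulls the last day's most expensive jobs from BigQuery's INFORMATION_SCHEMA. The region qualifier, cost threshold, per-TiB price, and webhook URL are all placeholder assumptions:

```python
"""Hypothetical high-cost query alert: pulls the most expensive BigQuery
jobs of the last 24 hours and posts them to Slack. Region, threshold,
per-TiB price, and webhook URL are placeholder assumptions."""
import json
import urllib.request

from google.cloud import bigquery

SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder
COST_PER_TIB_USD = 6.25   # on-demand list price; confirm for your account
THRESHOLD_USD = 5.0

SQL = """
    SELECT user_email,
           LEFT(query, 120)                               AS query_preview,
           IFNULL(total_bytes_billed, 0) / POW(1024, 4)   AS tib_billed
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
      AND job_type = 'QUERY'
    ORDER BY tib_billed DESC
    LIMIT 10
"""

def flag_expensive_queries() -> None:
    client = bigquery.Client()
    lines = []
    for row in client.query(SQL).result():
        cost = row.tib_billed * COST_PER_TIB_USD
        if cost >= THRESHOLD_USD:
            lines.append(f"${cost:.2f}  {row.user_email}: {row.query_preview}")
    if lines:
        payload = {"text": "High-cost queries (last 24 h):\n" + "\n".join(lines)}
        req = urllib.request.Request(
            SLACK_WEBHOOK,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

if __name__ == "__main__":
    flag_expensive_queries()
```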
Source System Monitoring & API Integration Management
Proactively manage issues originating from upstream systems and maintain healthy integrations.
Key Activities:
Monitor and respond to source schema and data-type changes
Handle source delays caused by API limits, downtime, or auth failures
Coordinate with vendors and internal teams to resolve upstream issues
Assess business impact and classify incidents as P0/P1 when required
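One way schema and data-type changes could be detected, sketched under assumed names: snapshot a dataset's column types from INFORMATION_SCHEMA and diff against the previous run. The dataset name and snapshot path are assumptions:

```python
"""Hypothetical schema-drift check: snapshots a dataset's column types
from INFORMATION_SCHEMA and diffs them against the previous snapshot.
The dataset name and the snapshot file path are assumptions."""
import json
from pathlib import Path

from google.cloud import bigquery

DATASET = "shop"                        # assumed dataset
SNAPSHOT = Path("schema_snapshot.json") # assumed local state file

def current_schema(client: bigquery.Client) -> dict[str, str]:
    """Map 'table.column' -> data type for every column in the dataset."""
    sql = f"""
        SELECT table_name, column_name, data_type
        FROM `{DATASET}`.INFORMATION_SCHEMA.COLUMNS
    """
    return {f"{r.table_name}.{r.column_name}": r.data_type
            for r in client.query(sql).result()}

def diff_schemas(old: dict[str, str], new: dict[str, str]) -> list[str]:
    """List dropped, added, and retyped columns between two snapshots."""
    changes = []
    for col in sorted(old.keys() | new.keys()):
        if col not in new:
            changes.append(f"DROPPED  {col} ({old[col]})")
        elif col not in old:
            changes.append(f"ADDED    {col} ({new[col]})")
        elif old[col] != new[col]:
            changes.append(f"RETYPED  {col}: {old[col]} -> {new[col]}")
    return changes

if __name__ == "__main__":
    new = current_schema(bigquery.Client())
    old = json.loads(SNAPSHOT.read_text()) if SNAPSHOT.exists() else new
    for change in diff_schemas(old, new):
        print(change)                   # would be routed to Slack in practice
    SNAPSHOT.write_text(json.dumps(new, indent=2))
```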
Security & Compliance
Ensure data access and handling align with regulatory requirements and security best practices.
Key Activities:
Maintain GDPR, CCPA, and related compliance controls
Manage RBAC and column-level security in BigQuery
Ensure PII masking and access restrictions
Respond to security incidents related to data access or credentials
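To illustrate the masking requirement, here is a minimal sketch of a masked reporting view, assuming hypothetical `raw.customers` and `reporting.customers_masked` objects:

```python
"""Hypothetical PII-masking layer: a view that exposes hashed emails and
redacted phone numbers to reporting users, while the raw table stays
restricted. Dataset, table, and column names are assumptions."""
from google.cloud import bigquery

# Analysts query the view; only the pipeline service account can read
# the underlying raw table (enforced separately via dataset-level IAM).
MASKED_VIEW_DDL = """
    CREATE OR REPLACE VIEW `reporting.customers_masked` AS
    SELECT
        customer_id,
        TO_HEX(SHA256(LOWER(email)))  AS email_hash,   -- joinable, not readable
        'REDACTED'                    AS phone,        -- fully suppressed
        country,
        created_at
    FROM `raw.customers`
"""

if __name__ == "__main__":
    bigquery.Client().query(MASKED_VIEW_DDL).result()
    print("reporting.customers_masked refreshed")
```

Hashing rather than dropping the email column keeps it usable as a join key across reporting tables without exposing the raw address.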
Documentation & Change Management
Ensure operational continuity and knowledge transfer.
Key Activities:
- Maintain documentation for pipelines, tables, and business logic
- Update test cases for logic or schema changes
- Document incident RCA and resolutions
- Maintain operational runbooks
- Manage logic and schema change requests from business teams
Required Technical Skills
=============================
- Strong Google BigQuery expertise (SQL optimization, partitioning, clustering)
- Experience with ETL tools (Daton, Fivetran, or similar)
- Pipeline monitoring and alerting experience
- Strong SQL skills for debugging and validation
- E-commerce data experience (Shopify, GA4, and ad platforms preferred)
Required Professional Skills
================================
- Experience maintaining production data systems
- Strong troubleshooting and RCA skills
- Clear communication with technical and non-technical stakeholders
- A proactive, ownership-driven mindset
- Ability to work independently in a remote setup
- Strong documentation discipline