Role: Data Engineer
Type: Contract-to-Hire (3-month CTH)
Location: Remote – Mexico
We’re looking for a hands-on Data Engineer to design, build, and maintain reliable data infrastructure in a fast-paced environment. This role owns the end-to-end data pipeline lifecycle, delivering high-quality, analytics-ready data using Snowflake, dbt, SQL, and Python.
What You’ll Do
- Build and maintain scalable ELT pipelines ingesting data from multiple sources into Snowflake
- Develop modular dbt models with strong testing, documentation, and code quality standards (a minimal dbt sketch follows this list)
- Leverage Snowflake features (Streams, Tasks, Snowpark) and Python (3.10/3.11) for efficient data processing (see the stream-and-task sketch after this list)
- Implement data quality and governance checks to ensure accuracy, consistency, and reliability
- Partner with analytics and business stakeholders to translate requirements into technical solutions
- Continuously evaluate and adopt tools to optimize and evolve the data stack
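
For illustration, here is a minimal sketch of the kind of modular, incremental dbt model this role would build and test. The source, table, and column names are hypothetical, not taken from any existing project.

```sql
-- Hypothetical dbt incremental model (models/staging/stg_orders.sql).
-- All object and column names are illustrative.
{{ config(materialized='incremental', unique_key='order_id') }}

with source as (

    select
        order_id,
        customer_id,
        order_status,
        order_total,
        _loaded_at
    from {{ source('raw', 'orders') }}
    {% if is_incremental() %}
    -- on incremental runs, only pick up rows loaded since the last run
    where _loaded_at > (select max(_loaded_at) from {{ this }})
    {% endif %}

)

select * from source
```

In practice a model like this would be paired with schema tests (unique, not_null) and documentation in the accompanying YAML file, in line with the testing and documentation standards above.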
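
Likewise, a rough sketch of the Snowflake Streams and Tasks pattern referenced above; the database, schema, table, and warehouse names are assumptions.

```sql
-- Hypothetical stream + task pair for incremental change capture.
-- Database, schema, table, and warehouse names are assumptions.
create or replace stream raw.public.orders_stream
  on table raw.public.orders;

create or replace task analytics.public.load_orders
  warehouse = transform_wh
  schedule = '15 minute'
  when system$stream_has_data('raw.public.orders_stream')
as
  insert into analytics.public.orders_staged (order_id, customer_id, order_total, change_type)
  select order_id, customer_id, order_total, metadata$action
  from raw.public.orders_stream;

-- tasks are created suspended; resume to start the schedule
alter task analytics.public.load_orders resume;
```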
What You Bring
- Advanced SQL skills, including writing and tuning highly optimized queries
- Production experience with dbt (models, tests, documentation)
- Strong expertise in Snowflake architecture, data loading, and performance tuning
- Proficiency in Python for data processing and transformation
- Solid foundation in dimensional modeling and data warehousing best practices (see the star-schema sketch after this list)
- Ability to work independently, manage a backlog, and own delivery end to end
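
As a rough illustration of the dimensional-modeling expectation, a star-schema query against assumed fact and dimension tables (all names hypothetical):

```sql
-- Hypothetical star-schema query: monthly revenue by customer region.
-- fct_orders, dim_customer, and dim_date are assumed warehouse tables.
select
    d.calendar_month,
    c.region,
    sum(f.order_total)         as total_revenue,
    count(distinct f.order_id) as order_count
from analytics.fct_orders f
join analytics.dim_customer c on f.customer_key = c.customer_key
join analytics.dim_date d     on f.order_date_key = d.date_key
where d.calendar_year = 2024
group by d.calendar_month, c.region
order by d.calendar_month, c.region;
```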
Nice to Have
- SnowPro Core or Advanced Data Engineer certification
- Experience with orchestration tools (Airflow, Dagster, Prefect)
- Familiarity with CI/CD for data engineering (GitHub Actions, GitLab CI)
- Experience handling semi-structured data (JSON, XML) in Snowflake (see the VARIANT sketch below)
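
Finally, a small sketch of the kind of semi-structured handling mentioned above, flattening a JSON payload stored in a Snowflake VARIANT column; the table and field names are assumptions.

```sql
-- Hypothetical query over a raw events table with a VARIANT payload column.
-- Table and field names are illustrative.
select
    e.payload:event_id::string  as event_id,
    e.payload:user.id::number   as user_id,
    i.value:sku::string         as sku,
    i.value:quantity::number    as quantity
from raw.public.events e,
     lateral flatten(input => e.payload:items) i
where e.payload:event_type::string = 'purchase';
```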