
Snowflake Data Engineer (Ingeniero de Datos Snowflake)

Infosys • 🌐 In Person

Posted 2 days, 17 hours ago

Job Description


Location: Mexico (Mexico City, Guadalajara, Monterrey) – Hybrid as per Infosys Mexico policy (CST/EST time)

Role Overview

As a Snowflake Data Engineer, you will design and optimize data pipelines and architectures on the Snowflake cloud platform. The role requires strong SQL skills; expertise in Snowflake features such as Virtual Warehouses, Streams, and Tasks; and experience with ETL/ELT processes for efficient data ingestion and transformation. Key capabilities include data modeling, performance tuning, integration with a variety of data sources and BI tools, and ensuring security and governance compliance.

Key Responsibilities

Design, build, and optimize data pipelines and ETL/ELT processes using Snowflake.

Develop and maintain DBT models (views, tables, incremental models) including tests for data quality.
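
As a rough illustration of this responsibility, a DBT incremental model typically looks like the sketch below. All table and column names (`orders`, `order_id`, `updated_at`) are hypothetical, not taken from this posting.

```sql
-- models/orders_incremental.sql — illustrative dbt model (names are hypothetical)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, process only rows newer than what is already loaded
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Data-quality tests (e.g. `unique` and `not_null` on `order_id`) would then be declared in the model's accompanying YAML file.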

Perform data ingestion from AWS S3 using Snowpipe, COPY command, external/internal stages.
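
For context, this kind of ingestion is commonly set up along these lines — a one-off bulk load with COPY plus continuous loading via Snowpipe. Bucket, stage, table, and integration names here are illustrative assumptions only.

```sql
-- Hypothetical names throughout; credentials come from a storage integration.
create stage raw_stage
  url = 's3://my-bucket/orders/'
  storage_integration = s3_int
  file_format = (type = 'CSV' skip_header = 1);

-- One-off bulk load from the external stage
copy into raw_orders
  from @raw_stage
  pattern = '.*[.]csv';

-- Continuous ingestion: Snowpipe triggered by S3 event notifications
create pipe orders_pipe auto_ingest = true as
  copy into raw_orders from @raw_stage;
```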

Implement performance tuning, query optimization, and Snowflake feature usage (Materialized Views, Secure Views, Streams, Tasks, Time Travel, Zero-Copy Cloning).
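
Of the features listed above, Streams and Tasks are typically combined for incremental change processing, along the lines of the sketch below (object names and the 5-minute schedule are assumptions for illustration).

```sql
-- A stream captures row-level changes; a scheduled task applies them.
create stream orders_stream on table raw_orders;

create task merge_orders
  warehouse = transform_wh
  schedule = '5 minute'
when system$stream_has_data('ORDERS_STREAM')
as
  merge into orders_curated t
  using orders_stream s on t.order_id = s.order_id
  when matched then update set t.order_total = s.order_total
  when not matched then insert (order_id, order_total)
    values (s.order_id, s.order_total);

-- Tasks are created suspended and must be resumed to start running
alter task merge_orders resume;
```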

Manage version control, branching, and CI-related workflows using GitHub.

Work with AWS services such as S3, Lambda, IAM, EC2 for data movement and transformation.

Collaborate with cross-functional stakeholders (data analysts, data scientists, product teams) to deliver high-quality data solutions.

Troubleshoot pipeline, ingestion, and DBT job issues to ensure reliability and SLA adherence.

Basic Qualifications

Bachelor’s degree or foreign equivalent in Computer Science or related field.

7+ years of IT experience, including data engineering and cloud solutions.

Bilingual (Spanish & English) – strong verbal and written communication.

Proven experience implementing AWS services such as S3, Lambda, IAM, EC2 for data movement and transformations.

Mandatory Skills

Snowflake

DBT - Developing DBT models

Building scalable data pipelines

Enabling high-quality transformations

Integrating data across AWS based architectures

Preferred Skills

Familiarity with CI/CD using GitHub.

Strong problem-solving, leadership, and stakeholder management skills.

Other Requirements

Ability to work under tight timelines and manage complex requirements.

Flexibility to work CST/EST hours and to join hand-over activities with the India team when needed.

Self-motivated, proactive, and a strong team player.
