We are looking for an experienced Data Engineer skilled in building scalable ELT pipelines using Snowflake, dbt or Matillion, and modern cloud data platforms. This role involves designing ingestion frameworks, implementing data models, optimizing performance, and delivering production-ready data solutions for client environments.
You will work closely with cross-functional teams to enable data-driven decision-making and support consulting engagements through clear documentation and high-quality deliverables.
Key Responsibilities
Data Pipeline & ELT Development
Design and build scalable ELT pipelines using dbt or Matillion on Snowflake.
Ingest data from relational databases, APIs, cloud storage, and flat files into Snowflake.
Implement data models across staging, intermediate, and mart layers, or within a medallion architecture.
Apply dbt best practices: modular SQL, testing, documentation, and version control (a minimal model sketch follows this list).
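To give candidates a concrete sense of the style expected, here is a minimal sketch of a modular dbt staging model. The source (raw_crm), model (stg_customers), and column names are hypothetical and assume a matching sources.yml declaration; this is an illustration, not a prescribed implementation.

```sql
-- models/staging/stg_customers.sql  (hypothetical model; source and column names assumed)
-- Staging models are materialized as views and rename raw columns in one place.
{{ config(materialized='view') }}

with source as (
    select * from {{ source('raw_crm', 'customers') }}
),

renamed as (
    select
        id         as customer_id,
        full_name  as customer_name,
        created_at as created_at_utc
    from source
)

select * from renamed
```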
Orchestration & Workflow Management
Use tools like Airflow, dbt Cloud, or Azure Data Factory to schedule and monitor workflows.
Apply CI/CD practices and Git-based collaboration for deployments.
Snowflake & Performance Optimization
Optimize Snowflake performance: clustering keys, query profiling, materializations, and SQL tuning.
Manage warehouse sizing, data loading, and query performance (see the tuning sketch below).
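The statements below sketch the kind of Snowflake tuning work involved; table and warehouse names (fct_orders, transform_wh) are hypothetical.

```sql
-- Illustrative Snowflake tuning statements; object names are hypothetical.

-- Cluster a large fact table on the columns most queries filter by.
alter table analytics.marts.fct_orders cluster by (order_date, region);

-- Inspect clustering depth to confirm the key is effective.
select system$clustering_information('analytics.marts.fct_orders', '(order_date, region)');

-- Right-size the transform warehouse and suspend it when idle to control cost.
alter warehouse transform_wh set
    warehouse_size = 'MEDIUM'
    auto_suspend   = 60
    auto_resume    = true;
```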
Collaboration & Consulting
Work closely with data analysts, scientists, and architects to understand data needs.
Deliver client-ready documentation, demos, and solution walkthroughs.
Participate in Agile ceremonies (stand-ups, sprint planning, retrospectives).
Code Quality & Governance
Develop clean, well-documented, version-controlled code.
Maintain data quality using dbt tests and validation frameworks (a singular-test sketch follows this list).
Ensure secure handling of data through RBAC, permissions, and privacy standards.
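As one example of dbt-based data quality checks, a singular test like the sketch below fails whenever the query returns rows; the test and model names are hypothetical.

```sql
-- tests/assert_no_negative_order_totals.sql  (hypothetical singular dbt test)
-- dbt flags this test as failed if the query returns any rows.
select
    order_id,
    order_total
from {{ ref('fct_orders') }}
where order_total < 0
```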
Required Qualifications
3–5 years of experience in Data Engineering.
2+ years hands-on experience with Snowflake.
Experience with dbt or Matillion (Matillion-DPC preferred).
Strong SQL skills and deep understanding of ELT principles.
Experience deploying dbt models in production.
Experience with Git, CI/CD workflows, and cloud orchestrators (dbt Cloud/Airflow/ADF).
Knowledge of data modeling (Kimball/Dimensional preferred).
Familiarity with data quality checks, documentation, and testing in dbt.
Core Competencies
Data Engineering
Building modular, scalable data pipelines using dbt.
Writing optimized SQL for transformations and Snowflake performance tuning.
Cloud Data Platforms
Strong knowledge of Snowflake features: virtual warehouses, query profiling, data loading, and optimization.
Experience with cloud storage such as Azure Data Lake, AWS S3, or GCS (a loading sketch follows this list).
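A typical cloud-storage load into Snowflake might look like the sketch below; the stage, table, and bucket names are hypothetical, and credentials or a storage integration are omitted for brevity. GCS and Azure Data Lake stages follow the same pattern.

```sql
-- Illustrative load from cloud storage; all names are hypothetical.
create or replace stage raw.s3_landing
    url = 's3://example-bucket/landing/'
    file_format = (type = csv field_optionally_enclosed_by = '"' skip_header = 1);

-- Bulk-load the staged customer files into a raw table.
copy into raw.customers
    from @raw.s3_landing/customers/
    on_error = 'abort_statement';
```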
Technical Skillset
SQL (advanced level)
Python (for automation, transformations, notebooks)
Snowflake
dbt or Matillion
Azure Data Factory / Airflow / dbt Cloud
Git & CI/CD
Security & Compliance
Understanding of Snowflake RBAC, secure data handling, encryption, and GDPR basics (see the RBAC sketch below).
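For context, Snowflake RBAC work in this role resembles the sketch below; the role, user, and object names (analyst_ro, jane_doe, analytics.marts) are hypothetical.

```sql
-- Illustrative Snowflake RBAC grants; role, user, and object names are hypothetical.
create role if not exists analyst_ro;

-- Read-only access to the published marts schema.
grant usage  on database analytics       to role analyst_ro;
grant usage  on schema   analytics.marts to role analyst_ro;
grant select on all tables    in schema analytics.marts to role analyst_ro;
grant select on future tables in schema analytics.marts to role analyst_ro;

-- Assign the role to a user (or, better, to a parent functional role).
grant role analyst_ro to user jane_doe;
```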
Deployment \& Monitoring
CI/CD pipelines for data engineering workflows.
Monitoring, logging, and alerting for pipeline execution.
Soft Skills
Strong communication and ability to present solutions to clients.
Experience working with global teams.
Clear documentation and Agile/Scrum familiarity.
Ability to manage changing priorities in fast-paced environments.
Education
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related fields.
Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.
Mandatory Skills
Snowflake
dbt or Matillion (Matillion-DPC preferred)
SSIS
SQL, ELT/ETL, Azure Cloud, ADF, Data Warehousing, Databricks (preferred)
Job Types: Full-time, Permanent
Pay: ₹1,200,000.00 - ₹1,500,000.00 per year