
Data Engineer

Insight Global • In Person

Posted 12 hours, 11 minutes ago

Job Description

3-month contract + extensions

Hybrid: on-site 3x per week in Toronto

Pay: $40-55/hr (incorporated)

*THIS ROLE REQUIRES AN IN-PERSON INTERVIEW THIS WEEK OR NEXT IN TORONTO - PLEASE DO NOT APPLY IF YOU'RE NOT AVAILABLE TO COME ON-SITE FOR AN INTERVIEW*

Must haves

5+ years of Python development with strong ETL and data pipeline experience

5+ years of experience with Snowflake

Experience with SQL Server, PostgreSQL, and S3 data sources

Hands-on experience with Apache Airflow, including DAG authoring, scheduling, and troubleshooting (see the DAG sketch after this list)

Proficiency with Docker and container orchestration (OpenShift or Kubernetes)

Experience building and optimizing data pipelines with PySpark

Familiarity with CI/CD tools such as GitHub Actions

Strong Linux and shell scripting skills

Excellent communication and documentation abilities
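
For illustration only, a minimal sketch of the kind of DAG authoring this role calls for, assuming a recent Airflow 2.x release; the DAG id, task names, schedule, and callables below are hypothetical placeholders, not details from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull rows from a source such as PostgreSQL, SQL Server, or S3.
    print("extracting...")


def load():
    # Placeholder: write transformed rows into Snowflake.
    print("loading...")


with DAG(
    dag_id="example_daily_etl",       # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day (Airflow 2.4+ parameter name)
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task         # extract runs before load
```

In a real pipeline the Python callables would typically be replaced with operators or hooks for the actual sources and the Snowflake target.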

Job Description

Insight Global is seeking a skilled Data Engineer to join our team and support enterprise-level data engineering initiatives. This individual will focus on developing, maintaining, and optimizing data pipelines, working with distributed systems, and ensuring efficient data flow across multiple platforms. The role requires expertise in Python, Airflow, and PySpark, as well as hands-on experience with cloud and containerized environments. The ideal candidate will be a strong problem solver with excellent technical and communication skills, capable of collaborating across teams and delivering high-quality solutions in a fast-paced environment.

Responsibilities

Design, build, and maintain scalable ETL processes and data pipelines

Develop and manage DAGs in Apache Airflow for scheduling and workflow automation

Work with containerized applications using Docker and Kubernetes/OpenShift

Build and optimize data workflows leveraging SQL and PySpark (a brief PySpark sketch follows this list)

Collaborate with cross-functional teams to ensure reliable and efficient data delivery

Troubleshoot and resolve issues related to data integration, pipeline performance, and infrastructure

Contribute to CI/CD pipelines and version control processes

Document processes, workflows, and system configurations
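
As a rough illustration of the PySpark pipeline work listed above, here is a minimal sketch that reads a hypothetical S3 path, aggregates, and writes partitioned Parquet; the bucket, paths, and column names are assumptions, not details from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_pipeline").getOrCreate()

# Hypothetical raw source on S3 (s3a:// requires the Hadoop AWS connector on the classpath).
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")

# Simple transform: daily order totals per customer.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))   # assumed timestamp column
    .groupBy("order_date", "customer_id")
    .agg(F.sum("amount").alias("daily_amount"))
)

# Write curated output partitioned by date (hypothetical target path).
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/daily_totals/"
)

spark.stop()
```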
