đŸ‘šđŸ»â€đŸ’» postech.work

Data Engineer

Soho Square Solutions ‱ In Person

Posted 23 hours, 45 minutes ago

Job Description

Job Title: Senior Data Engineer

Experience Level: Level 4 (Advanced) – 7 to 10 years

Employment Type: Fixed-Term 1-Year Contract

Location: Montreal, QC

Work Model: Day 1 onsite onboarding; in-office presence 3 days per week

We are seeking an experienced Senior Data Engineer to join the Corporate Workspace Technology team, supporting the modernization of enterprise data tooling and enablement platforms.

In this role, you will be responsible for the design, development, optimization, and management of cloud-based data warehouse solutions using Snowflake, along with building and maintaining robust data pipelines utilizing Talend, Informatica, and APIs.

The ideal candidate is a self-driven, collaborative, and strategic thinker who thrives in a fast-paced environment and demonstrates strong ownership of data engineering solutions from design through production.

Key Responsibilities

- Design, implement, and manage scalable data solutions in a Snowflake environment for optimal data storage, performance, and cost efficiency.
- Migrate existing data domains and workflows from on-premises or relational data stores to a cloud-based Snowflake architecture.
- Analyze, identify, and optimize new and existing data workflows for performance and reliability.
- Define and implement data integrity, quality, and validation practices across data pipelines.
- Integrate data governance, metadata, and data science tools within the Snowflake ecosystem.
- Develop and maintain data models and ETL processes to support high-quality data ingestion and transformation.
- Collaborate closely with engineering, analytics, and business teams to design and deliver effective data workflows.
- Monitor, maintain, and optimize Snowflake environments to improve query performance and reduce operational costs.
- Contribute to proofs of concept (POCs), technical documentation, architectural standards, and best practices.
- Participate in code reviews, providing constructive feedback to enhance team quality and consistency.
- Design and develop data ingestion pipelines using Talend and/or Informatica, following industry best practices.
- Write efficient SQL and Python scripts for large-scale data processing and scheduled automation.
- Design and implement data distribution layers using Snowflake REST APIs.
- (Nice to have) Develop semantic models in Snowflake and support visualization layers in Power BI.

Skills & Qualifications

Required:

- Bachelor’s degree in Computer Science, Software Engineering, Information Technology, MIS, or a related field (Master’s preferred).
- 7–10 years of hands-on experience in data engineering, data modeling, and data warehouse development, preferably on Snowflake.
- Strong experience with Snowflake, including querying, modeling, and performance optimization.
- Hands-on experience with Snowflake REST APIs.
- Proven experience with Informatica and Talend ETL tools.
- Advanced SQL skills and strong Python scripting experience.
- Experience working with distributed systems and large-scale datasets.
- Proficiency in relational databases (e.g., DB2) with a focus on transformations and analytics.
- Strong understanding of data modeling concepts, schema design, and cloud data architecture.
- Excellent problem-solving, communication, and collaboration skills, with experience working across multiple teams and regions.

Nice to Have:

- Experience with Power BI and/or Tableau.
- Experience collaborating with data scientists to integrate machine learning models into Snowflake.
- Familiarity with enterprise data governance and metadata management tools.
