Senior Data Engineer – Snowflake / dbt / AWS Glue
A strategic role in the insurance industry for an experienced data engineer. Leverage Snowflake, Python, SQL, AWS, and dbt to build scalable data ecosystems and enable AI capabilities. Enjoy a hybrid environment and influence enterprise-level data architecture and engineering standards across Canada.
What is in it for you:
Salaried rate: $110-$120 per hour.
Incorporated business rate: $90-$100 per hour.
9-month contract.
Full-time position: 37.50 hours per week.
Hybrid model: 3 days per week on-site, subject to change.
Responsibilities:
Advocate for engineering standards and process efficiencies to ensure high-quality, timely delivery.
Lead the development of complex, high-performance data pipelines using tools like dbt Core/Cloud.
Design and review conceptual, logical, and physical data models based on business and technical requirements.
Own and maintain robust, reusable code in SQL, Python, Shell, and Terraform.
Develop detailed low-level engineering solution design documents aligned with technical standards.
Create and review data test plans to ensure solution quality.
Promote the use of data catalogs, data governance, and data quality practices.
Conduct root cause analysis and implement effective solutions for technical data issues.
Lead scrum ceremonies and foster an agile mindset across the team.
Mentor and support junior data engineers to elevate engineering practices.
Collaborate across functions to deliver customer-centric data products.
Drive technical presentations and offer constructive feedback on data designs and processes.
Plan and execute data release activities for smooth, high-performance delivery.
Support talent acquisition through interview design, participation, and hiring decisions.
What you will need to succeed:
Bachelor's degree or higher in Computer Science, Engineering, or related field.
8+ years of professional experience in data engineering with a track record of delivering 3+ full-cycle high-impact data projects.
Certification(s) in any of the following are considered assets:
SnowPro Core
SnowPro Advanced: Data Engineer (DEA-C01 or DEA-C02)
dbt Developer
AWS Cloud Practitioner
Expert-level coding in SQL, Python, AWS Glue, dbt, Shell, and Terraform, with a focus on maintainability and performance.
Deep expertise in relational (Snowflake, PostgreSQL, Amazon Aurora), big data (Hadoop), and NoSQL (MongoDB) platforms.
Proficiency with data visualization tools: Snowsight, Streamlit, Qlik, SAP BusinessObjects.
Experience with data orchestration and pipeline tools such as Zena and AWS Managed Airflow.
High resilience and adaptability in ambiguous or high-pressure environments.
Strong collaboration and communication skills with ability to influence stakeholders and lead teams.
A customer-first mindset driven by data insights.
Insurance industry knowledge is an asset.
Experience with AI/ML model operationalization is an asset.
Why Recruit Action?
Recruit Action (agency permit: AP-2504511) provides recruitment services through quality support and a personalized approach to job seekers and businesses. Only candidates who match hiring criteria will be contacted.
# AVICJP00002846
Job Types: Full-time, Fixed term contract
Contract length: 9 months
Expected hours: 37.50 per week
Benefits:
Work from home
Application question(s):
Are you available to work on-site 3 days per week in Markham?
Experience:
Data engineering delivering end-to-end projects: 8 years (required)
Snowflake modeling, tuning, and pipelines: 5 years (required)
Coding ETL/ELT in AWS Glue (Python, Jobs, Catalog, Workflows): 3 years (required)
Writing advanced SQL and Python for data processing: 8 years (required)
Terraform and infrastructure-as-code: 3 years (required)
dbt (Core/Cloud) for complex transformations and testing: 3 years (required)
Work Location: Hybrid remote in Markham, ON