
Data Engineer

Koda Staff • 🌐 In Person


Job Description


The Data Engineer plays a key role in organization-wide data transformation projects. You will be responsible for developing and automating data processing pipelines for data modeling, analysis, and reporting from various data sources. The primary responsibility of this role is to help set up the Delta Lake architecture and deliver data-driven solutions.

Responsibilities

Build data flows for data acquisition, aggregation, and modeling, using both batch and streaming paradigms.

Develop high-quality code for the core data stack, including the data integration hub, data warehouse, and data pipelines under the Azure environment.

Collaborate with other developers as part of a Scrum team to ensure overall team productivity.

Perform data analysis, data profiling, data cleansing, data lineage, data mapping, and data transformation.

Assist in designing, developing, documenting, and implementing end-to-end data pipelines and data-driven solutions.

Provide technical support for data-related issues with recommendations and solutions.

Critically analyze information needs.

Help define KPIs, set up monitoring, and implement alerting at the data level.

Profile

You hold a Bachelor’s or Master’s degree.

You have at least 3 years of experience in a similar role.

You are strong in detailed analysis but can also translate your findings into clear, practical synthesis and implementation.

Knowledge of and/or experience with the following:

Must have:

Microsoft Azure Data Platform (Azure Delta Lake, Databricks, Azure Data Factory, Event Hub, Debezium)

Must have:

Expertise in building ETL and data pipelines on Databricks using data engineering languages such as Python and SQL on Azure; on-premise Microsoft SQL Server (including Transact-SQL, stored procedures, Analysis Services, indexing, etc.)

Must have:

Azure DevOps – CI/CD implementation, automation of Azure Data Factory deployments

BI concepts and implementation, preferably star schema modeling

You have a good understanding of the Microsoft Fabric suite and can critically compare this technology with alternatives such as Databricks within a modern data architecture.

You can work independently.

You are collegial and contribute actively to the team's thinking.

You are capable of taking a critical perspective and discussing substantive topics (related to management information) at all levels within the organization.

You handle deadlines and priorities well and work in an Agile way.

You are fluent in French.
