
Freelance Data Engineer (ETL / Data Integration)

digica • 🌐 In Person

Posted 5 days, 5 hours ago

Job Description

Start date: ASAP

Location: Brussels and Flemish Brabant

Freelance Mission Overview

The role focuses on building and maintaining robust data integration pipelines within a modern data architecture. You design and document ETL processes that transform complex business information into reliable physical data models. These models fuel enterprise reporting, data analytics, machine learning, and operational insights. You work across data warehousing environments, ensure performance and quality before go-live, and contribute to Data Vault structures, Data Marts, ODS layers, and Data Lakes on SQL and NoSQL platforms.

Responsibilities

Design, implement, and enhance physical data models and end-to-end data pipelines

Gather integration requirements and translate them into efficient data flows aligned with user needs

Guarantee scalability, performance, accuracy, maintainability, and security

Define and promote standards, tools, and best practices

Apply development practices consistently across all deliverables

Maintain and optimize existing workflows

Investigate issues, communicate with stakeholders, and propose resolutions

Produce and maintain clear documentation

Technical Requirements

Must-have

Strong experience with ETL development (preferably DataStage)

Solid knowledge of relational database platforms (DB2, Netezza)

Excellent SQL skills (DDL, advanced querying, optimization, stored procedures)

Experience with IBM DB2 LUW

Experience with data warehouse appliances (Netezza)

Familiarity with Atlassian tools (Jira, Confluence, Bitbucket)

Experience with workload scheduling (e.g., IBM TWS)

UNIX scripting (KSH, SH, Perl, Python)

Understanding of Agile principles

Nice-to-have

Data modeling approaches (Conceptual, Logical, Physical, Dimensional, Data Vault)

Testing methodologies (scenarios, use cases)

Knowledge of ITIL

OLAP concepts

NoSQL ecosystems

Hadoop stack (HDFS, Spark, HBase, Hive, Sqoop)

Big Data environments

Data Science, Machine Learning, AI concepts

Experience with reporting tools (Tableau)

Soft Skills

Collaborative mindset

Strong analytical and problem-solving abilities

Self-driven and proactive

Ability to manage parallel tasks with limited supervision

Fast learner

Clear communication of complex technical topics to any audience

Ability to understand and translate business needs

Service-oriented

Respect for internal processes and change management rules

Profile & Education

Master’s degree in computer science or equivalent experience

Additional training in technical systems is a plus

Languages

Dutch or French (native level) with good command of the other

Fluent in English
