Role Overview
As a Data Engineer, you will play a vital role in designing, developing, and maintaining robust data pipelines that support analytics, reporting, and AI solutions. You’ll work closely with data scientists, analysts, and business stakeholders to translate complex requirements into efficient, high-quality data systems. You’ll implement automation, enforce data quality, and contribute to the continuous improvement of the company’s data ecosystem.
Key Responsibilities
→ Design, build, and maintain scalable data pipelines for ingestion, transformation, and processing of structured and unstructured data.
→ Collaborate with cross-functional teams to understand business and analytical needs.
→ Optimize data workflows for reliability, scalability, and performance.
→ Implement quality checks, validation rules, and monitoring processes to ensure data integrity.
→ Work with modern cloud environments such as Azure, AWS, or GCP to deploy and manage solutions.
→ Use tools and frameworks like Databricks, Apache Spark, Airflow, and Terraform to deliver production-grade data pipelines (see the illustrative sketch after this list).
→ Contribute to DevOps practices by applying CI/CD and Infrastructure-as-Code principles.
→ Support real-time and batch data processing initiatives, including streaming technologies such as Kafka or Event Hubs.
→ Document and share knowledge to foster collaboration and best practices within the engineering team.
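For illustration only, here is a minimal sketch of the kind of orchestrated pipeline this role involves, assuming a recent Airflow 2.x installation with the TaskFlow API; the DAG, task, and record names are hypothetical examples, not a description of this company's actual pipelines.

```python
# Illustrative Airflow 2.x (TaskFlow API) DAG -- all names are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_events_pipeline():
    @task
    def ingest() -> list[dict]:
        # Placeholder: in practice, pull from an API, database, or object store.
        return [{"event_id": 1, "value": 42.0}]

    @task
    def validate(records: list[dict]) -> list[dict]:
        # Simple data-quality gate: fail fast on malformed records.
        for record in records:
            if "event_id" not in record or "value" not in record:
                raise ValueError(f"Malformed record: {record}")
        return records

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write to a warehouse or lakehouse table here.
        print(f"Loaded {len(records)} records")

    load(validate(ingest()))


daily_events_pipeline()
```

In practice the validation step would usually be backed by a dedicated data-quality framework and the infrastructure provisioned via Terraform; the sketch only illustrates the ingest-validate-load shape of a typical pipeline.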
Candidate Profile
→ Minimum of 2 years of experience as a Data Engineer or in a similar role.
→ Proficiency in programming languages such as Python, SQL, or Scala.
→ Hands-on experience with data pipeline frameworks (e.g., Spark, Databricks, Airflow).
→ Familiarity with cloud platforms (Azure, AWS, or GCP) and their data services (e.g., Azure Data Factory, AWS Glue, BigQuery).
→ Knowledge of data modeling, ETL/ELT processes, and architecture design.
→ Experience with both relational and non-relational databases (SQL and NoSQL).
→ Understanding of data governance, security, and quality management principles.
→ Strong analytical mindset and problem-solving skills.
→ Clear communication skills, with the ability to explain technical concepts to non-technical audiences.
→ Fluent in English; proficiency in Dutch is an advantage.
→ Must already reside in the Netherlands and hold a valid Dutch work permit.
Preferred Qualifications:
→ Experience with streaming data tools such as Kafka or Event Hubs (see the sketch after this list).
→ Certifications in Azure, AWS, or GCP.
→ Familiarity with version control using Git and GitHub, and with Infrastructure-as-Code tools such as Terraform.