About The Job
We are seeking a skilled Data Engineer to contribute to a major enterprise real-time data transformation program in Brussels. The project centers on event-driven architecture and will give you the opportunity to work closely with distributed engineering teams across Europe on a mission-critical cloud initiative.
Key Responsibilities
Architect and implement real-time data pipelines using Azure services (Data Factory, Synapse, Event Hubs, Stream Analytics, Cosmos DB).
Design and optimize scalable infrastructure for caching, integration, and API-driven workloads.
Build and manage CI/CD pipelines in Azure DevOps, with Infrastructure-as-Code delivered via Terraform or Bicep.
Develop and maintain services in Python (FastAPI) for API and microservice integration.
Implement monitoring and observability with Azure Monitor, Application Insights, and Log Analytics.
Conduct performance testing and produce technical documentation.
Collaborate with architects, developers, and operations teams across Europe.
Core Skills & Experience
Proven hands-on experience with Azure cloud services, especially Data Factory, Synapse, Event Hubs, and Stream Analytics (Cosmos DB a plus).
Strong knowledge of event-driven architectures and low-latency data processing.
Proficiency in Python, with experience developing APIs using FastAPI.
Practical experience with Azure DevOps CI/CD and Terraform or Bicep.
Knowledge of monitoring and observability in the Azure ecosystem.
Strong collaboration skills and ability to document and present technical solutions.