
Data Engineer – Connected Health Platform

OrgShakers

Job Description

Location:

Liège, Belgium (Hybrid)

Compensation:

€65,000 – €75,000

Department:

IT

About the Opportunity

One of our clients, a fast-growing MedTech innovator in the connected-health space, is seeking a Data Engineer to drive the development of scalable, compliant, and insight-driven data infrastructure supporting both commercial and clinical operations.

This is a hands-on role responsible for designing data pipelines, building robust data models, and enabling self-service analytics that accelerate research, product development, and decision-making across multiple business units.

Key Responsibilities

ETL & Data Ingestion

Design and monitor pipelines ingesting data from multiple internal and third-party systems (e.g., CRM, ERP, e-commerce, telephony, and flat-file sources).

Orchestrate workflows using AWS services such as Glue, Step Functions, and Airflow (a minimal sketch follows this list).

Ensure data accuracy, schema evolution, and full auditability.
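
To give a flavour of the orchestration work described above, here is a minimal Airflow DAG sketch. The DAG name, source system, and callables are hypothetical placeholders, not details of the client's actual stack.

```python
# Minimal Airflow DAG sketch: a daily ingest-then-validate pipeline.
# The source names and callables are illustrative only; a real pipeline
# would pull from CRM/ERP/e-commerce systems and land data in a warehouse.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_crm(**context):
    """Placeholder: pull the day's records from a CRM API."""
    print("extracting CRM data for", context["ds"])


def validate_schema(**context):
    """Placeholder: fail loudly if the incoming schema has drifted."""
    print("validating schema and row counts")


with DAG(
    dag_id="daily_crm_ingest",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_crm", python_callable=extract_crm)
    validate = PythonOperator(task_id="validate_schema", python_callable=validate_schema)

    extract >> validate  # ingest first, then audit the result
```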

Data Modeling & Normalization

Build dimensional models and maintain a unified data layer across patient, order, and device domains.

Implement privacy-by-design practices, including PII redaction and GDPR compliance (see the sketch after this list).

Version and document all models to support transparency and reproducibility.
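
As one concrete illustration of the privacy-by-design point above, here is a small sketch of deterministic pseudonymization applied before records enter a shared model layer. The column names and salt handling are assumptions for illustration, not the client's actual schema.

```python
# Sketch: deterministic pseudonymization of PII columns before rows
# enter the unified data layer. Column names and salt source are
# hypothetical; a production system would manage the salt in a secret store.
import hashlib
import os


def pseudonymize(value: str, salt: bytes) -> str:
    """Replace a PII value with a stable, non-reversible token."""
    return hashlib.sha256(salt + value.encode("utf-8")).hexdigest()


PII_COLUMNS = {"patient_name", "email", "phone"}  # assumed PII fields


def redact_record(record: dict, salt: bytes) -> dict:
    """Return a copy of the record with PII columns pseudonymized."""
    return {
        key: pseudonymize(str(val), salt) if key in PII_COLUMNS else val
        for key, val in record.items()
    }


salt = os.environ["PII_SALT"].encode()  # never hard-code the salt
clean = redact_record({"patient_name": "Jane Doe", "order_id": 42}, salt)
```

Because the hash is deterministic, the same patient maps to the same token across tables, so joins in the unified data layer still work without exposing raw identifiers.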

Reporting & Analytics Enablement

Develop and maintain dashboards for business and clinical teams.

Enable self-service analytics through reusable components and scheduled reporting.

Maintain a central data catalog and train stakeholders on data usage and governance.

AI/ML & Research Support

Prepare and package curated datasets for R&D initiatives.

Prototype and productionize machine-learning features, ensuring regulatory compliance.

Collaborate with external partners under secure data-sharing protocols.

Business Unit Collaboration

Serve as the primary data liaison for Marketing, Sales, Finance, and Regulatory.

Respond to ad-hoc data requests in a timely and controlled manner.

Proactively surface insights and recommend data-driven improvements.

Qualifications/Requirements

3+ years building production ETL pipelines (dbt, Airflow, Glue, Spark); strong SQL and Python skills

Dimensional modeling, Tableau or equivalent, and data governance best practices

Experience with pipeline monitoring tools and automated testing frameworks

REST/GraphQL integration and secure authentication (OAuth 2.0); see the sketch after this list

Git-based CI/CD workflows and infrastructure-as-code mindset

Excellent English communication, documentation discipline, and stakeholder management in regulated settings
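
For the REST/OAuth requirement above, here is a minimal sketch of an OAuth 2.0 client-credentials flow against a REST API. The token URL, API URL, and environment variable names are hypothetical placeholders.

```python
# Sketch: OAuth 2.0 client-credentials flow against a protected REST API.
# Endpoints and credential names are assumptions for illustration.
import os

import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"  # assumed endpoint
API_URL = "https://api.example.com/v1/orders"        # assumed endpoint


def get_access_token() -> str:
    """Exchange client credentials for a short-lived bearer token."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": os.environ["CLIENT_ID"],
            "client_secret": os.environ["CLIENT_SECRET"],
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_orders() -> list:
    """Call the protected endpoint with the bearer token."""
    token = get_access_token()
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```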

Nice to Have:

AWS SageMaker, Feature Store, or HealthLake; Amplitude or similar analytics schema; Salesforce SOQL; R for statistical analysis; ISO 13485 or SOC 2 controls.

What’s Offered

20 days of annual leave plus 10 statutory holidays

Meal vouchers

Hospitalisation insurance plan

One day of seniority leave for each completed five-year period of service

13th-month bonus

Telecommunication allowance

Remote work/office expense allowance
