
Senior Data Platform Engineer

PRIMUS Global Solutions (PRIMUS UK & Europe) • 🌐 In Person

Posted 8 hours, 31 minutes ago

Job Description

Job Title: Senior Data Platform Engineer

Location: Zurich, Switzerland

Type: Full-Time | Permanent

Overview:

We are looking for a Senior Data Platform Engineer to join our growing team focused on delivering resilient, secure, and scalable data platform solutions. If you're passionate about DevSecOps, big data architecture, and automated data pipelines, and want to drive end-to-end platform reliability, this role is for you.

Key Responsibilities:

Operate and maintain core Global Data Platform components, including Kubernetes, Kafka, virtual servers, and big data applications such as the Apache stack, Collibra, and Dataiku.

Drive the automation of infrastructure, security components, and CI/CD pipelines to streamline data pipeline (ETL/ELT) execution.

Build resiliency into data pipelines using robust platform health checks, monitoring systems, and alert mechanisms (a minimal sketch follows this list).

Enhance data delivery in terms of quality, accuracy, recency, and timeliness.

Collaborate cross-functionally with security, cloud operations, and engineering teams to ensure architecture alignment.

Continuously improve platform performance by analyzing system alerts, issues, and incidents.

Stay updated with industry trends and emerging technologies in the data platform and DevSecOps ecosystem.
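
For a flavour of the resiliency work described above, here is a minimal sketch of a platform health-check loop in Python, using only the standard library. The endpoint URLs, check interval, and the logging-based alert hook are hypothetical placeholders, not the actual platform's configuration; a real deployment would plug into the platform's own monitoring and paging stack.

```python
# Minimal health-check loop: poll service endpoints and raise an alert on failure.
# Endpoints, interval, and the alerting hook are illustrative placeholders.
import logging
import time
import urllib.error
import urllib.request

# Hypothetical health endpoints for core platform components.
HEALTH_ENDPOINTS = {
    "kafka-proxy": "http://kafka-proxy.internal:8080/health",
    "dataiku": "http://dataiku.internal:11200/health",
}
CHECK_INTERVAL_SECONDS = 60

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")


def check_endpoint(name: str, url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status == 200
    except (urllib.error.URLError, TimeoutError) as exc:
        logging.warning("health check failed for %s (%s): %s", name, url, exc)
        return False


def send_alert(name: str) -> None:
    """Stand-in for a real alert channel (pager, chat webhook, e-mail, ...)."""
    logging.error("ALERT: component %s is unhealthy", name)


def main() -> None:
    while True:
        for name, url in HEALTH_ENDPOINTS.items():
            if not check_endpoint(name, url):
                send_alert(name)
        time.sleep(CHECK_INTERVAL_SECONDS)


if __name__ == "__main__":
    main()
```

Keeping the alert hook separate from the check logic means the notification channel can change without touching the health checks themselves.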

Required Experience:

5+ years of experience designing and operating large-scale distributed systems.

Proficiency in integrating both streaming and batch data pipelines (e.g., Kafka); a short sketch follows this list.

Hands-on experience with DevOps and data pipeline automation using tools such as Jenkins, Octopus, Ansible, or Chef.

Deep understanding of on-prem big data architecture; cloud migration experience is a plus.

Working experience with data science platforms such as Dataiku.

Familiarity with Agile methodologies (Scrum, SAFe).

Experience supporting analytics workflows from enterprise reporting (e.g., Tableau) to machine learning and MLOps.
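
As an illustration of integrating batch and streaming paths around Kafka, the sketch below uses the kafka-python client to publish rows from a hypothetical CSV extract onto a topic and then consume them as a stream. The broker address, topic name, consumer group, and file path are assumptions made for the example only.

```python
# Batch-to-stream bridge sketch using the kafka-python client.
# Broker address, topic name, group id, and input file are hypothetical placeholders.
import csv
import json

from kafka import KafkaConsumer, KafkaProducer

BOOTSTRAP = "kafka.internal:9092"   # assumed broker address
TOPIC = "orders.raw"                # assumed topic name


def publish_batch(csv_path: str) -> None:
    """Batch step: read a CSV extract and publish each row as a JSON message."""
    producer = KafkaProducer(
        bootstrap_servers=BOOTSTRAP,
        value_serializer=lambda record: json.dumps(record).encode("utf-8"),
    )
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            producer.send(TOPIC, row)
    producer.flush()


def consume_stream() -> None:
    """Streaming step: process messages continuously as they arrive."""
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BOOTSTRAP,
        group_id="orders-etl",
        auto_offset_reset="earliest",
        value_deserializer=lambda payload: json.loads(payload.decode("utf-8")),
    )
    for message in consumer:
        # Replace with the real transformation / load logic.
        print(message.value)


if __name__ == "__main__":
    publish_batch("orders_extract.csv")  # hypothetical extract file
    consume_stream()
```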

Key Skills:

Strong hands-on knowledge of data platforms, including data lakes, data meshes, delta lakes, and data lakehouses.

Expertise in distributed technologies like Kafka, Kubernetes, Spark, and S3/Parquet (see the sketch after this list).

Advanced programming skills in Python, Java, or other languages such as Scala or R.

Experience in Linux/Unix scripting, firewall configuration, and infrastructure templates (Jinja, Puppet).

Skilled in managing Kubernetes pods, scaling VMs, and deploying Docker images via CI/CD pipelines.

Knowledge of Harbor, image registries, and platform security controls.

Understanding of the financial services sector is an advantage.

Higher education in IT or business informatics (e.g., Fachhochschule, Wirtschaftsinformatik).

Excellent command of English; German is a plus.
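
To illustrate the Spark and S3/Parquet side of the stack, here is a small PySpark sketch that reads a Parquet dataset from S3 and computes a per-day event count. The bucket path and column names are hypothetical, and the s3a connector plus credentials are assumed to be configured on the cluster.

```python
# PySpark sketch: read Parquet from S3 and compute a simple aggregate.
# Bucket path and column names are hypothetical; s3a connector config is assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("parquet-s3-example")
    .getOrCreate()
)

# Hypothetical dataset location on S3 (via the Hadoop s3a connector).
events = spark.read.parquet("s3a://example-data-platform/events/2024/")

# Events per day, as a simple data-quality / recency style check.
daily_counts = (
    events
    .groupBy(F.to_date("event_time").alias("event_date"))
    .count()
    .orderBy("event_date")
)

daily_counts.show(truncate=False)
spark.stop()
```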

#DataPlatform #DevSecOps #Kubernetes #Kafka #BigData #DataEngineering #CloudMigration

Apply now to join our mission to build next-generation data platforms that power actionable insights at scale.
