
Data & AI Engineer

House of HR • 🌐 In Person • Posted 5 days, 5 hours ago

Job Description

ABOUT OUR TEAM

The Data & AI team plays a central, scaling role, supporting the entire House of HR portfolio. We are expanding and looking for an extra engineer to strengthen our capabilities in two core areas:

🧠 BUILDING, DEPLOYING, AND OPERATIONALIZING AI PRODUCTS

We partner directly with our PowerHouses' AI champions and field staff to understand their needs and determine how AI systems can help them go harder, better, faster, and stronger.

Design: We architect end-to-end AI systems that deliver features with clear business value, covering all necessary technical components.

Develop: We build central Python REST APIs, on top of FastAPI, Pydantic, and LLM providers (mainly OpenAI), to deliver tailored, AI-powered applications consumable by various end products (websites, bots, ATS systems). See the illustrative sketch below.

Deploy: We leverage containerization, serverless runtimes, and scalable Azure PaaS components to roll out our products at scale. Everything is deployed through DevOps, CI/CD, and Infrastructure-as-Code.

Own: We ensure stable, robust, and mature products through best practices in testing, linting, monitoring, and security. We align closely with the business to constantly assess the impact of our work.
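
To give a concrete feel for the "Develop" work, here is a minimal, illustrative sketch of such an API: a FastAPI endpoint that validates input with Pydantic and forwards it to an LLM provider. The route, model names, and prompt are invented for this example and are not our actual code.

```python
# Minimal illustration of a FastAPI + Pydantic + OpenAI endpoint.
# Route, models, and prompt are hypothetical examples, not House of HR code.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment


class CandidateNote(BaseModel):
    """Free-text note coming from an end product (website, bot, ATS)."""
    text: str


class Summary(BaseModel):
    """Structured response returned to the consuming application."""
    summary: str


@app.post("/summarize", response_model=Summary)
def summarize(note: CandidateNote) -> Summary:
    # Forward the validated input to the LLM provider and wrap the answer.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name, for illustration only
        messages=[
            {"role": "system", "content": "Summarize the note in one sentence."},
            {"role": "user", "content": note.text},
        ],
    )
    return Summary(summary=response.choices[0].message.content)
```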

📊 ... AND DATA PLATFORMS

We are essential to the group's data infrastructure, managing both central and decentralized initiatives:

Event-driven data processing: Many of our AI and Data products rely on event-driven architectures, leveraging async processing, jobs, topics, and queues.

Central Lakehouses: We spin up and maintain central Databricks-based data platforms (lakehouses), using PySpark and SQL to ingest, transform, and expose valuable data products for various PowerHouse departments.

Decentralized Enablement: We help PowerHouses increase their data maturity. We design and build the bridge from their operational systems to their reporting needs, setting up cloud infrastructure, Fabric capacity, Fabric lakehouses, and processing components. This ensures a scalable, lightweight medallion architecture that transforms source data into a usable Golden Layer, ready for reporting or integration (see the sketch below).
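
To illustrate the kind of lakehouse processing described above, here is a small, hypothetical bronze-to-silver step in PySpark, the sort of transform that feeds a curated Golden Layer. Table names and columns are invented for the example and are not actual PowerHouse data.

```python
# Illustrative bronze -> silver medallion step on a Databricks/Spark lakehouse.
# Table and column names are invented for this sketch.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Bronze: raw placements as ingested from an operational system.
bronze = spark.read.table("bronze.placements_raw")

# Silver: deduplicated, typed, and validated records.
silver = (
    bronze
    .dropDuplicates(["placement_id"])
    .withColumn("start_date", F.to_date("start_date"))
    .filter(F.col("placement_id").isNotNull())
)

# Persist as a Delta table so downstream (Golden Layer) models can build on it.
(
    silver.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.placements")
)
```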

YOUR PROFILE: WHAT MAKES YOU A GREAT FIT?

We are seeking a Data / AI professional with a few years of relevant experience working with Python in a Data and/or AI setting.

To be a great fit for our gang, you should possess the following core competencies:

A curious approach and a lifelong learning mindset.

An analytical spirit with a good gut feeling for performant data and software systems.

A good communicator who can clearly explain what you've built and why.

A pragmatic individual who knows when 'good enough' code is sufficient to deliver value quickly, and when deeper optimization is necessary.

TECHNICAL EXPERTISE (MUST-HAVE & BONUS STARS)

Working experience with Python is a must-have.

Experience with or knowledge of the following technologies will earn you an extra star:

Cloud & API: Microsoft Azure, FastAPI

Data processing: SQL and/or Spark, Databricks and/or Microsoft Fabric

Deployment: Docker and containerization

WHAT WE OFFER

You won't be doing this alone! You will join a supportive Data & AI team where you can learn best practices, have the space and freedom to grow, take ownership, and receive coaching on both technical and non-technical aspects.

We offer an attractive salary package, complemented by various fringe benefits.
