
Data Engineer (AI Platforms)

OpenVPN • 🌐 Remote

Posted 5 days, 11 hours ago

Job Description

Are you passionate about building the data foundations for a new generation of AI? We're looking for a skilled Data Engineer to be a major contributor to our company's intelligent future. You won't just be maintaining systems; you'll be at the heart of building, scaling, and deploying the data and AI platforms that will redefine how we deliver data solutions.

This is an opportunity to make a significant impact by transforming our data landscape and enabling cutting-edge AI and agentic workflows.

What You'll Do

Design, build, and optimize robust, scalable data pipelines, leading the migration from legacy systems to our modern, AI-centric platform.

Evolve our data models and schemas to better support complex analytics, AI training, and fine-tuning workloads.

Collaborate with AI/ML teams to productionize models, streamline training data delivery, and support the development of sophisticated agentic systems.

Empower the organization by partnering with BI developers and analysts to design highly efficient queries and unlock new insights.

Champion data governance and compliance, ensuring our data handling practices remain secure and trustworthy as we innovate.

Challenges You'll Help Us Tackle

Modernize Our Data Backbone: Lead the charge in migrating our historical data flows to cutting-edge, AI-driven workflows.

Shape the Future of Our AI: Redesign our datasets and schemas so they are well suited to training and fine-tuning next-generation models.

Build the Brains of the Operation: Play an important role in building the infrastructure that supports powerful, data-driven agentic systems.

Scale with Intelligence: Help us build a data ecosystem that is not only powerful today but is ready for the demands of tomorrow's AI.

We're a small, close-knit team, and we care deeply about our people:

Competitive pay rates

Fully remote work environment

Self-managed time off

What You'll Bring

Proven experience (4+ years) in a data engineering role, with a track record of building and managing complex data systems.

Deep expertise in SQL and query optimization.

Hands-on experience with cloud data warehouses and databases, specifically Google BigQuery and Cloud SQL (PostgreSQL).

Programming experience with Python or Java.

A proactive, self-motivated, and self-managed mindset, perfect for a fully remote environment with a high degree of autonomy.

Excellent communication and documentation skills; you can clearly articulate complex technical concepts to diverse audiences.

The ability to work a flexible schedule and the readiness to respond to occasional off-hours emergencies.

Bonus Points For

AI/ML Tooling: Experience with Google's Vertex AI platform.

Programming Languages: Proficiency in Go.

DE Tools: Familiarity with dbt and Airflow.

Streaming Data: Familiarity with event-streaming platforms like Apache Kafka.

Streaming Analytics: Experience with real-time streaming analytics.

DevOps & Infrastructure: Experience with containerization (Docker) and serverless compute (Google Cloud Run).

Legacy Systems: Experience with Perl or PHP is a plus.
