
Data Engineer

GoGoX • In Person

Posted 6 days, 11 hours ago

Job Description

We are seeking a highly motivated and experienced Data Engineer to join our technology team in Hong Kong. In this critical role, you will be responsible for building, optimizing, and maintaining our enterprise-level data infrastructure, spanning both streaming and batch processing. You will leverage Python to solve complex engineering challenges, working closely with Data Scientists and Product Analysts to ensure data accessibility and reliability for our product applications and business intelligence needs. The ideal candidate possesses strong proficiency in Python, expert knowledge of SQL and data orchestration tools, and a passion for operational excellence using DataOps and MLOps principles.

What You'll Deliver

Build, maintain, and optimize robust ETL jobs and high-throughput pipelines for both streaming and batch data processing.

Help solve complex engineering problems encountered within the data science and analytics domain.

Develop and productionize data-driven product applications and API services in close collaboration with data scientists and analysts.

Design and manage data schemas and structures across various database systems to ensure performance and integrity.

Utilize and maintain modern orchestration tools (such as Airflow) to manage complex data workflows (see the sketch after this list).

Implement containerization and orchestration (Kubernetes) for deploying and scaling data services reliably.

Promote and implement DataOps and foundational MLOps practices for automated deployment and monitoring of data services.

Ensure data quality, governance, and security across all managed pipelines and platforms.

Participate in code reviews and advocate for engineering best practices.

Troubleshoot and resolve performance issues within the data platform and pipelines.
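
For illustration only, here is a minimal sketch of the kind of Airflow workflow this role involves. The DAG name, schedule, and task bodies are hypothetical placeholders, not GoGoX's actual pipelines:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Pull new records from a source system (placeholder logic).
        pass

    def transform(**context):
        # Clean and aggregate the extracted records (placeholder logic).
        pass

    def load(**context):
        # Write results to the analytics warehouse (placeholder logic).
        pass

    with DAG(
        dag_id="daily_orders_etl",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> transform_task >> load_task

The chained >> operators express task dependencies, so Airflow runs extract, transform, and load in order and retries or backfills them independently.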

Who You Are

Proven professional experience as a Data Engineer or in a similar role focused on data infrastructure.

Expert-level proficiency in SQL and deep experience working with various databases (relational and NoSQL).

Strong programming background, with particular proficiency in Python.

Hands-on experience with workflow orchestration tools, specifically Apache Airflow.

Familiarity with containerization and orchestration technologies such as Kubernetes (see the sketch after this list).

Experience in designing, implementing, and optimizing large-scale ETL pipelines (batch and streaming).

Excellent communication skills in English and Cantonese (written and verbal), with the ability to collaborate effectively in a team environment in Hong Kong.

Strong problem-solving, debugging, and analytical skills.

Ability to write clean, maintainable, and well-documented code.
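
As a rough illustration of the Kubernetes familiarity expected, below is a minimal sketch using the official kubernetes Python client to create a Deployment for a data service. The service name, image, namespace, and resource figures are hypothetical:

    from kubernetes import client, config

    def deploy_data_service():
        # Load credentials from the local kubeconfig (use
        # config.load_incluster_config() when running inside a cluster).
        config.load_kube_config()
        container = client.V1Container(
            name="etl-worker",  # hypothetical service name
            image="registry.example.com/etl-worker:1.0",  # hypothetical image
            resources=client.V1ResourceRequirements(
                requests={"cpu": "500m", "memory": "512Mi"},
            ),
        )
        spec = client.V1DeploymentSpec(
            replicas=2,  # scale out by raising the replica count
            selector=client.V1LabelSelector(match_labels={"app": "etl-worker"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "etl-worker"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        )
        deployment = client.V1Deployment(
            metadata=client.V1ObjectMeta(name="etl-worker"),
            spec=spec,
        )
        client.AppsV1Api().create_namespaced_deployment(namespace="data", body=deployment)

In practice such manifests are usually version-controlled as YAML and applied by CI/CD, but the client API above shows the same Deployment structure programmatically.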

What We Offer

A diverse, multi-cultural team that values collaboration and innovation.

Opportunities for continuous learning and professional development in modern data engineering tools.

The chance to build the data foundation that powers impactful business decisions.

The opportunity to contribute to the growth of a leading technology-driven logistics platform in Asia.

Work on an exciting and high-impact product used by thousands of users.

Opportunity to work with cutting-edge technologies and shape data infrastructure.

Dynamic and international work environment.
