
Senior Data Engineer (Up to $4,000)

BLUE BELT TECHNOLOGY Co., Ltd. • 🌐 In Person

Posted 8 hours, 31 minutes ago

Job Description

Introduction

With over a decade of experience in IT and fintech, Blue Belt has become a leading software development company, delivering innovative technology solutions to a diverse global clientele. We specialize in developing web, mobile, payment, and blockchain applications that offer seamless user experiences. Headquartered in Tokyo, Japan, with a state-of-the-art Technology Hub in Hanoi, Vietnam, Blue Belt operates in more than ten countries, including Japan, Thailand, Indonesia, the Philippines, Malaysia, Taiwan, and Brazil. Our team of over 200 professionals brings a wealth of expertise to drive our global operations.

We are seeking a Senior Data Engineer to take ownership of building and scaling our data infrastructure. This role is ideal for someone who is deeply experienced with the modern data stack and is hands-on with designing, implementing, and maintaining robust ETL/ELT pipelines. You will play a pivotal role in shaping the company’s data strategy, enabling data-driven decision-making across the organization, and mentoring junior team members.

Responsibilities

Lead the design, implementation, and optimization of scalable ETL/ELT pipelines across various data sources.

Architect and manage data platforms and warehouse solutions (e.g., BigQuery, Snowflake, Redshift).

Champion best practices in data modeling, pipeline orchestration, and transformation workflows (e.g., with dbt, Airflow).

Collaborate closely with Product, Engineering, and Analytics teams to align data architecture with business goals.

Ensure data quality, lineage, governance, and compliance across all systems.

Drive the adoption of modern data tools and processes, including streaming data (Kafka, Spark), real-time analytics, and observability.

Troubleshoot complex data issues and ensure high reliability and uptime of production pipelines.

Mentor and provide technical guidance to other engineers and data practitioners.

Key requirements for this position include:

5+ years of experience in data engineering, preferably in fast-paced environments.

Bachelor's degree in Computer Science, Information Technology, or another technical field, preferably from a top university specializing in Information Technology.

Proficient in Python, SQL, and data pipeline orchestration tools like Airflow, Prefect, or Dagster.

Hands-on experience with cloud data warehouses (e.g., BigQuery, Snowflake) and data transformation tools (e.g., dbt).

Strong understanding of data warehousing concepts, dimensional modeling, and data architecture patterns.

Comfortable working with large volumes of structured and semi-structured data (JSON, Parquet, etc.).

Familiarity with stream processing technologies (Kafka, Spark Streaming).

Knowledge of CI/CD, version control, and infrastructure-as-code for data workflows.

Excellent communication skills and ability to collaborate cross-functionally.
