Requirements:
Degree in Data Science, Statistics, Computer Science or a related field.
1 - 2 years of IT experience in data migration or data pipeline projects.
Exposure to containerization/orchestration (Docker, Kubernetes).
Experience with statistical analysis, Linux systems, PySpark, Git, and SQL.
Knowledge of ETL tools such as Apache Airflow and dbt.
Familiarity with a major cloud platform such as Azure (DevOps, Databricks, or Data Factory) is a plus.
Responsibilities:
Perform data engineering and analysis in the Sales, Manufacturing, and Logistics fields.
Work with stakeholders to identify potential needs and translate requirements into data-driven solutions.
Design, build, and optimize scalable pipelines for ingesting, transforming, and integrating large-volume datasets.
Salary: 30 - 35K
(Tsuen Wan, Monday – Friday, 9:00am – 5:30pm, Medical, 13 months salary, 12 - 21 days annual leave)
Job Type: Full-time, Permanent
Salary: $30,000.00 to $35,000.00 (per month)
Benefits:
On-the-job professional training
Paid annual leave
Paid sick leave
Promotion opportunities
Maternity leave
Medical insurance
Work Location: Hybrid remote in Tsuen Wan, N.T.