
Data Engineer

Global X ETFs AU • 🌐 In Person

Posted 3 days, 21 hours ago

Job Description

The Opportunity

Global X ETFs is hiring a Data Engineer (Snowflake) to join our Technology team on a 12-month full-time contract, based on-site in Sydney CBD. You'll work hands-on with our Snowflake data warehouse, building the data pipelines and models that power BI reporting and operational automation. This role offers the opportunity to take ownership of meaningful work in a small team, with support and guidance from the Technology & AI Lead. Strong performers will be considered for permanent roles at the end of the contract.

Global X ETFs Australia is a member of Mirae Asset Group and offers a growing range of cost-effective, innovation-led ETF products. For more than a decade, our mission has been to empower investors with unexplored and intelligent solutions.

What you’ll do

Design, build and maintain ELT pipelines and data models in Snowflake using SQL and Python

Develop BI-ready data layers (dimensional models/semantic layers) feeding Power BI dashboards

Build and optimise database objects in Snowflake and PostgreSQL (schemas, views, tasks/streams, performance tuning)

Integrate with external data providers via SFTP, RESTful APIs and other protocols (index providers, fund administrators, market data vendors)

Support and enhance Azure data services (Data Factory, Function Apps, Logic Apps) within the broader data platform

Support business automation initiatives using Microsoft 365 tools such as Power Automate and SharePoint

Troubleshoot data quality issues; contribute to documentation, testing and CI/CD workflows

Collaborate with stakeholders across product, operations and finance to deliver outcomes
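To give candidates a concrete feel for the modelling work above, here is a minimal Python sketch of shaping a raw data feed into the dimension/fact split a BI semantic layer expects. All table, fund, and field names are hypothetical illustrations, not details of Global X's actual platform.

```python
from datetime import date

# Raw rows as they might land from a fund administrator feed
# (fund codes, names, and field names are hypothetical).
raw_rows = [
    {"fund_code": "XGLD", "fund_name": "Gold ETF", "nav_date": "2024-06-03", "nav": "21.45"},
    {"fund_code": "XGLD", "fund_name": "Gold ETF", "nav_date": "2024-06-04", "nav": "21.60"},
    {"fund_code": "XTEC", "fund_name": "Tech ETF", "nav_date": "2024-06-03", "nav": "98.10"},
]

def build_dimensional_model(rows):
    """Split raw NAV rows into a fund dimension and a NAV fact table,
    the shape a Power BI dataset typically consumes."""
    dim_fund = {}
    fact_nav = []
    for row in rows:
        code = row["fund_code"]
        # Assign each distinct fund a surrogate key on first sight.
        if code not in dim_fund:
            dim_fund[code] = {
                "fund_key": len(dim_fund) + 1,
                "fund_code": code,
                "fund_name": row["fund_name"],
            }
        # Facts carry the surrogate key plus typed measure columns.
        fact_nav.append({
            "fund_key": dim_fund[code]["fund_key"],
            "nav_date": date.fromisoformat(row["nav_date"]),
            "nav": float(row["nav"]),
        })
    return list(dim_fund.values()), fact_nav

dim_fund, fact_nav = build_dimensional_model(raw_rows)
```

In production this split would typically live in Snowflake SQL or a dbt model rather than application code; the sketch just shows the dimension/fact shape.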

What we’re looking for

1–3 years’ experience in data engineering, analytics engineering or a related field

Strong SQL and Python skills for data pipeline development

Experience with relational databases (PostgreSQL preferred) and writing performant SQL

Familiarity with file-based ingestion patterns (SFTP, scheduled file transfers)

Proficiency with Git/GitHub and CI/CD fundamentals

Strong problem-solving, attention to data quality, and clear documentation habits

Confident communicator who collaborates well and knows when to seek guidance

Desirable

Experience with Snowflake (or BigQuery/Redshift/Databricks)

Azure data services experience (Data Factory, Function Apps, Logic Apps)

ELT tooling exposure (dbt is a plus) and dimensional modelling concepts

Power BI support experience

RESTful API integration experience

Background in financial services/fintech or other regulated environments

Interest in ETFs, financial markets or investment products

Comfort using AI-assisted development tools (e.g., GitHub Copilot, Claude Code)
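For a sense of the RESTful integration work mentioned above, here is a small hedged sketch of walking a paginated vendor API. The `{"items": ..., "next": ...}` payload shape is an assumption for illustration; real index-provider and market-data APIs each define their own pagination scheme.

```python
def fetch_all_pages(fetch_page, max_pages=100):
    """Collect records from a paginated REST-style endpoint.

    `fetch_page(page)` returns a dict like {"items": [...], "next": bool};
    this payload shape is hypothetical -- real vendor APIs vary.
    `max_pages` caps the loop so a misbehaving feed cannot run forever.
    """
    items = []
    page = 1
    while page <= max_pages:
        payload = fetch_page(page)
        items.extend(payload["items"])
        if not payload.get("next"):
            break
        page += 1
    return items

# Exercising the helper with a stubbed two-page feed instead of a live API.
pages = {
    1: {"items": ["a", "b"], "next": True},
    2: {"items": ["c"], "next": False},
}
records = fetch_all_pages(lambda page: pages[page])
```

Keeping the HTTP call behind a `fetch_page` callable makes the pagination logic testable without network access, which fits the testing and CI/CD habits the role asks for.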

Apply now

Apply via LinkedIn. Include your CV and a short summary of your Snowflake/SQL/Python experience.
