About the Client
Our client is a leading global technology and consulting organization, recognized for delivering innovative solutions across diverse industries. They are committed to driving digital transformation and enabling businesses to harness the power of cloud and data technologies. With a strong focus on collaboration and excellence, they provide an environment where cutting-edge projects meet professional growth.
About the Role
We are seeking an experienced GCP Big Data Engineer to join a dynamic team working on large-scale data platforms. This is a 6-month contract based in Sydney, offering an exciting opportunity to design and optimize data solutions on Google Cloud Platform. You will play a key role in building robust, secure, and high-performing data pipelines that support critical business insights and analytics.
The Successful Candidate
To excel in this role, you will bring:
7+ years of experience in Data Engineering, with 4+ years on GCP in production environments.
Expertise in BigQuery, Dataform, and Pub/Sub, including schema design, query optimization, and cost/performance tuning.
Strong skills in Python and orchestration tools such as Airflow/Cloud Composer for building resilient ETL/ELT workflows.
Hands-on experience with data security and governance on GCP, including IAM, KMS, policy tags, and PII controls.
Excellent problem-solving abilities, technical leadership, and a collaborative mindset.
Nice to have:
Familiarity with Cloud Run, APIs, CI/CD for data workflows, Terraform, and telecom-centric data models.
What’s on Offer?
Contract Duration: 6 months
Location: Sydney
Rate: $680 AUD per day
Opportunity to work on cutting-edge GCP Big Data projects with a global leader in technology services.
A collaborative environment that values innovation, diversity, and professional development.