Job Details:
Primary address of work: Auckland
Number of vacancies: 1
Employment type: Permanent full time
Minimum hours per week: 30 hours
Maximum hours per week: 40 hours
Minimum hourly rate (low salary range): $50/hour
Maximum hourly rate (high salary range): $55/hour
Pay frequency: Monthly
Company Overview
Auslink Group is a New Zealand-based logistics and supply chain service provider headquartered in Auckland. Auslink is committed to delivering reliable, efficient, and customer-focused logistics solutions that support businesses across a wide range of industries.
Key Responsibilities
1. Data Pipeline Development
Design and maintain automated data pipelines to collect and move data from multiple sources (CRM, freight system, warehousing, etc.) to centralized repositories.
Ensure efficient data flow between operational systems and analytics platforms.
2. Data Modeling and Architecture
Develop logical and physical data models to support logistics operations and analytical applications.
Design and maintain scalable data architecture, including data warehouses and data marts, to enable business intelligence and reporting.
3. Data Storage and Management
Manage cloud and on-premise databases, ensuring efficient, secure, and reliable storage of structured and unstructured data.
4. Data Integration
Integrate data across logistics functions (CRM, communication portals, warehousing, customs clearance, delivery, and mapping) into a unified system.
Develop APIs and data connectors to support seamless system interoperability.
5. Data Quality and Governance
Implement data quality frameworks to ensure completeness, consistency, and accuracy of business-critical data.
Establish and enforce data governance policies aligned with privacy, regulatory, and operational standards.
6. Collaboration and Stakeholder Engagement
Work closely with business analysts, data scientists, software engineers, and logistics teams to understand data requirements and deliver actionable solutions.
Provide expert input on data architecture in cross-functional ICT and business transformation projects.
7. Performance Optimization
Monitor and tune data systems to ensure optimal performance under varying loads.
Anticipate scalability needs and proactively adapt infrastructure to support business growth.
8. Troubleshooting and Support
Identify, diagnose, and resolve issues in data pipelines, ETL processes, and production databases.
Respond to incidents and perform root cause analysis to prevent recurrence.
Job requirements: applicants must meet the following requirements to apply for this job.
Education and Experience
Bachelor’s degree in Computer Science, Information Technology, or a related field; or
Minimum 3 years’ experience in a data engineering, database specialist, or system architecture role, preferably in the logistics, warehousing, or e-commerce sectors.
Technical Skills
Proficiency in database management systems such as MySQL, PostgreSQL, SQL Server, or Oracle.
Strong experience in ETL frameworks, data modeling, and warehousing technologies.
Knowledge of cloud data platforms (e.g., AWS RDS, Redshift, Azure SQL) and data orchestration tools.
Familiarity with Python, SQL, or similar scripting languages for data handling.
Understanding of data governance, privacy compliance, and security protocols.
To submit your application, click Apply Now!