About Us:
Renaissance Info Systems is a technology and digital recruitment agency, connecting contract and permanent professionals with clients across Asia-Pacific. We differentiate ourselves through our responsiveness and through the understanding that comes from being an IT recruitment agency rooted in the IT industry. Our recruiters balance sophisticated and simple interpersonal techniques to ensure a strong candidate network.
Know More:
http://www.reninfo.com.au
Pentaho Developer/ETL Developer
Key Responsibilities
Provide ongoing support, monitoring, and optimization of Pentaho-based solutions in a production environment.
Oversee daily/weekly/monthly ETL processes, ensuring data integrity and job completion.
Work with both on-shore and off-shore teams to ensure seamless delivery of business-critical data.
Translate business needs into technical solutions and ensure timely incident resolution.
Required Skills & Experience:
Hands-on experience with Pentaho Data Integration (PDI) and Pentaho BI.
Strong SQL skills and familiarity with relational databases such as SQL Server, Oracle, and PostgreSQL.
Experience with production support, batch processing, and troubleshooting in ETL environments.
Understanding of data warehousing, ETL concepts, and data modeling.
Role: Data Engineer
Key Responsibilities
Design, build, and optimise batch and streaming data pipelines using approved frameworks and tooling.
Ensure pipelines are resilient, performant, and cost optimised.
Implement monitoring, alerting, and automated recovery processes.
Apply data quality checks and reconciliation processes to ensure accuracy and completeness.
Maintain metadata, lineage, and documentation to support governance requirements.
Adhere to role based access control, data privacy, and security standards.
Contribute to the development and maintenance of ingestion frameworks, transformation layers, and data models.
Implement and maintain CI/CD pipelines for automated testing and deployment.
Optimise data storage and retrieval strategies to meet performance needs.
Work closely with Technical Leads, Data Architects, Analysts, and business stakeholders to deliver fit for purpose solutions.
Participate in code reviews and knowledge sharing sessions to improve team capability.
Explore and adopt new tools, frameworks, and techniques to enhance engineering capability.
Identify opportunities to automate and streamline workflows.
Key Skills & Experience
Technical Skills
Proficient in SQL for data wrangling, transformation, and analysis
Strong programming skills in Python, Scala, or Java
Experience building ETL/ELT pipelines in cloud or distributed environments
Familiarity with cloud platforms and services
Knowledge of data modeling concepts
Experience with data orchestration tools
Familiar with version control and CI/CD pipelines
Understanding of data quality, validation, and observability practices
Soft Skills & Collaboration
Works independently with minimal supervision on well scoped tasks
Collaborates effectively with stakeholders
Communicates clearly and documents work for others to maintain or build upon
Open to code reviews, feedback, and continuous learning
Able to troubleshoot and resolve data pipeline failures or performance issues
Experience
Typically 2–4 years in a data engineering or closely related role
Has delivered production ready pipelines end to end
Demonstrated ability to improve code quality, reliability, or performance
Experience working with real time or near real time datasets
Skills & Experience Template
Python
Scala
Java
SQL
Databricks
AWS (Glue, Redshift, EMR, S3)
Azure (Synapse, Data Factory, Databricks)
GCP (BigQuery, Dataflow, Dataproc)
Medallion architecture
Data Vault
Kimball dimensional modelling
Orchestration & Workflow - Proficiency (B/I/A)
Apache Airflow
dbt
Azure Data Factory pipelines
Regards,
Reshu Seth
Recruitment Consultant
Renaissance Info Systems
M: +61 478 487 026
E: reshu@reninfo.com.au
W: http://www.reninfo.com.au