Within KPMG Singapore, the Information Technology Services (ITS) team is responsible for providing quality IT services and solutions internally to support the business and improve efficiency. You will be part of the Data Office, responsible for defining and implementing the Singapore firm’s Data Strategy. The team works with a vision to empower people at KPMG to harness data across the firm to generate actionable insights and drive value.
We’re looking for a resourceful and trustworthy team player who is passionate about data and technology and has the drive to diagnose and solve problems within complex systems. As a Data Analyst, your responsibilities will include, but are not limited to:
Responsibilities:
Actively support the implementation and enhancement of the enterprise data platform, enabling the business to leverage existing and third-party data sets in the delivery of services, with a focus on data quality and efficiency.
Understand KPMG’s overall data estate, IT and business priorities, and success measures in order to design and deliver data architecture capabilities.
Work collaboratively with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand requirements and deliver solutions.
Design conceptual, logical and physical architectures and data models for enterprise data and analytics solutions using recognized data modelling approaches (e.g. 3NF, dimensional).
Design, develop and maintain data integration and data transformation solutions using Azure Synapse Analytics / Data Factory and Azure Databricks.
Implement and maintain scalable and efficient data pipelines to support data ingestion, processing, and analysis.
Develop ETL/ELT processes to extract, transform, and load data from diverse sources into centralized repositories using medallion architecture.
Deploy and manage data solutions across multiple environments (Dev, QA, STG, Prod) and oversee the entire deployment lifecycle.
Implement best practices in data engineering, including data governance, security, and compliance.
Monitor and troubleshoot data pipelines, addressing and resolving issues as required to ensure smooth production operations. Oversee batch data processing jobs to ensure timely and accurate data delivery.
Identify and resolve performance bottlenecks in pipelines and processes. Ensure platform SLAs are met through proactive monitoring and optimization.
Ensure all platform components, including infrastructure and services, are updated and functioning as expected. Coordinate with infrastructure support teams to schedule platform updates with minimal disruption.
Required Skills:
Degree in Computer Science, IT, Information Management, or a similar field.
Minimum of 5 years of experience in a similar role, with a proven track record of building scalable and performant data platforms.
Proven experience working on the Azure Data Platform, with hands-on expertise in Azure Synapse Analytics and Azure Databricks.
Hands-on experience in designing and managing data warehouse solutions using Synapse (dedicated/serverless SQL pools). Proficient in SQL and Python.
Proven experience in creating and orchestrating data workflows, building complex ETL/ELT pipelines, and automating data movement across multiple environments.
Demonstrated ability to troubleshoot complex issues in data workflows and implement effective solutions.
Working knowledge of DataOps deployment processes, including configuration, automation, and CI/CD workflows. Familiarity with CI/CD tools (e.g., Azure DevOps) and version control systems (e.g., Git).
Familiarity with managing metadata, data lineage, and data cataloguing using tools such as Azure Purview.
Azure certifications, such as Azure Data Engineer Associate, are a plus.