Job description:
1. Migration & Technical Implementation:
· Lead migration of on-premises and SharePoint-based data infrastructure to AWS Data Analytics Platform (DAP)
· Collaborate with AWS teams to develop comprehensive migration strategies and implementation plans
· Re-architect existing Python scripts and UiPath automation workflows as AWS SageMaker or Glue jobs (an illustrative Glue sketch follows this list)
· Design and implement robust data pipelines and ETL processes within the AWS environment
· Migrate Tableau dashboards to AWS QuickSight whilst maintaining full functionality
· Establish data connections from multiple sources (HRPS, Gateway, SharePoint, Microsoft Lists, FormSG, MS Access)
· Ensure seamless transition with zero disruption to existing data operations
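For context only, and not part of the role's requirements: a minimal sketch of the kind of AWS Glue job the re-architecting work above points at might look like the following. The database, table, column and S3 path names are hypothetical placeholders, not the programme's actual resources.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve the job name passed in by Glue.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a catalogued source table (names are hypothetical).
source = glue_context.create_dynamic_frame.from_catalog(
    database="hr_dap_db", table_name="hrps_extract"
)

# Transform: drop a column that is no longer needed downstream.
cleaned = source.drop_fields(["obsolete_column"])

# Load: write curated parquet output to S3 (bucket/path hypothetical).
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://dap-curated/hrps/"},
    format="parquet",
)

job.commit()
```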
2. Documentation & Knowledge Transfer:
· Create comprehensive documentation for all new AWS processes and workflows
· Develop training materials and conduct knowledge transfer sessions for end users
· Support HR officers in accessing and utilising dashboards in the new AWS environment
· Conduct thorough testing and validation of migrated systems
Required Skills:
1. AWS Services
· Extensive experience with AWS services (SageMaker, QuickSight, Athena, Lambda, data pipelines)
· Experience using AWS services from Python and the AWS Command Line Interface (see the Athena example after this list)
· Proficiency in serverless architecture design and Lambda function development
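As one illustration of driving AWS services from Python, the hedged sketch below runs an Athena query with boto3 and prints the result rows. The database name, query and output location are assumptions made up for the example.

```python
import time

import boto3

athena = boto3.client("athena")

# Start a query against a (hypothetical) Glue catalog database.
execution = athena.start_query_execution(
    QueryString="SELECT department, COUNT(*) AS headcount FROM hr_staff GROUP BY department",
    QueryExecutionContext={"Database": "hr_dap_db"},  # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://dap-athena-results/"},  # hypothetical bucket
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes (simplified; production code would handle
# failures and timeouts more carefully).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([field.get("VarCharValue") for field in row["Data"]])
```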
2. Infrastructure as Code (IaC)
· Experience with Infrastructure as Code (e.g. CloudFormation templates in YAML or JSON)
· Lambda function provisioning and management through IaC templates (an illustrative deployment sketch follows this list)
· Integration of serverless components with traditional infrastructure resources
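A minimal sketch of Lambda provisioning through an IaC template, assuming an inline CloudFormation template (YAML) deployed from Python with boto3. The stack name, role, function and inline handler are illustrative assumptions, not the programme's actual templates.

```python
import boto3

# Minimal CloudFormation template (inline YAML) provisioning a Lambda
# function and its execution role. All names here are hypothetical.
TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  LambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal: { Service: lambda.amazonaws.com }
            Action: sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
  DataProcessingFunction:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: python3.12
      Handler: index.handler
      Role: !GetAtt LambdaExecutionRole.Arn
      Code:
        ZipFile: |
          def handler(event, context):
              return {"status": "ok"}
"""

# Deploy the stack; CAPABILITY_IAM is required because the template
# creates an IAM role.
cfn = boto3.client("cloudformation")
cfn.create_stack(
    StackName="hr-dap-lambda-demo",  # hypothetical stack name
    TemplateBody=TEMPLATE,
    Capabilities=["CAPABILITY_IAM"],
)
```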
3. Programming and Automation Skills
· Proficiency in Python programming and UiPath automation
· Experience developing and deploying Lambda functions for data processing and automation (see the handler sketch after this list)
· Understanding of event-driven architecture and serverless computing patterns
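As a small illustration of the event-driven, serverless pattern referred to above, the sketch below is a Lambda handler triggered by S3 ObjectCreated events. The CSV row count is a placeholder for whatever processing a real pipeline would perform.

```python
import csv
import io

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Triggered by an S3 ObjectCreated event; counts rows in the new CSV.

    Illustrative only: real processing would be whatever the pipeline
    needs (validation, transformation, onward loading, etc.).
    """
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Read the newly uploaded object and parse it as CSV.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = list(csv.reader(io.StringIO(body.decode("utf-8"))))

        print(f"Processed s3://{bucket}/{key}: {len(rows)} rows")

    return {"status": "ok"}
```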
4. Data Visualisation and Analytics
· Strong background in Tableau and QuickSight dashboard development
· Experience with ETL processes and data pipeline design
· Knowledge of serverless data processing workflows using Lambda
5. DevOps and Collaboration
· Experience working with a DevOps stack (e.g. SHIP-HATS, GitLab, Nexus Repository)
· Understanding of CI/CD pipelines for both application code and infrastructure deployment
· Experience with infrastructure change management including security groups and network ACLs
6. Technical writing and documentation skills
7. Training delivery and user support capabilities
Job Type: Full-time
Experience:
Senior Data Engineer: 1 year (Preferred)
AWS services: 1 year (Preferred)
Python programming: 1 year (Preferred)
Tableau: 1 year (Preferred)
DevOps: 1 year (Preferred)
ETL: 1 year (Preferred)