Role: AWS Data Engineer
Location: Pune
Duration: Full-time opportunity
Mandatory Skills: Very strong in Python, PySpark, Spark, and AWS (Glue, Lambda, S3)
Must Have:
BS/BA degree or equivalent experience
General: Strong organizational, problem-solving, and critical thinking skills; Strong documentation skills
Coding: Proficiency in Python
Cluster Computing frameworks: Proficiency in Spark and Spark SQL
AWS Data Services: Proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Data Catalog, Athena, Kinesis or MSK, and Airflow or Lambda + Step Functions + EventBridge
AWS Data Security: Good understanding of security concepts such as Lake Formation permissions, IAM, service roles, encryption, KMS, and Secrets Manager
Good to Have:
Linux shell scripting, Jenkins
DevOps: Git, CI/CD, JIRA, TDD
Job Type: Full-time
Work Location: In person