We are looking for a Senior Data Engineer with expertise in Python and Spark to contribute to building state-of-the-art data platforms on AWS. As part of this role, you’ll be integral to designing, implementing and optimizing ETL workflows for our robust Lakehouse architecture in a hybrid and collaborative work environment.
Ideal candidates will have strong technical expertise, a proactive mindset for solving complex challenges and the ability to collaborate effectively in an agile team. For experienced Data Engineers, this position also offers the opportunity to step into a leadership role while continuing hands-on work with cutting-edge technologies.
Responsibilities
Design, develop and maintain scalable ETL workflows and data pipelines using Python and Spark on AWS
Implement data solutions leveraging AWS services such as Amazon EMR, AWS Glue, AWS Lambda, Amazon Athena, Amazon API Gateway and AWS Step Functions
Collaborate with architects, product owners and team members to break down data engineering solutions into Epics and User Stories
Lead the migration of existing data workflows to the Lakehouse architecture using Apache Iceberg capabilities
Ensure reliability, performance and scalability across complex and high-volume data pipelines
Create clear and concise documentation for solutions and development processes
Mentor junior engineers and contribute to team development through knowledge sharing, technical leadership and coaching
Communicate technical concepts effectively to both technical and business stakeholders
Requirements
Significant experience as a Senior Data Engineer designing and implementing robust data solutions
Expertise in Python, PySpark and Spark, with a solid focus on ETL workflows and data processing practices
Hands-on experience with AWS data services such as Amazon EMR, AWS Glue, AWS Lambda, Amazon Athena, Amazon API Gateway and AWS Step Functions
Demonstrable knowledge of Lakehouse architecture and related technologies (e.g., Apache Iceberg)
Proven experience in data modeling for data platforms and preparing datasets for analytics
Deep technical understanding of data engineering best practices and AWS data services
Skilled in decomposing technical solutions into Epics/Stories to streamline development in an agile environment
Strong background in code reviews, QA practices, testing automation and data validation workflows
Ability to lead and mentor team members while contributing to technical strategy and execution
A Bachelor's degree in a relevant field, or relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics)
We offer
EPAM Employee Stock Purchase Plan (ESPP)
Protection benefits including life assurance, income protection and critical illness cover
Private medical insurance and dental care
Employee Assistance Program
Competitive group pension plan
Cyclescheme, Techscheme and season ticket loans
Various perks such as free Wednesday lunch in-office, on-site massages and regular social events
Learning and development opportunities including in-house training and coaching, professional certifications, over 22,000 courses on LinkedIn Learning Solutions and much more
If otherwise eligible, participation in the discretionary annual bonus program
If otherwise eligible and hired into a qualifying level, participation in the discretionary Long-Term Incentive (LTI) Program
All benefits and perks are subject to certain eligibility requirements