Lead Data Software Engineer

EPAM Systems, Inc. • 🌐 Remote

Posted 1 day, 17 hours ago

Job Description

We are looking for a seasoned Lead Data Software Engineer to join our team remotely and support the advancement of our Data Science projects.

In this position, you will take charge of designing and managing data marts while delivering efficient, scalable data solutions tailored to client requirements. Your responsibilities will include building and maintaining data pipelines, providing operational support, and collaborating with Data Science teams to ensure exceptional outcomes. Additionally, you will lead a team of talented professionals, offering guidance and mentorship to foster innovation and drive impactful results.

Responsibilities

Provide leadership and mentorship to the Data Software Engineering team, encouraging growth and continuous learning

Collaborate with diverse teams to deliver data solutions that align with project goals and timelines

Develop and improve Data Software Engineering processes focused on automation, efficiency, and innovation

Design and implement scalable data solutions using AWS infrastructure

Enhance workflows and data processing through the use of tools like Apache Airflow and Apache Spark

Stay updated on industry advancements and integrate best practices into engineering strategies

Provide operational support for data pipelines and data marts to ensure smooth performance

Oversee the development and deployment of REST APIs to facilitate effective data integration and communication

Work directly with clients to gather requirements and deliver customized solutions

Manage team structure and processes to ensure timely and effective project delivery

Collaborate with stakeholders, demonstrating excellent leadership and communication skills

Requirements

A minimum of 5 years of experience as a Data Software Engineer, with a strong focus on large-scale and complex data infrastructures

At least 1 year of experience in leading and managing a team of engineers

Comprehensive knowledge of AWS services, including the ability to design and implement scalable data solutions

Advanced experience with Apache Airflow and Apache Spark for optimizing workflows and data processing

Proficiency with CI/CD tools like Jenkins for streamlining pipeline delivery

Strong skills in Python and SQL for developing and maintaining data pipelines and ETL processes

Hands-on experience with Databricks and PySpark for data processing and analysis

Familiarity with REST APIs to enable seamless data integration and communication

Excellent analytical and problem-solving skills for navigating complex technical environments

Strong client-facing abilities to ensure alignment and collaboration on project objectives

Exceptional organizational and leadership skills for managing efficient project execution

Upper-intermediate English proficiency for effective communication with teams and stakeholders

Nice to have

Familiarity with Redshift for data warehousing and analysis
