
Data Engineer

Nagarro • 🌐 In Person

Posted 1 day, 17 hours ago


Company Description

👋🏼 We're Nagarro.

We are a digital product engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (18 000+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

By this point in your career, it is not just about the tech you know or how well you can code. It is about what more you want to do with that knowledge. Can you help your teammates proceed in the right direction? Can you tackle the challenges our clients face while always looking to take our solutions one step further to succeed at an even higher level? Yes? You may be ready to join us.

Job Description

We are seeking a skilled Data Engineer to design, build, and maintain scalable data platforms that support business intelligence, analytics, and data-driven decision-making. In this role, you will collaborate closely with business and technical stakeholders to develop reliable data pipelines and modern data architectures.

Key Responsibilities

Design, develop, and maintain data pipelines and data architectures

Implement and optimize ETL/ELT processes to support analytics and BI initiatives

Collaborate with stakeholders to understand and translate data requirements

Ensure data quality, performance, and reliability across platforms

Support cloud-based data solutions in an Agile/Scrum environment

Qualifications

5+ years of experience in data engineering, BI, or IT roles

Advanced proficiency in SQL

Strong experience with ETL tools such as dbt, PWC, ODI, or DataStage

3+ years of experience with cloud platforms (AWS and/or Azure)

3+ years of experience with Apache Airflow

Hands-on experience with Spark-based technologies (Apache Spark, AWS Glue, Amazon EMR)

Experience in scalable Python development (PySpark, Spark SQL)

Experience with relational or columnar databases (Oracle, SQL Server, Redshift)

Experience with BI tools such as Tableau, Qlik, or OBIEE

Strong communication skills and English proficiency

Ability to work effectively in multicultural and cross-functional teams

Organized, detail-oriented, and proactive problem solver

Willingness to learn and adapt to new technologies

Additional Information

Nice to Have

Data Warehousing or BI certifications

Experience in knowledge sharing or mentoring
