
Data Engineer

Aeris • In Person

Posted 3 days, 20 hours ago

Job Description

About Aeris Communications Inc.

For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers, 30 mobile network operator partners, and 90 million IoT devices across the world. Aeris powers today’s connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth.

Job Summary

As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines and architectures. You will work closely with data scientists, analysts, and software engineers to ensure the availability, integrity, and quality of data for analytical and operational needs.

Location

- Noida (on-site, 5 days a week)

Key Responsibilities

Design, develop, and maintain robust ETL pipelines to collect, process, and store data from multiple sources

Build and optimize data models for analytics, reporting, and operational needs

Implement best practices for data governance, data quality, and data security

Collaborate with stakeholders to understand data requirements and translate them into technical solutions

Integrate new data sources and expand the data platform capabilities

Monitor and troubleshoot data workflows and infrastructure issues

Maintain documentation related to data architecture, data sources, and data flows

Minimum Requirements

Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field

3–6 years of proven experience as a Data Engineer

Strong proficiency in SQL, Python, or Scala

Experience with big data technologies such as Hadoop, Spark, and Kafka

Familiarity with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions

Experience with data warehousing (e.g., Redshift, Snowflake, BigQuery)

Good understanding of ETL concepts, data modeling, and data architecture

Experience with workflow orchestration tools (e.g., Airflow, Luigi)

Excellent problem-solving skills and attention to detail

Strong communication and collaboration skills

Preferred Requirements

Experience in real-time data processing and streaming

Knowledge of machine learning workflows and MLOps

Exposure to DevOps practices and CI/CD pipelines for data projects
