👨🏻‍💻 postech.work

Data Engineer (Python)

iXceed Solutions • 🌐 Remote • 💵 ₹1,800,000 - ₹2,500,000

Posted 1 day, 18 hours ago

Job Description

Role: Data Engineer (AWS + Python)

Location: India

Mode: Hybrid/Remote

Type: Contract

Job Responsibilities

Design, develop, and optimize data processing pipelines using PySpark on AWS Databricks, ensuring scalability, reliability, and high performance.

Implement data ingestion, transformation, and aggregation workflows to support advanced analytics, BI dashboards, and enterprise reporting.

Collaborate with cross-functional teams to build cloud-native data solutions leveraging AWS services such as S3, EC2, and related components.

Write efficient, reusable Python code to automate, orchestrate, and integrate data workflows across cloud and on-premise environments.

Ensure data quality, reliability, and governance by implementing best practices, including validation, error handling, auditing, and logging.

Participate in performance tuning of PySpark jobs, cloud cost optimization, and ongoing enhancement of the data platform.
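The data-quality responsibilities above (validation, error handling, auditing, logging) follow a common pattern that can be sketched in plain Python. The record fields and rules below are illustrative assumptions, not part of the role description; in a real pipeline the same split-and-log pattern would typically run inside a PySpark job on Databricks.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def validate(record: dict) -> list:
    """Return a list of validation errors; empty means the record is clean.
    Field names ('id', 'amount') are assumptions for this sketch."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("invalid amount")
    return errors

def partition_records(records):
    """Split records into valid and rejected, logging rejects for auditing."""
    valid, rejected = [], []
    for rec in records:
        errs = validate(rec)
        if errs:
            log.warning("rejecting %r: %s", rec, "; ".join(errs))
            rejected.append({"record": rec, "errors": errs})
        else:
            valid.append(rec)
    return valid, rejected

batch = [
    {"id": "a1", "amount": 100.0},
    {"id": "", "amount": 50.0},    # rejected: missing id
    {"id": "a2", "amount": -5},    # rejected: negative amount
]
valid, rejected = partition_records(batch)
print(len(valid), len(rejected))  # → 1 2
```

Routing rejects to a side output with the reasons attached, rather than dropping them silently, is what makes the pipeline auditable.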

Mandatory Skills

6+ years of relevant professional experience in the software or data engineering industry.

Strong hands-on expertise in PySpark for distributed data processing and transformation.

Proven experience in Python programming with the ability to build reusable, production-grade code.

Practical, hands-on experience with AWS Databricks for building, deploying, and orchestrating large-scale data pipelines.

Experience processing structured and unstructured data using Spark clusters and cloud data platforms.

Strong understanding of data engineering best practices, including version control, CI/CD for data pipelines, and performance optimization.

Preferred Skills

Working knowledge of SQL for data querying, analysis, and troubleshooting.

Experience with AWS S3 for data storage and EC2 for compute orchestration.

Understanding of cloud-native data architectures, data security, and governance principles.

Exposure to BFSI domain use cases and experience handling sensitive financial data.

Familiarity with CI/CD tools, cloud-based deployment practices, and DevOps methodologies.

Experience with ETL scripting, workflow automation, and data pipeline scheduling.

Knowledge of big data ecosystems and distributed computing concepts.

Job Type: Contractual / Temporary

Contract length: 12 months

Pay: ₹1,800,000.00 - ₹2,500,000.00 per year

Benefits:

Work from home

Experience:

Total: 6 years (Required)

AWS: 5 years (Required)

Python: 4 years (Required)

Work Location: Remote
