Data Engineer
Experience: 6+ years
Location: Bangalore
Work mode: Hybrid
Engagement Type: Contract
Key Responsibilities:
Design, build, and maintain scalable and reliable data pipelines for batch and real-time processing.
Develop and optimize ETL/ELT workflows using modern data engineering frameworks.
Work with large-scale datasets, ensuring data quality, integrity, and performance.
Collaborate with analytics, product, and engineering teams to integrate data solutions into business workflows.
Optimize data storage, retrieval, and processing using cloud-native tools and distributed systems.
Implement best practices for data governance, security, and monitoring.
Required Skills:
Strong expertise in Python, SQL, and data pipeline development.
Hands-on experience with big data technologies such as Spark, Hadoop, Kafka, or Hive.
Experience with cloud platforms (AWS, Azure, or GCP) and data services such as S3, Redshift, BigQuery, Databricks, or Snowflake.
Proficiency in ETL tools, workflow schedulers (Airflow, Cloud Composer, Prefect), and CI/CD practices.
Strong understanding of data modeling, warehousing, and performance optimization.
Experience working with APIs, microservices, and distributed architectures is a plus.
Job Type: Contractual / Temporary
Contract length: 6–12 months
Pay: ₹70,000–₹90,000 per month
Work Location: In person