
Data Engineer

Kumaran Systems • 🌐 In Person

Posted 3 days ago

Job Description

As a Senior Data Engineer, you will be a key technical contributor, responsible for designing, building, and maintaining robust, scalable data pipelines and analytics platforms. You will work with cross-functional teams to implement data solutions that support operational needs, advanced analytics, and strategic initiatives using the latest cloud technologies. Your expertise will ensure our data infrastructure is reliable, efficient, and aligned with business objectives.

Must-Have Skills & Qualifications:

5+ years of hands-on experience in data engineering, with a proven track record of building and maintaining production-grade data systems.

Expertise in Microsoft Azure Data Ecosystem:

- Azure Databricks: Advanced experience in building and optimizing Spark-based data pipelines for ETL/ELT, data processing, and analytics.
- Azure Data Factory: Proficiency in orchestrating and scheduling complex data workflows and integrations.
- Azure Data Lake Storage (Gen2): Hands-on experience with large-scale data storage, partitioning, and management for analytics workloads.
- Azure SQL & Synapse: Strong skills in designing, developing, and tuning relational data warehouses and databases in the cloud.

Core Development Proficiency:

- Python and/or Scala: Strong programming skills for data processing, automation, and pipeline development.
- SQL: Expert-level knowledge of complex querying, performance tuning, and data manipulation.
- REST API Integration: Experience in designing, consuming, and integrating data from APIs.

Data Engineering Fundamentals: A deep understanding of data modeling, ETL/ELT design patterns, data warehousing concepts, and data lakehouse architectures.
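To give a flavor of the ETL/ELT pattern this role centers on, here is a minimal, purely illustrative sketch in plain Python (standard library only). In a production pipeline this logic would typically run as a Spark job on Azure Databricks, orchestrated by Azure Data Factory, loading into Azure SQL or Synapse; all names and data below are hypothetical:

```python
import sqlite3

# Extract: in a real pipeline these rows would come from a source system
# (e.g. a REST API or files in Azure Data Lake); here they are hard-coded.
raw_trades = [
    {"trade_id": 1, "symbol": "aapl", "qty": "100", "price": "189.50"},
    {"trade_id": 2, "symbol": "msft", "qty": "50",  "price": "411.20"},
    {"trade_id": 3, "symbol": "aapl", "qty": None,  "price": "190.00"},  # incomplete row
]

def transform(rows):
    """Normalize types, standardize symbols, and drop incomplete rows."""
    for row in rows:
        if row["qty"] is None or row["price"] is None:
            continue  # data-quality filter: skip rows missing required fields
        yield (row["trade_id"], row["symbol"].upper(),
               int(row["qty"]), float(row["price"]))

# Load: write the cleaned rows into a relational target
# (standing in for a cloud data warehouse table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INT, symbol TEXT, qty INT, price REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?, ?)", transform(raw_trades))

# Analytics-style query over the loaded data.
total = conn.execute(
    "SELECT symbol, SUM(qty * price) FROM trades GROUP BY symbol ORDER BY symbol"
).fetchall()
print(total)  # [('AAPL', 18950.0), ('MSFT', 20560.0)]
```

The same extract / transform / load separation scales up directly: the generator becomes a Spark DataFrame transformation, and the data-quality filter becomes an explicit validation step in the pipeline.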

Nice-to-Have Qualifications:

- Experience with Azure Function Apps for serverless compute and event-driven data processing.
- Familiarity with Power BI for data visualization and an understanding of business intelligence consumption patterns.
- Experience with Redis Cache for caching strategies and performance optimization.
- Background in the Financial Services/Asset Management industry, with knowledge of relevant data domains (e.g., securities, portfolios, market data).
- Experience with Azure DevOps or GitHub Actions for CI/CD pipelines and infrastructure-as-code (e.g., Terraform, Bicep).
- Knowledge of workflow orchestration tools such as Apache Airflow, or job schedulers such as AutoSys.
- Exposure to data integration tools such as Informatica.
