
Data Engineer

NTT DATA Europe & Latam • 🌐 In Person

Posted 3 days, 11 hours ago

Job Description

Who We Are

Our client is a leading payment system company specializing in designing and building real-time, account-based payment infrastructures, applications, and services. They are committed to innovation and excellence, constantly driving the evolution of payment solutions in a dynamic and growing industry.

We are seeking a highly skilled Data Engineer to enhance our data and analytics environments. The successful candidate will play a pivotal role in designing, developing, and managing data solutions that leverage cloud technologies and big data frameworks. This role will involve creating efficient data pipelines, optimizing data storage solutions, and implementing robust data processing workflows to ensure high-quality data availability for analytics and business intelligence.

What You'll Be Doing

Create and manage ETL processes using Azure Data Factory to automate data flow between various systems and services

Implement data storage solutions using Azure Data Lake and Cosmos DB, ensuring high availability and scalability

Design and optimize data processes in Databricks, implementing Spark jobs that utilize Delta Lake for efficient data handling and versioning

Build and maintain analytics models that can harness the capabilities of Delta Lake for better data quality and performance

Develop scripts in Python and PowerShell to automate operational tasks, deployments, and monitoring of data workflows

Collaborate with development teams to integrate CI/CD practices using Azure DevOps and Git for version control

Manage and optimize MS SQL Server databases, writing efficient T-SQL queries for data manipulation and reporting

Ensure data integrity and security through proper database management practices

Implement logging and monitoring solutions to proactively address any issues impacting data applications and workflows

Conduct root cause analysis on data-related problems and develop solutions to mitigate future occurrences

Collaborate with business intelligence teams to create and share reports using Power BI

Provide insights and recommendations based on data analysis to guide business decisions

What You'll Bring Along

Bachelor's degree in Computer Science, Information Technology, or a related field (preferred)

3-5 years of experience in a similar role

Proficiency in Python for data processing and automation

Experience with PowerShell scripting and Azure Data Factory

Strong understanding of MS SQL Server and T-SQL for database scripting

Familiarity with Databricks, Delta Lake, Azure Data Lake, and Cosmos DB

Experience with version control using Git and CI/CD frameworks in Azure DevOps

Solid problem-solving skills and ability to troubleshoot complex data issues

Familiarity with DevOps methodologies including Scrum and Agile practices for software development

Experience creating dashboards and reports using Power BI

Knowledge of data governance and data security best practices

Excellent command of spoken and written English
