
Data Engineer

PwC Acceleration Center India • 🌐 In Person

Posted 11 hours, 40 minutes ago

Job Description

Opportunity with PwC for candidates experienced in Teradata and DataStage.

We are seeking a highly skilled and motivated Data Engineer to lead our data engineering and visualization initiatives. The successful candidate will have extensive experience in data engineering, ETL processes, cloud technologies, and data visualization tools. This role will oversee a team responsible for leveraging Teradata, DataStage, SSIS, AWS, Tableau, and Spotfire to transform data into actionable insights.

Key Responsibilities:

Design, develop, and maintain robust data pipelines and ETL processes using Teradata, IBM DataStage, and SSIS.

Collaborate with cross-functional teams to gather and analyze data requirements and translate them into technical specifications.

Implement data integration solutions and ensure data accuracy, consistency, and integrity.

Develop and maintain data models and schemas in both on-premises and cloud environments, particularly with AWS.

Utilize data visualization tools like Tableau and Spotfire to create insightful reports and dashboards.

Optimize and enhance the performance of data processing and storage systems.

Ensure data security and compliance with relevant regulations and best practices.

Required Qualifications:

Bachelor's degree in Computer Science, Information Systems, or a related field.

Proven experience as a Data Engineer or in a similar role.

Proficiency in Teradata, IBM DataStage, and SSIS for ETL processes.

Experience with AWS cloud services, including but not limited to S3, Redshift, and Lambda.

Strong skills in data visualization using Tableau and Spotfire.

Solid understanding of SQL and relational databases.

Familiarity with data warehousing concepts and best practices.

Strong problem-solving and analytical skills.

Preferred Qualifications:

Experience working in an offshore or distributed team environment.

Knowledge of additional programming languages such as Python or R.

Experience with big data technologies like Hadoop or Spark.

Familiarity with Agile methodologies and DevOps practices.

Soft Skills:

Excellent communication and interpersonal skills.

Ability to work independently and as part of a team.

Strong organizational and time-management skills.

Note: Preference will be given to candidates who can join at the earliest opportunity.
