Zenith Infotech (S) Pte Ltd. was founded in 1997 with the vision of providing IT professionals and state-of-the-art solutions to organizations, helping them increase their productivity and competitiveness. From deploying a single person to forming entire IT teams, Zenith Infotech has helped clients meet their staff augmentation needs. Zenith offers opportunities to engage in long-term projects with large IT-savvy companies, consulting organizations, system integrators, government agencies, and MNCs.
EA Licence No: 20S0237
Roles and Responsibilities:
Work across workstreams to support data requirements including reports and dashboards
Analyze and perform data profiling to understand data patterns and discrepancies following Data Quality and Data Management processes
Understand and follow best practices to design and develop the E2E Data Pipeline: data transformation, ingestion, processing, and surfacing of data for large-scale applications
Develop data pipeline automation using Azure and AWS data platforms and technology stacks, including Databricks and Data Factory
Understand business requirements to translate them into technical requirements that the system analysts and other technical team members can drive into the project design and delivery
Analyze source data and perform data ingestion in both batch and real-time patterns via various methods; for example, file transfer, API, Data Streaming using Kafka and Spark Streaming
Analyze and understand data processing and standardization requirements, develop ETL using Spark processing to transform data
Understand data/reports and dashboards requirements, develop data export, data API, or data visualization using Power BI, Tableau, or other visualization tools
Required Skills:
We are looking for experience and qualifications in the following:
Bachelor’s degree in Computer Science, Computer Engineering, IT, or a related field
Minimum of 4 years’ experience in the data engineering field
Data Engineering skills: Python, SQL, Spark, Cloud Architecture, Data & Solution Architecture, API, Databricks, Azure, AWS
Data Visualization skills: Power BI (or other visualization tools), DAX programming, API, data modeling, SQL, storytelling, and wireframe design
Business Analyst skills: business knowledge, data profiling, basic data model design, data analysis, requirement analysis, SQL programming
Basic knowledge of Data Lake/Data Warehousing/Big Data tools: Apache Spark, RDBMS and NoSQL, Knowledge Graph
Only shortlisted applicants will be contacted. By submitting your application, you acknowledge and agree that your personal data will be collected, used, and retained in accordance with our Privacy Policy. This information will be used solely for recruitment and employment purposes.