Internship - Data Engineer

DayOne • In Person

Job Description

Job Title: Data Engineer Intern

Location: Singapore

About Us

At DayOne Data Centers, we are redefining the future of data center development through advanced technology and innovative data solutions. We focus on building scalable, reliable, and secure infrastructures that power the digital world. Our team is at the forefront of data engineering, creating cutting-edge data pipelines, automation solutions, and seamless integrations. Join us and play a pivotal role in the evolution of data centers globally.

What You’ll Be Doing:

Your responsibilities will include:

Assisting in designing and developing scalable ETL pipelines to process large volumes of data efficiently, integrating data from systems like Autodesk Construction Cloud (ACC), SAP, and other software.

Collaborating with the team to integrate various data sources into Dataverse, automating data flows for seamless analysis and reporting.

Building and maintaining RESTful APIs to streamline data exchange between internal and external systems.

Analyzing and troubleshooting data pipeline performance to ensure high reliability and scalability.

Supporting data integration efforts to ensure the timely availability of clean, accurate, and consistent data across platforms.

Participating in code reviews and ensuring the implementation of best practices in data engineering.

Qualifications:

Currently pursuing a degree in Computer Science, Data Engineering, Information Technology, or a related field.

Proficiency in Python, with hands-on experience in building ETL pipelines.

Familiarity with API design, development, and integration (RESTful APIs), especially for systems like Autodesk Construction Cloud (ACC) and SAP.

Understanding of Dataverse as a data storage solution and best practices for handling large datasets.

Strong problem-solving skills with a passion for building scalable data solutions.

Ability to work collaboratively in a team environment and communicate technical concepts effectively.

Bonus Points:

Experience with cloud-based platforms like AWS, Azure, or Google Cloud.

Knowledge of containerization technologies such as Docker.

Exposure to data processing frameworks such as Apache Spark or Apache Kafka.

Familiarity with CI/CD processes for data engineering tasks.
