Key Responsibilities
Assist in building and maintaining data pipelines using ETL/ELT tools and frameworks.
Support data ingestion, transformation, and integration from multiple sources (APIs, databases, cloud storage).
Work with senior data engineers to develop scalable and reliable data solutions.
Perform basic data cleaning, validation, and profiling to ensure data quality.
Collaborate with analytics, BI, and engineering teams to support data requirements.
Monitor pipeline performance, troubleshoot failures, and assist in debugging issues.
Help maintain documentation for data workflows, schemas, and processes.
Learn and apply best practices for data engineering, cloud services, and automation.
Required Skills & Qualifications
Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field.
Basic knowledge of Python, SQL, or other scripting/programming languages.
Understanding of relational databases (MySQL, PostgreSQL, SQL Server) and data modeling fundamentals.
Exposure to cloud platforms such as AWS, Azure, or GCP (coursework or training acceptable).
Familiarity with ETL processes and data pipeline concepts.
Strong analytical, problem-solving, and debugging skills.
Good communication skills and eagerness to learn in a fast-paced environment.
Job Type: Full-time
Pay: $70,000.00-$120,000.00 per year
Work Location: In person