
Data Engineer

Tek Tron IT • 🌐 Remote

Posted 11 hours, 37 minutes ago

Job Description

About Us

We are a forward-thinking, data-driven company that specializes in delivering innovative solutions to our clients. Our team is dedicated to creating a diverse and inclusive work environment where creativity and collaboration thrive. As we continue to grow, we are looking for passionate and motivated individuals to join our dynamic data engineering team. Whether you're a fresh graduate or an experienced professional, we offer a platform to expand your skills and grow in a supportive and flexible remote-first culture.

Role Overview

As a Data Engineer, you will work with cutting-edge technologies to design, develop, and maintain robust data pipelines, data architectures, and data systems. You will play a critical role in transforming raw data into actionable insights that drive business decisions. This is an exciting opportunity to work in a fully remote environment and gain exposure to diverse projects and industry-leading tools.

Responsibilities

Data Pipeline Development: Design, build, and maintain efficient, reliable, and scalable data pipelines for collecting, processing, and storing data from various sources.

Data Integration: Integrate and consolidate data from a wide range of systems (databases, APIs, cloud services, etc.) into centralized data platforms.

ETL Process: Develop and implement ETL (Extract, Transform, Load) processes to prepare data for analysis, ensuring data quality and integrity.

Data Architecture: Contribute to the design and improvement of the data architecture, optimizing data storage and retrieval to meet the needs of the business.

Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data requirements and ensure seamless data flow.

Automation: Automate data-related workflows to ensure efficient data processing and system integration.

Documentation: Write clear documentation for data pipelines, processes, and workflows to ensure maintainability and scalability.

Performance Optimization: Continuously monitor and optimize the performance of data systems and processes to ensure maximum efficiency and reliability.

Testing and Debugging: Write and execute unit tests, and troubleshoot issues in the data pipeline, ensuring high-quality code and minimal downtime.

Required Skills & Qualifications

Education: Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field (or equivalent experience).

Experience:

  • 0 to 10 years of experience in data engineering or a similar role.

  • Experience with data integration, ETL processes, and cloud data platforms.

Technical Skills:

  • Strong programming skills in languages such as Python, Java, or Scala, along with strong SQL.

  • Experience with cloud platforms (AWS, GCP, Azure) and data storage solutions (e.g., Redshift, BigQuery, Snowflake).

  • Familiarity with modern data processing frameworks (e.g., Apache Spark, Apache Kafka).

  • Knowledge of database systems (SQL and NoSQL).

  • Experience with data warehousing and big data technologies.

  • Familiarity with containerization and orchestration tools like Docker and Kubernetes is a plus.

Problem-solving: Strong analytical and problem-solving skills with a focus on scalability and efficiency.

Communication: Excellent communication skills, both written and verbal, to collaborate with technical and non-technical teams.
