
DATA ENGINEERS WITH AWS, PYTHON, PYSPARK

Flipped.ai • 🌐 In Person • 💵 $45 – $50/hr

Posted 5 days, 14 hours ago

Job Description

Role: Data Engineer (AWS, Python, PySpark)

Locations: McLean, VA | Dallas, TX (On-site / Face-to-Face Interview)

Type: Contract

Rate: $50/hr (W2)

Visa Status: Open to H1T, H4-EAD, J2-EAD, L2-EAD, GC-EAD, GC, and US Citizens

Job Summary

We are seeking a highly skilled Data Engineer with extensive experience in AWS, Python, and PySpark to join our data engineering team. You will design, develop, and maintain scalable data pipelines and architectures that support large-scale data processing and analytics. This role requires a hands-on developer who is comfortable working in a fast-paced environment and collaborating in person.

Key Responsibilities

Pipeline Development: Design, build, and optimize robust ETL/ELT pipelines using Python and PySpark for data ingestion, transformation, and processing (a brief sketch follows this list).

Cloud Infrastructure: Leverage AWS services (S3, Glue, EMR, Lambda, Redshift, Athena, and DynamoDB) to build and manage data storage and warehousing solutions.

Data Optimization: Tune and optimize Spark jobs and SQL queries to ensure high performance and cost-efficiency in a distributed computing environment.

Collaboration: Work closely with data architects and business stakeholders to translate complex requirements into technical data solutions.

Quality & Governance: Implement data quality checks, validation frameworks, and security best practices to ensure data integrity and compliance.

Documentation: Maintain clear technical documentation, including data lineage, schemas, and workflow processes.
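
For context on the pipeline, tuning, and quality-check responsibilities above, here is a minimal PySpark ETL sketch. It is an illustration only, not this team's actual codebase: the S3 paths, column names, and app name are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical locations; real buckets and layouts will differ.
RAW_PATH = "s3://example-bucket/raw/orders/"
CURATED_PATH = "s3://example-bucket/curated/orders/"

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest: read raw JSON files landed in S3.
orders = spark.read.json(RAW_PATH)

# Transform: normalize types, derive a partition column, drop bad rows.
curated = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_id").isNotNull())
)

# Quality check: fail fast if duplicate keys slip through upstream.
dup_count = curated.groupBy("order_id").count().filter("count > 1").count()
if dup_count > 0:
    raise ValueError(f"found {dup_count} duplicate order_id values")

# Load: write partitioned Parquet back to S3 (a Glue/Athena-friendly layout).
# Repartitioning by the partition column keeps file counts per partition sane,
# which is one of the simpler Spark performance and cost levers.
(
    curated
    .repartition("order_date")
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet(CURATED_PATH)
)
```

In practice a job like this would be packaged for Glue or EMR and parameterized per environment rather than hard-coding paths.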

Required Skills & Qualifications

Programming: Strong proficiency in Python (including Pandas and NumPy) and SQL.

Big Data: Deep hands-on experience with Apache Spark/PySpark for large-scale distributed data processing.

Cloud Ecosystem: Proven experience with AWS data services (specifically Glue, EMR, and Redshift).

Data Modeling: Solid understanding of data warehousing concepts, including Star and Snowflake schemas.

DevOps/Tools: Familiarity with Git, CI/CD pipelines, and workflow orchestration tools (e.g., Airflow or Step Functions); a minimal DAG sketch follows this list.

Education: Bachelor’s degree in Computer Science, Information Technology, or a related field.
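
On the orchestration point, here is a minimal Airflow DAG sketch (assuming Airflow 2.x; the DAG id and script name are placeholders) showing how a daily spark-submit run might be scheduled:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG: schedules the kind of PySpark job sketched above once a day.
with DAG(
    dag_id="orders_etl_nightly",        # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="spark_submit_etl",
        # etl_job.py stands in for the actual pipeline script.
        bash_command="spark-submit etl_job.py",
    )
```

On EMR or Glue, the BashOperator would typically be swapped for the corresponding AWS provider operator, but the scheduling shape is the same.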

Submission Requirements

Visa Status: Open to H1T, H4-EAD, J2-EAD, L2-EAD, GC-EAD, GC, and US Citizens.

Interview Mode: Must be available for a Face-to-Face (F2F) interview at the specified location.

Work Arrangement: Candidates must be willing to work on-site in McLean, VA or Dallas, TX.

Job Type: Contract

Pay: $45.00 - $50.00 per hour

Expected hours: 9 per week

Work Location: In person (McLean, VA or Dallas, TX)
