
Data Engineer


Job Description

Job Title: Senior Data Engineer (ETL, dbt, Airflow, AWS)

Location: Toronto, ON or Remote (US)

Company: Leni

About Us

We’re a fast-growing proptech startup building a modern AI data platform for the multi-family real estate industry. Our mission is to turn performance and market data into reliable, timely, decision-ready insights for operators, owners, and asset managers.

Role Overview

We’re hiring a Data Engineer to design, build, and validate ETL pipelines from multiple data sources, primarily property management systems (PMS), and to improve the stability, reliability, and scalability of our data infrastructure as we grow. You’ll work closely with Data and Product to standardize ingestion, enforce data quality, and ship trustworthy datasets for analytics and downstream apps.

What You’ll Do

Design & Build Pipelines: Create robust ingestion and transformation pipelines using Airflow and dbt-core on AWS (S3, Iceberg, Redshift); a minimal DAG sketch follows this list.

Source Integration (PMS): Connect to and normalize data from PMS platforms (APIs, webhooks, SFTP, files); help map schemas and validate the reliability and integrity of data access.

Data Modeling: Implement scalable ETL models and dimensional/semantic layers with dbt (staging, marts), including tests, documentation, and lineage.

Data Quality & Validation: Establish automated data checks, anomaly detection, and reconciliation against source totals; build backfill/retry strategies (see the reconciliation sketch after this list).

Reliability & Observability: Improve SLAs/SLOs with monitoring, alerting, and logging; reduce pipeline failures and mean time to recovery.

Performance & Cost: Tune queries and storage formats/partitioning; optimize compute and scheduling to balance performance, cost, and freshness.

Security & Governance: Help implement least-privilege access and secrets management, and contribute to the SOC 2 practices already in place.

Collaboration: Partner with BI developers to ensure datasets are analytics-ready and support Tableau performance.
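
To make the pipelines item above concrete, here is a minimal sketch (not Leni's actual pipeline) of the kind of daily Airflow DAG this role would own, assuming Airflow 2.4+; the dag_id, S3 landing step, dbt project path, and the extract_pms_data callable are all hypothetical placeholders.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def extract_pms_data(**context):
        """Placeholder: pull yesterday's PMS export and land it in S3."""
        # In a real pipeline this would call the PMS API or fetch an SFTP
        # drop, then upload the raw file to S3 (e.g., with boto3).
        ...


    with DAG(
        dag_id="pms_daily_ingest",  # hypothetical name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={
            "retries": 3,                        # retry transient source failures
            "retry_delay": timedelta(minutes=5),
            "sla": timedelta(hours=2),           # flag runs that miss freshness targets
        },
    ) as dag:
        extract = PythonOperator(
            task_id="extract_pms_to_s3",
            python_callable=extract_pms_data,
        )
        transform = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt build --project-dir /opt/dbt --target prod",
        )
        extract >> transform  # dbt runs only after the raw data has landed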

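The Data Quality & Validation item, sketched in the same hedged spirit: reconcile a warehouse total against a control total reported by the source system, and fail loudly on drift. The metric name, tolerance, and figures below are illustrative assumptions, not real data; in production both totals would come from a PMS API call and a warehouse query.

    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("reconciliation")

    TOLERANCE = 0.005  # accept up to 0.5% relative drift (assumed threshold)


    def reconcile(metric: str, source_total: float, warehouse_total: float) -> None:
        """Raise if the warehouse total drifts from the source control total."""
        if source_total == 0 and warehouse_total == 0:
            return  # nothing on either side: trivially reconciled
        drift = abs(warehouse_total - source_total) / max(abs(source_total), 1e-9)
        if drift > TOLERANCE:
            raise ValueError(
                f"{metric}: warehouse={warehouse_total} vs source={source_total} "
                f"({drift:.2%} drift exceeds {TOLERANCE:.2%} tolerance)"
            )
        logger.info("%s reconciled within tolerance (%.4f%% drift)", metric, drift * 100)


    # Illustrative run; real totals would be queried, not hardcoded.
    reconcile("rent_roll_total", source_total=1_204_310.00, warehouse_total=1_204_295.50)
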
Qualifications

Must-Have

3–5 years in data engineering or analytics engineering roles.

Strong SQL (window functions, CTEs, performance tuning) and dbt (models, tests, macros, snapshots).

Production experience with Airflow (DAG design, dependency management, retries, SLAs).

Hands-on with AWS data services (S3, IAM, EC2/ECS/Lambda, Iceberg/Athena/Redshift, Step Functions) and CI/CD (GitHub Actions/GitLab CI).

Proven track record building reliable ETL/ELT pipelines from heterogeneous sources (APIs, SFTP, flat files).

Solid data quality mindset: testing, validation, lineage, and documentation.

Big Plus

Experience mining data from PMS platforms (e.g., Yardi, RealPage, AppFolio, Entrata, ResMan) and understanding of multifamily data domains (rent roll, GL, unit/lease, delinquency, maintenance, traffic).

Python for data tooling (requests, pandas/pyarrow, boto3); a short sketch follows this list.

Observability/alerting tools and workflow best practices (data contracts, CDC, incremental strategies).

Familiarity with semantic layer needs (Cube).
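
As a sketch of the Python-for-data-tooling item above: pull a report from a PMS-style REST endpoint, flatten it with pandas, and land it on S3 as Parquet via pyarrow and boto3. The endpoint URL, token, bucket, payload shape, and key layout are hypothetical placeholders.

    import io

    import boto3
    import pandas as pd
    import requests

    PMS_URL = "https://pms.example.com/api/v1/rent-roll"  # hypothetical endpoint
    BUCKET = "example-raw-zone"                           # hypothetical bucket

    resp = requests.get(
        PMS_URL,
        headers={"Authorization": "Bearer <token>"},  # placeholder credential
        timeout=30,
    )
    resp.raise_for_status()

    # Flatten the nested JSON payload into a tabular frame and stamp lineage.
    df = pd.json_normalize(resp.json()["records"])
    df["ingested_at"] = pd.Timestamp.now(tz="UTC")

    # Serialize to Parquet in memory (pandas uses pyarrow under the hood).
    buf = io.BytesIO()
    df.to_parquet(buf, index=False)

    # Land the file under a date-partitioned key for cheap, prunable scans.
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key="pms/rent_roll/dt=2025-01-01/rent_roll.parquet",  # placeholder date
        Body=buf.getvalue(),
    )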

Our Stack (typical)

AWS, Airflow, dbt, Python, SQL, Git-based CI/CD, JavaScript/TypeScript, Tableau.

Why Join

Impact: Your pipelines and models will be the backbone of how multifamily operators make daily decisions.

Growth: Broad scope across ingestion, modeling, reliability, and governance in a high-ownership environment.

Team: Collaborative, pragmatic, and product-minded data culture.

Process

· Intro with HR / Screening
· Hiring Manager Interview
· Take-home assignment
· Interview with CEO
· Offer

Comp Range

· 190–220K + Bonus, Equity
