About the Role
We’re a focused team of ~30 building in-house quantitative research and trading infrastructure for the crypto and DeFi markets.
As a Quant Data Engineer, you will design and maintain high-reliability data systems that power our research, backtesting, and live trading pipelines. You will work closely with quantitative researchers, developers, and strategy leads to ensure data accuracy, scalability, and accessibility across multiple exchanges and protocols.
Key Responsibilities
Design and maintain data pipelines for high-frequency and historical crypto market data (spot, perpetuals, on-chain metrics, funding rates, etc.).
Build and optimize rate-limit-aware ingestion systems (REST, WebSocket, on-chain indexers), ensuring stability and fault tolerance.
Architect scalable time-series data storage (TimescaleDB, ClickHouse, Parquet, or S3) with efficient schema design for quantitative analysis.
Develop tools and APIs for researchers to access standardized, clean datasets.
Implement data quality validation, reconciliation, and lineage tracking to ensure accuracy.
Work with engineers to deploy and maintain scalable ETL pipelines (Airflow, AWS Lambda, etc.).
Collaborate with quants to optimize data query performance for backtesting and feature generation.
Monitor, troubleshoot, and continuously improve system reliability, latency, scalability, and cost efficiency.
Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or related fields.
2–7 years of experience in data engineering, quantitative infrastructure, or backend systems development.
Experience with streaming data ingestion (Kafka, WebSocket, SQS, Redis Streams, etc.) and distributed processing.
Hands-on experience with AWS or an equivalent cloud platform (Lambda, ECS, S3, etc.).
Experience with the AWS data stack (Redshift, Athena, Kinesis, and Firehose) is a strong plus.
Understanding of crypto market structures (spot, perpetuals, funding rates, on-chain data) or willingness to learn quickly.
Knowledge of CI/CD, containerization (Docker, K8s), and version control (Git).
Strong sense of ownership, reliability, and attention to detail.
Nice-to-Have
Experience in quantitative trading, DeFi data, or market-microstructure analysis.
Familiarity with Elasticsearch, DuckDB, or PySpark for analytical workloads.
Exposure to machine learning pipelines for signal research.
Understanding of Solana / Arbitrum / EVM ecosystems or on-chain data indexing.