Data Engineer - Sports Analytics/Betting
Salary: £40k-£55k (plus a substantial bonus)
Location: London or Leeds (flexible on hybrid/remote working)
Company Description
We are a proprietary sports pricing and product provider, specializing in the development of complex, simulation-driven pricing and risk systems that power leading sports brands. As pioneers in player-level, play-by-play simulation and forecasting, we deliver the group’s most advanced pricing and risk capabilities, with a particular focus on the US market.
Job Description
The purpose of the Data Engineer role is to design, build, and optimize data models and analytics workflows that enable accurate, timely, and actionable insights across the business. The role bridges data engineering and analytics, ensuring that raw data is transformed into well-structured, reliable datasets for reporting, visualization, and advanced analytics. The role holder will focus on creating scalable, maintainable solutions that help stakeholders make data-driven decisions efficiently.
Key responsibilities include:
Design and implement data models optimized for analytics and reporting use cases.
Develop and maintain ELT/ETL pipelines to transform raw data into curated datasets.
Collaborate with analysts and data scientists to understand requirements and deliver high-quality data products.
Optimize SQL queries and workflows for performance and scalability.
Ensure data quality, consistency, and governance across all analytics layers.
Implement best practices for documentation, testing, and reproducibility in analytics workflows.
Work with cloud-based tools and services (e.g., AWS S3, Athena, ECS, Lambda, CloudFormation, CloudWatch) to support analytics infrastructure.
Contribute to the development of dashboards and self-service analytics tools.
Qualifications
Essential:
Proficiency in SQL and Python for data transformation and analytics.
Understanding of data modelling concepts (e.g., star schema, dimensional modelling).
Experience (1+ years) working with relational databases and designing optimized schemas.
Ability to debug and optimize slow queries and inefficient workflows.
Familiarity with cloud-based data platforms and services (AWS preferred).
Excellent communication skills for collaborating with technical and non-technical stakeholders.
Strong problem-solving and analytical mindset.
Desirable:
Experience with BI tools (e.g., Power BI, Plotly/Dash) for visualization.
Experience in frontend development (e.g., React/JavaScript).
Experience building APIs (e.g., Django, Flask, FastAPI).
Familiarity with distributed systems (e.g., Spark, Kafka) for large-scale analytics.
Knowledge of testing practices (e.g., TDD) in data workflows.
Passion for clean, well-documented systems and reproducibility.
Side projects demonstrating end-to-end analytics solution design.