Build high-value ingestion pipelines powering real insights across a modern council data platform.
An exciting day-rate contract opportunity for a hands-on Data Platform Engineer to shape how data flows across a major council environment. You’ll build critical ingestion pipelines, modernise data capability and enable reporting, analytics and operational insight for teams across the organisation.
Who you will be working with:
You will be joining a large, well-established local council with a strong commitment to becoming a genuinely data-driven organisation. Their central Data and Analytics function delivers reporting, analytics and data science outcomes that directly support service delivery to the community. The team is collaborative, well-resourced and operates within a modern Azure-based data environment, working closely with finance, procurement, customer service and technology stakeholders.
Benefits to the successful applicant:
Hybrid working environment with flexible arrangements
Day rate contract with possible extension
Work on a modern Azure data stack
High-impact role where your pipelines directly support decision making
Supportive team with strong technical leadership
Exposure to diverse enterprise data sources and systems
Duties and responsibilities include:
Design, build and maintain robust data ingestion pipelines into the council’s Azure Data Platform
Create new SQL tables and ingest data from OneCouncil Finance and Procurement (SCM) systems
Engineer ingestion pipelines using Microsoft Graph API for usage data
Build ingestion flows for Genesys Call Centre data including post-call survey datasets
Extract, transform and structure data from APIs, databases and mixed cloud/on-prem sources
Develop efficient, complex SQL procedures, functions and transformations
Collaborate with analytics and business teams to align data flows with reporting and operational needs
Ensure data integrity, optimise pipeline performance and document processes clearly
We are looking for someone with:
Strong hands-on experience across Azure Data Factory, Logic Apps and Key Vault
Advanced SQL capability, including building large, complex statements, functions and stored procedures
Demonstrated experience ingesting data from APIs, SFTP and enterprise platforms
Background working with both on-prem and cloud data sources
Ability to design and deliver scalable ETL/ELT pipelines in Azure
Strong problem-solving ability with a focus on data quality, structure and optimisation
If you are interested in this new opportunity, please apply directly. For a confidential discussion, please contact Leon Kondel at Fuse Recruitment via lkondel@fuserecruitment.com.
If you know someone exceptional who may be open to a new challenge, refer them to us and we'll give you $500 if we find them a new role!*
#SCR-leon-kondel-1
#ChooseFuse