Overview
Data Engineer at Cobre
What is Cobre?
Cobre is Latin America’s leading instant B2B payments platform. We solve the region’s most complex money movement challenges by building advanced financial infrastructure that enables companies to move money faster, safer, and more efficiently.
We enable instant business payments—local or international, direct or via API—all from a single platform.
Built for fintechs, PSPs, banks, and finance teams that demand speed, control, and efficiency. From real-time payments to automated treasury, we turn complex financial processes into simple experiences.
Cobre is the first platform in Colombia to enable companies to pay both banked and unbanked beneficiaries within the same payment cycle and through a single interface.
We are building the enterprise payments infrastructure of Latin America!
What You Will Do
Data Pipelines: Implement, maintain, and optimize the infrastructure required to support the real-time, event-driven, and batch ETL/ELT processes of our platform. Ensure seamless operation of all processes and develop a comprehensive monitoring solution to proactively address potential issues.
Data Warehouse: Maintain, monitor, and enhance our data model across the different stages of our medallion architecture, while also implementing and reinforcing data quality processes. Monitor the cost and usage of our data warehouse to identify suboptimal processes and queries for improvement. Advocate for and promote best practices across teams.
Data Governance: Assist in defining and implementing essential data governance policies and services on our platform to ensure secure scaling in compliance with the highest standards and regulations of the financial industry.
Technical Mastery and Oversight: Maintain an in-depth understanding of the latest trends in data models, data pipelines, and data tools.
Cross-functional Collaboration: Work closely with product, engineering, and analytics teams to ensure that the data model supports and enhances product development and customer experience.
What you need
Experience: Minimum of 2 years in data engineering, focusing on scalable data pipelines and data models. Proven ability to handle, process, and secure large data sets.
Education: Bachelor's degree in Computer Science, Engineering, or a related field.
Data Pipelines & Data Models: Proficient in building event-driven, real-time, and batch data pipelines using Python and SQL. Skilled in designing and implementing scalable, well-structured data models within modern data warehouses.
Cloud Infrastructure: Experienced in developing and automating data pipelines with cloud-native services. Adept at maintaining and optimizing infrastructure for scalability and cost efficiency. Possession of cloud certifications is a strong advantage.
Data Management: Desired knowledge in data governance processes, including the development and implementation of information access policies, data privacy protocols, information retention strategies, and more.
Data Architecture Patterns: Desired experience designing and implementing scalable, resilient, and cost-efficient architectures for event-driven, real-time, and batch processing pipelines.
Relevant Technologies: A wide variety of AWS services, including but not limited to DynamoDB, Elasticsearch, MWAA, Lambda, Glue, MSK, Kinesis, SQS, SNS, EventBridge, CloudWatch, and S3. Snowflake or other data warehouse experience with stored procedures, views, materialized views, external tables, streams, data models, and file formats such as Parquet and Iceberg. Infrastructure as Code knowledge is nice to have, preferably Terraform and Terragrunt. GitHub and GitHub Actions, Python, SQL.
Background in High-Volume Data Management: Desired experience in handling, processing, and securing large sets of data, with a keen understanding of the challenges and solutions in data-intensive environments.
Collaborative Spirit: The ability to work seamlessly across different departments, fostering a collaborative environment that encourages innovation and efficiency.
Industry Knowledge: Fintech, especially payments; experience in LatAm markets is a plus.
An advanced level of English is a must.
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Information Technology