FBS – Farmer Business Services is part of Farmers' operations, with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse, high-performing teams equipped to thrive in today's competitive marketplace.
We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.
Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.
What to expect on your journey with us:
A solid and innovative company with a strong market presence
A dynamic, diverse, and multicultural work environment
Leaders with deep market knowledge and strategic vision
Continuous learning and development
Team Function
The new data platforms team will be our centralized shared services team supporting all data platforms, such as Snowflake and Astronomer. It will be responsible for the strategy and implementation of these platforms, as well as for defining best practices for the business units to follow. This position is focused on Astronomer/Airflow.
Key Responsibilities
Airflow-Specific Responsibilities
Build and maintain automated data workflows and orchestrations using Apache Airflow
Build automation processes using Copilot to generate Airflow DAGs
Implement at least two major end-to-end data pipeline projects using Airflow
Design and optimize complex DAGs for scalability, maintainability, and reliability
Create reusable, parameterized, and modular Airflow components (operators, sensors, hooks) to streamline workflow development
Ensure effective monitoring, alerting, and logging of Airflow DAGs for quick issue resolution
Document workflows, solutions, and processes for team knowledge sharing and training
Mentor and support other team members in Airflow usage and adoption
Explain best practices, identify pros and cons, and communicate technical decisions to team members
Reusable frameworks: Develop reusable frameworks, leveraging reusable concepts for efficiency and scalability; implement and utilize reusable ecosystem components, including Python & Apache Airflow, DynamoDB, and Amazon RDS
Develop reusable frameworks to enforce data governance and data quality standards
CI/CD: Develop CI/CD pipelines using reusable frameworks and Jenkins
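To illustrate the "reusable, parameterized, and modular" workflow development mentioned above, here is a minimal, library-free sketch of the DAG-factory pattern. All names (the factory function, table names, task steps) are hypothetical; a real implementation would instantiate airflow.DAG objects and operators rather than plain dicts.

```python
# Hypothetical sketch of a parameterized DAG factory: one factory call per
# source table replaces a hand-written DAG file. A real version would build
# airflow.DAG objects and operators instead of declarative dicts.

def make_ingest_dag_spec(source_table, schedule="@daily", retries=2):
    """Return a declarative spec for a three-step ingest pipeline."""
    dag_id = f"ingest_{source_table}"
    steps = ["extract", "validate", "load"]
    return {
        "dag_id": dag_id,
        "schedule": schedule,
        "default_args": {"retries": retries},
        "tasks": [f"{dag_id}.{s}" for s in steps],
        # Linear dependency chain: extract >> validate >> load
        "dependencies": [
            (f"{dag_id}.{a}", f"{dag_id}.{b}")
            for a, b in zip(steps, steps[1:])
        ],
    }

# One spec per source table from a single config list, mirroring how a
# DAG-factory module can emit many consistent Airflow DAGs.
specs = {t: make_ingest_dag_spec(t) for t in ("claims", "policies")}
```

The design point is that parameterizing the pipeline (table, schedule, retries) keeps every generated workflow consistent and makes new pipelines a one-line config change rather than a new DAG file.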
Requirements
4–6 years of experience in a similar role
Bachelor's degree in IT, Information systems, Computer Science or a related field
Insurance Experience (Desirable)
Fluency in English
Availability to work in the CST or PST time zone.
Technical Skills
Airflow (MUST) / Astronomer (PLUS) - Advanced (5 Years)
Python - Advanced (4-6 Years) (MUST)
Snowflake – Intermediate (MUST)
DBT - Entry Level (PLUS)
AWS Glue - Entry Level (PLUS)
DynamoDB - Intermediate
Amazon RDS - Intermediate
Jenkins - Intermediate
Other Critical Skills
Work Independently
Strategic Thinking
Guide Others
Documentation
Explain best practices
Communicate Technical Decisions
Benefits
This position comes with a competitive compensation and benefits package, including:
A competitive salary and performance-based bonuses.
Flexible work arrangements (remote and/or office-based).
You will also enjoy a dynamic and inclusive work culture within a globally renowned group.
Private Health Insurance.
Paid Time Off.
Training & Development opportunities in partnership with renowned companies.