Description
FBS – Farmer Business Services is part of Farmers' operations, with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in: helping Farmers build a winning team that delivers consistent and sustainable results.
We don't have a local legal entity, so we've partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.
You can expect a solid and innovative company with a strong market presence; a dynamic, diverse, and multicultural work environment; leaders with deep market knowledge and strategic vision; and continuous learning and development.
The new data platforms team will be our centralized shared services team supporting all data platforms, such as Snowflake and Astronomer. The team will be responsible for the strategy and implementation of these platforms, as well as for the best practices the business units will follow. This position is focused on Astronomer/Airflow.
Key Responsibilities
- Build and maintain automated data workflows and orchestrations using Apache Airflow
- Implement at least two major end-to-end data pipeline projects using Airflow
- Design and optimize complex DAGs for scalability, maintainability, and reliability
- Create reusable, parameterized, and modular Airflow components (operators, sensors, hooks) to streamline workflow development
- Ensure effective monitoring, alerting, and logging of Airflow DAGs for quick issue resolution
- Document workflows, solutions, and processes for team knowledge sharing and training
- Mentor and support other team members in Airflow usage and adoption
- Explain best practices, identify pros and cons, and communicate technical decisions to team members
- Develop reusable frameworks, leveraging shared concepts for efficiency and scalability
- Implement and use reusable ecosystem components, including Python and Apache Airflow, DynamoDB, and Amazon RDS
- Develop reusable frameworks to enforce data governance and data quality standards
- Develop CI/CD pipelines using reusable frameworks and Jenkins
Requirements
- 4-6 years of experience in a similar role
- Bachelor's degree in IT, Information Systems, Computer Science, or a related field
- Insurance Experience (Desirable)
- Fluency in English
- Availability to work according to CST or PST time zones
Technical Skills
- Airflow (MUST) / Astronomer (PLUS) - Advanced (5 Years)
- Python - Advanced (4-6 Years) (MUST)
- Snowflake – Intermediate (MUST)
- DBT - Entry Level (PLUS)
- AWS Glue - Entry Level (PLUS)
- DynamoDB - Intermediate
- Amazon RDS - Intermediate
- Jenkins - Intermediate
Other Critical Skills
- Work Independently
- Strategic Thinking
- Guide Others
- Documentation
- Explain best practices
- Communicate Technical Decisions
Benefits
This position comes with a competitive compensation and benefits package.
- A competitive salary and performance-based bonuses.
- Comprehensive benefits package.
- Flexible work arrangements (remote and/or office-based).
- A dynamic and inclusive work culture within a globally renowned group.
- Private Health Insurance.
- Paid Time Off.
- Training & Development opportunities in partnership with renowned companies.