Description
Your future team
You'll be part of our new Dynamic Pricing & Revenue Management team, working alongside a Data Scientist and a Data Analyst. Together, you will work towards one core goal: helping hosts improve occupancy and earnings through a smart, dynamic, and data-driven pricing strategy.
Our Tech Stack
- Data Storage & Querying: S3, Redshift (with decentralized data sharing), Athena, and DuckDB.
- ML & Model Serving: MLflow, SageMaker, and deployment APIs for model lifecycle management.
- Cloud & DevOps: Terraform, Docker, Jenkins, and AWS EKS (Kubernetes) for scalable, resilient systems.
- Monitoring: ELK, Grafana, Looker, OpsGenie, and in-house tools for full visibility.
- Ingestion: Kafka-based event systems and tools like Airbyte and Fivetran for smooth third-party integrations.
- Automation & AI: Extensive use of AI tools like Claude, Copilot, and Codex.
Your role in this journey
As a Data Ops Engineer – Revenue Management, you'll be the engineering backbone that enables our Data Scientists to move from experimentation to production. You'll bridge the gap between data science models and reliable, scalable production systems.
Responsibilities
- Support model deployment and serving: help deploy pricing and demand models into production, building and maintaining APIs and serving infrastructure.
- Build and operate production pipelines: ensure data flows reliably from source to model to output, with proper monitoring and alerting.
- Collaborate cross-functionally: work closely with Data Scientists, Analysts, and Engineering teams to turn prototypes into production-ready solutions.
- Own infrastructure and tooling: set up and maintain the environments, CI/CD pipelines, and infrastructure that the team depends on.
- Ensure operational excellence: implement monitoring, automated testing, and observability across the team's production systems.
- Migrate and productionize POCs: turn experimental code into robust, maintainable Python applications.
- Ensure data quality: maintain consistency and documentation across revenue management metrics and datasets.
Benefits
- Impact: Shape the future of travel with products used by millions of guests and thousands of hosts.
- Learning: Grow professionally in a culture that thrives on curiosity and feedback.
- Great People: Join a team of smart, motivated, and international colleagues who challenge and support each other.
- Technology: Work in a modern tech environment.
- Flexibility: Work in a hybrid setup with 50% in-office time for collaboration, and spend up to 8 weeks a year working from other inspiring locations.
- Perks on Top: Of course, we also offer travel benefits, gym discounts, and other perks to keep you energized.
Experience
- 4+ years of experience in Software Engineering, Data Engineering, DevOps, or MLOps.
- Strong hands-on skills in Python: you write clean, production-quality code.
- Experience with CI/CD, Docker, and infrastructure-as-code (e.g., Terraform).
- Familiarity with cloud platforms (AWS preferred) and deploying services in production.
- Exposure to or interest in ML model deployment (MLflow, SageMaker, or similar) is a strong plus.
- Desire to learn and use cutting-edge LLM tools and agents to improve your and the entire team's productivity.
- A proactive, hands-on mindset: you take ownership, spot problems, and drive solutions forward.
How to apply
If you're excited about this opportunity, please submit your application on our careers page!
To apply, use the employer's original posting:
https://holidu.jobs.personio.com/job/2597559