Description
We are seeking a skilled Cloud DevOps Engineer to join our Commodities Technology team. As a Cloud DevOps Engineer, you will work closely with quants, portfolio managers, risk managers, and other engineers to develop data-intensive and multi-asset analytics for our Commodities platform.
Responsibilities:
- Collaborate with cross-functional teams to gather requirements and user feedback
- Design, build, and refactor robust software applications with clean and concise code following Agile and continuous delivery practices
- Automate system maintenance tasks, end-of-day processing jobs, data integrity checks, and bulk data loads/extracts
- Stay up-to-date with industry trends, new platforms, and tools, and develop a business case to adopt new technologies
- Develop new tools and infrastructure using Python (Flask/FastAPI) or Java (Spring Boot) with a relational data backend on AWS (Aurora/Redshift/Athena/S3)
- Support users and operational flows for quantitative risk, senior management, and portfolio management teams using the tools developed
Qualifications:
- Advanced degree in computer science or another scientific field
- 3+ years of experience with CI/CD tools such as TeamCity, Jenkins, Octopus Deploy, or ArgoCD
- Experience designing, implementing, and supporting AWS cloud infrastructure
- Experience with multiple AWS services
- Experience deploying cloud infrastructure as code using Terraform or CloudFormation
- Knowledge of Python (Flask/FastAPI/Django)
- Demonstrated expertise in containerizing applications and orchestrating them in Kubernetes
- Experience with at least one monitoring/observability stack (Datadog, ELK, Splunk, Loki, Grafana)
- Strong knowledge of Unix or Linux
- Strong communication skills to collaborate with various stakeholders
- Able to work independently in a fast-paced environment
- Detail-oriented and organized, with thoroughness and strong ownership of work
- Experience working in a production environment
- Some experience with relational and non-relational databases
Nice to have:
- Experience with a messaging middleware platform like Solace, Kafka, or RabbitMQ
- Experience with Snowflake and distributed processing technologies (e.g., Hadoop, Flink, Spark)
This listing is enriched and indexed by YubHub. To apply, use the employer's original posting:
https://mlp.eightfold.ai/careers/job/755955154859