# Senior Data Engineer

**Company**: Yuno
**Location**: Europe
**Work arrangement**: remote
**Experience**: senior
**Job type**: full-time
**Category**: Engineering
**Industry**: Technology

**Apply**: https://jobs.lever.co/yuno/dc30ae7b-9c0f-426f-ae77-c58d9e4f6d6d
**Canonical**: https://yubhub.co/jobs/job_a98d4ace-d27

## Description

We are looking for a Senior Data Engineer to join our high-performing data enablement team. As a Senior Data Engineer, you will play a pivotal role within the Data team that powers Yuno and its payment platform, while helping co-design and implement an architecture that scales with the product and the company.

The stack is modern: StarRocks as our primary analytical layer, Flink for stream processing, dbt for transformation, Airflow for orchestration, and various tooling for surfacing insights.

You'll be working on things that matter and are technically interesting:

* Design and build data pipelines for large volumes of payment data that are performant, reliable, and correct, not just fast.

* Own end-to-end data flows: from ingestion and transformation through to the outputs that Finance, Product, and clients depend on.

* Drive data quality across your domain with tooling.

* Work cross-functionally with Product and Finance, and enable other Engineering teams through a 'consulting'-style model.

* Contribute to how the team works: code review culture, CI/CD standards, ADRs, and incident handling. We're building these practices now, and senior engineers shape them.

* Help onboard and level up engineers around you; there's real opportunity to make an impact here.

## Skills

### Required
- Proven proactivity, technical acumen, and the ability to lead initiatives and deliver projects.
- Experience in defining and evolving data engineering standards, architectural guidelines and governance, ideally within a regulated environment.
- Strong Python and SQL skills.
- Hands-on experience with Spark or Flink in production.
- Hands-on experience with dbt for data transformation.

### Nice to have
- Experience with Airflow for orchestration.
- Experience with Apache Hudi.
- Experience with financial, transactional, or payment data.
