Anchorage Digital

Asset Data Engineer

Remote · Senior · Full-time · New York City

First indexed 17 Apr 2026

Description

Join the Asset Data team and build the streaming data infrastructure that powers Anchorage's digital asset platform. You'll design systems that ingest real-time blockchain and market data from diverse providers, transforming raw feeds into certified, trusted data products.

We're creating contract-governed supply chains that let us onboard new assets and providers quickly while maintaining the low-latency, high-availability SLOs our business depends on.

Responsibilities:

  • Build streaming data pipelines for blockchain data (on-chain transactions, staking rewards, validator info) and market data (prices, trades, order books)
  • Design and implement data contracts and validation gates that enforce quality and schema compliance at ingestion points
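The data-contract and validation-gate pattern in the second bullet can be sketched in a few lines. Everything here is a hypothetical illustration of the general technique, not Anchorage's actual schema or code; the contract fields and function names are assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical data contract for a price tick: required fields and types.
# Field names are illustrative assumptions, not a real Anchorage schema.
PRICE_TICK_CONTRACT = {
    "asset": str,
    "price": float,
    "timestamp_ms": int,
    "source": str,
}

@dataclass
class GateResult:
    ok: bool
    errors: list = field(default_factory=list)

def validate_at_ingestion(record: dict, contract: dict = PRICE_TICK_CONTRACT) -> GateResult:
    """Reject records that violate the contract before they enter the pipeline."""
    errors = []
    for name, expected_type in contract.items():
        if name not in record:
            errors.append(f"missing field: {name}")
        elif not isinstance(record[name], expected_type):
            errors.append(
                f"{name}: expected {expected_type.__name__}, "
                f"got {type(record[name]).__name__}"
            )
    return GateResult(ok=not errors, errors=errors)

# A well-formed tick passes the gate; a malformed one is rejected with reasons.
good = validate_at_ingestion(
    {"asset": "BTC", "price": 64250.5, "timestamp_ms": 1713300000000, "source": "vendor_a"}
)
bad = validate_at_ingestion({"asset": "BTC", "price": "64250.5"})
```

Placing a gate like this at every ingestion point keeps schema violations from one provider out of downstream, certified data products; rejected records would typically be routed to a dead-letter queue for inspection.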

Complexity and Impact of Work:

  • Collaborate on designing the architecture for standardized ingestion patterns that enable rapid onboarding of new blockchains and market data feeds
  • Establish redundancy and failover patterns to meet Tier 1 availability and freshness SLOs for critical data products

Organizational Knowledge:

  • Collaborate with Protocols, Trading, and Custody teams to understand their data needs and design certified data products with clear SLAs
  • Partner with Data Platform team on orchestration, storage patterns (BigLake), and metadata management (Atlan)

Communication and Influence:

  • Advocate for contract-governed data supply chains and help establish engineering standards for producer patterns across the org
  • Contribute to architectural decisions and help mature the team's practices around observability, testing, and operational excellence

Requirements:

  • 5-7+ years building streaming or high-throughput data systems: You have experience designing and operating production data pipelines that handle large volumes with low latency and high reliability
  • Solid backend engineering skills: You're proficient in Go or Python and have built services that interact with streaming infrastructure (Kafka, pub/sub, WebSockets, REST APIs)
  • Blockchain data familiarity: You understand blockchain concepts and are comfortable working with on-chain data (transactions, events, staking, validators) across multiple chains with different data models
  • Data engineering adjacent skills: You're comfortable with data transformation patterns, schema evolution, and working with cloud data warehouses (BigQuery) and storage systems (GCS, BigLake)
  • Operational mindset: You have experience deploying and operating services on cloud platforms (preferably GCP), with strong practices around monitoring, alerting, and incident response

Preferred Qualifications:

  • Staking data expertise: You've worked with staking rewards, validator data, or proof-of-stake blockchain infrastructure
  • Market data systems: You've built systems that ingest and process market data (prices, trades, order books) from exchanges or data vendors
  • Infrastructure as code: You have experience with Terraform, Kubernetes, and modern DevOps practices

This listing is enriched and indexed by YubHub. To apply, use the employer's original posting: https://jobs.lever.co/anchorage/82139746-fb0e-44b9-bbb6-ae078e5d251a