# Partner Solutions Architect

**Company**: Databricks
**Location**: Seoul, South Korea
**Work arrangement**: onsite
**Experience**: senior
**Job type**: full-time
**Category**: Engineering
**Industry**: Technology
**Wikidata**: https://www.wikidata.org/wiki/Q18350420

**Apply**: https://job-boards.greenhouse.io/databricks/jobs/8449860002
**Canonical**: https://yubhub.co/jobs/job_027d391b-e83

## Description

As a Partner Solutions Architect, you will work with Databricks' Consulting and System Integrator (C&SI) partners, your teammates, and the technical and sales team members who work directly with our customers.

You will develop 'technical champions' within our top C&SI partners, providing enablement on technical matters related to the Databricks product. Working with our partners, you will help customers achieve tangible data-driven outcomes with the Databricks Data Intelligence Platform, helping data teams complete projects and integrate our platform into their enterprise ecosystem.

As a member of our team, you will exercise and develop expertise in these areas, using open-source projects such as Apache Spark™, MLflow, and Delta Lake, as well as major public cloud infrastructure and services. You will use this expertise to become a trusted advisor to C&SI partners.

The impact you will have:

- Provide partners with the level of enablement they need to help their clients evaluate and adopt Databricks, including hands-on Apache Spark™ programming and integration with the wider cloud ecosystem

- Engage with the partner technical community by leading workshops, seminars, and meet-ups

- Serve as a Big Data Analytics expert on architecture and design, sharing that expertise with our partner network

- Show expertise by producing creative technical solutions and blog posts

What we look for:

- 5+ years of pre-sales or post-sales experience working with external clients or partners across a variety of industry markets

- Experience in a customer-facing pre-sales or consulting role, with a core strength in either data engineering or data science

- Experience demonstrating technical concepts, including presenting and whiteboarding

- Experience developing architectures within a public cloud (AWS, Azure, or GCP)

- Coding experience in SQL, Python, Scala, or Java

- Expertise in at least one of the following:

  - Data Engineering technologies (e.g., Apache Spark™, Hadoop, Kafka)

  - Data Warehousing (e.g., SQL, OLTP/OLAP/DSS)

  - Data Science and Machine Learning technologies (e.g., pandas, scikit-learn, HPO)

- Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent practical experience

- Native-level fluency in Korean and professional working proficiency in English (both written and verbal)

## Skills

### Required
- Apache Spark
- MLflow
- Delta Lake
- public cloud infrastructure and services
- data engineering
- data science
- SQL
- Python
- Scala
- Java
