Big Data Solutions Architect (Professional Services)

Databricks
Onsite · Senior · Full-time · Paris, France

First indexed 18 Apr 2026

Description

As a Big Data Solutions Architect in our Professional Services team, you will work with customers on short- to medium-term engagements to solve their big data challenges using the Databricks platform.

You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, delivering training, and performing other technical tasks to help customers get the most value out of their data.

Solutions Architects are billable and know how to complete projects according to specification with excellent customer service.

You will report to the regional Manager/Lead.

Key responsibilities include:

  • Working on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-tos, and productionizing customer use cases
  • Working with engagement managers to scope a variety of professional services work with input from the customer
  • Guiding strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications
  • Consulting on architecture and design, and bootstrapping or implementing customer projects, leading to customers' successful understanding, evaluation, and adoption of Databricks
  • Providing an escalated level of support for customer operational issues
  • Working with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer's needs
  • Working with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues

What we look for:

  • 6+ years of experience in data engineering, data platforms, and analytics
  • Strong expertise in data warehousing concepts, architecture, and migration strategies
  • Comfortable writing code in Python, PySpark, or Scala
  • Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one
  • Deep experience in distributed computing with Apache Spark™ and knowledge of Spark runtime internals
  • Familiarity with CI/CD for production deployments
  • Working knowledge of MLOps
  • Design and deployment of performant end-to-end data architectures
  • Experience with technical project delivery, including managing scope and timelines
  • Documentation and whiteboarding skills
  • Experience working with clients and managing conflicts
  • Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects
  • Data Science expertise is a nice-to-have
  • Willingness to travel to customers 10-20% of the time
  • Databricks Certification

This listing is enriched and indexed by YubHub. To apply, use the employer's original posting: https://job-boards.greenhouse.io/databricks/jobs/8482697002