Delivery Solutions Architect
HLS (Healthcare & Life Sciences)
Posted on 11/9/2023
INACTIVE
Databricks

5,001-10,000 employees

Unified, open platform for enterprise data
Company Overview
Databricks is on a mission to simplify and democratize data and AI, helping data teams solve the world’s toughest problems. As the world’s first and only lakehouse platform in the cloud, Databricks combines the best of data warehouses and data lakes to offer an open and unified platform for data and AI.
Data & Analytics

Company Stage: Series I
Total Funding: $4.7B
Founded: 2013
Headquarters: San Francisco, California

Growth & Insights
Headcount
  • 6 month growth: 12%
  • 1 year growth: 39%
  • 2 year growth: 108%
Locations
Remote in USA
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Apache Spark
AWS
Data Science
Google Cloud Platform
Microsoft Azure
Sales
Scala
SQL
Python
Categories
Software Engineering
Requirements
  • 7+ years of experience in technical project/program delivery within the domain of Data and AI
  • Programming experience in Python, SQL or Scala
  • Experience in a customer-facing pre-sales, technical architecture, customer success, or consulting role
  • Understanding of solution architecture related to distributed data systems
  • Ability to attribute business value and outcomes to specific project deliverables
  • Technical program or project management experience including account, stakeholder, and resource management accountability
  • Experience resolving complex and important escalations with senior customer executives
  • Experience conducting open-ended discovery workshops, creating strategic roadmaps, conducting business analysis, and managing delivery of complex programs/projects
  • Track record of overachievement against quota, goals, or similar objective targets
Responsibilities
  • Engage with the Solutions Architect to understand the full Use Case Demand Plan for prioritized customers
  • Lead the Post-Technical Win technical account strategy and investment plan for the majority of Databricks Use Cases within our most strategic accounts
  • Be the accountable technical leader assigned to specific Use Cases and customer(s) across multiple selling teams and internal stakeholders, creating certainty from uncertainty and driving onboarding, enablement, success, go-live, and healthy consumption of the workloads the customer has decided to run on Databricks
  • Be the first contact for any technical issues or questions related to production/go live status of agreed upon Use Cases within an account, oftentimes servicing multiple use cases within the largest and most complex organizations
  • Leverage Shared Services resources (User Education, Onboarding/Technical Services, and Support), and escalate to Level 400/500 technical experts, for tasks that are beyond your scope of activities or expertise
  • Create, own, and execute a point-of-view as to how key use cases can be accelerated into production, bringing EM/PM in to prepare Professional Services proposals
  • Navigate Databricks Product and Engineering teams for New Product Innovations, Private Previews, and Upgrade needs
  • Develop a mutual success plan that covers all activities of all customer-facing technical roles and teams across the following work streams:
    • Main use cases moving from 'win' to production
    • Enablement/user growth plan
    • Product adoption (strategy and activities to increase adoption of Databricks' Lakehouse vision)
    • Organic needs for current investment (e.g. cloud cost control, tuning & optimization)
    • Executive and operational governance
  • Provide internal and external updates - KPI reporting on the status of usage and customer health, covering investment status, important risks, product adoption, and use case progression - to your Technical GM
Desired Qualifications
  • Experience with Data and AI technologies such as Apache Spark, Delta Lake, and MLflow
  • Experience in the Healthcare and Life Sciences industry
  • Experience with cloud platforms such as AWS, Azure, or GCP
  • Experience with big data technologies and distributed computing
  • Experience with data engineering and data governance
  • Experience with machine learning and data science workflows
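
Illustrative only (not part of the posting): a minimal sketch of how the Apache Spark, Delta Lake, and MLflow technologies named above fit together, assuming a Databricks or Delta-enabled PySpark environment with MLflow installed; the file paths, app name, and run name are hypothetical.

    from pyspark.sql import SparkSession
    import mlflow

    # Start (or reuse) a Spark session; on Databricks a session named `spark` is already provided.
    spark = SparkSession.builder.appName("hls-example").getOrCreate()

    # Read raw records and persist them as a Delta Lake table (hypothetical paths).
    df = spark.read.json("/tmp/raw_events.json")
    df.write.format("delta").mode("overwrite").save("/tmp/delta/events")

    # Track a simple experiment run with MLflow.
    with mlflow.start_run(run_name="baseline"):
        mlflow.log_param("source_path", "/tmp/raw_events.json")
        mlflow.log_metric("row_count", df.count())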