Specialist Solutions Architect
Posted on 6/14/2022
INACTIVE
Locations
London, UK
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Apache Spark
AWS
Data Analysis
Data Science
Development Operations (DevOps)
Google Cloud Platform
Hadoop
Java
Management
Microsoft Azure
R
REST APIs
Sales
Scala
SQL
Python
NoSQL
Requirements
  • Pre-sales or post-sales experience working with external clients across a variety of industry markets
  • You will have experience in a customer-facing technical role with expertise in at least one of the following:
  • Data Engineer: query tuning, performance tuning, troubleshooting, and debugging Spark or other big data solutions
  • SQL, DWH, Lakehouse: data modelling, BI tooling, MPP/DWH, and data governance; building use cases that put data to work, such as risk modelling, fraud detection, and customer lifetime value
  • Cloud Architect: design, deploy, and automate cloud architecture, including security, infrastructure, and identity management (DevOps, CI/CD)
  • Experience with design and implementation of big data technologies such as Spark/Delta, Hadoop, NoSQL, MPP, OLTP, and OLAP
  • Maintain and extend production data systems to evolve with complex needs
  • Production programming experience in Python, R, Scala or Java
  • Deep Specialty Expertise in at least one of the following areas:
  • Experience scaling big data workloads that are performant and cost-effective
  • Experience with development tools for CI/CD, unit and integration testing, automation and orchestration, REST APIs, BI tools, and SQL interfaces
  • Experience designing data solutions on cloud infrastructure and services, such as AWS, Azure, or GCP, using best practices in cloud security and networking
  • Experience with ML concepts covering Model Tracking, Model Serving and other aspects of productionizing ML pipelines in distributed data processing environments like Apache Spark, using tools like MLflow
  • Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent experience gained through work
  • This role can be remote, but we prefer that you are located in the job listing area and can travel up to 30% when needed
  • Nice to have: Databricks Certification
Responsibilities
  • Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
  • Architect production level workloads, including end-to-end pipeline load performance testing and optimisation
  • Provide technical expertise in an area such as data management, cloud platforms, data science, machine learning, or architecture
  • Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing, and custom architectures
  • Improve community adoption (through tutorials, training, hackathons and conference presentations)
  • Contribute to the Databricks Community
Databricks

1,001-5,000 employees

Unified, open platform for enterprise data
Company Overview
Databricks is on a mission to simplify and democratize data and AI, helping data teams solve the world’s toughest problems. As the world’s first and only lakehouse platform in the cloud, Databricks combines the best of data warehouses and data lakes to offer an open and unified platform for data and AI.
Benefits
  • Extended health care including dental and vision
  • Life/AD&D and disability coverage
  • Equity awards
  • Flexible Vacation
  • Gym reimbursement
  • Annual personal development fund
  • Work headphones reimbursement
  • Employee Assistance Program (EAP)
  • Business travel accident insurance
  • Paid Parental Leave