Specialist Solutions Architect
Data Engineering, Public Sector
Posted on 5/13/2023
Locations
United States
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Apache Spark
AWS
Apache Kafka
Data Analysis
Google Cloud Platform
Hadoop
Jenkins
Java
Microsoft Azure
REST APIs
Scala
SQL
Python
Requirements
  • Must be a U.S. citizen
  • Eligible and willing to be processed for a U.S. government clearance
  • 5+ years of experience in a customer-facing technical role, with expertise in at least one of the following:
    • Software Engineer/Data Engineer: data ingestion; streaming technologies such as Spark Streaming and Kafka (see the sketch after this list); and performance tuning, troubleshooting, and debugging of Spark or other big data solutions
    • Data Applications Engineer: building data-driven use cases such as risk modeling, fraud detection, and customer lifetime value
  • Extensive experience building data pipelines using big data technologies such as Spark/Delta or Hadoop
  • Experience maintaining and extending production data systems to meet evolving, complex needs
  • Production programming experience in SQL and Python, Scala, or Java
  • Deep specialty expertise in at least one of the following areas:
    • Experience scaling big data workloads to be performant and cost-effective
    • Experience with development tools for CI/CD (e.g., Jenkins), unit and integration testing, automation and orchestration, REST APIs, BI tools, and SQL interfaces
    • Experience designing data solutions on cloud infrastructure and services such as AWS, Azure, or GCP, following best practices for cloud security and networking
    • Experience implementing industry-specific data analytics use cases
  • [Desired] Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research)
  • Ability to travel up to 30% when needed
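As a rough illustration of the streaming and pipeline skills listed above, the sketch below shows a minimal Spark Structured Streaming job that reads JSON events from a Kafka topic and appends them to a Delta table. It is not part of the posting; the broker address, topic name, event schema, and storage paths are hypothetical placeholders, and it assumes an environment (such as a Databricks cluster) where the Kafka connector and Delta Lake are available.

# Minimal sketch: ingest JSON events from Kafka with Spark Structured
# Streaming and land them in a Delta table. All names and paths below
# are placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

# Hypothetical event schema.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream (placeholder broker and topic).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Parse the Kafka value bytes as JSON into typed columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append continuously to a Delta table (placeholder paths).
query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .outputMode("append")
    .start("/tmp/delta/events")
)
query.awaitTermination()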
Responsibilities
  • Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
  • Architect production-level data pipelines, including end-to-end pipeline load performance testing and optimization
  • Become a technical expert in an area such as data lake technology, big data streaming, or big data ingestion and workflows
  • Assist Solution Architects with the more advanced aspects of the technical sale, including custom proof-of-concept content, workload sizing estimates, and custom architectures
  • Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
  • Contribute to the Databricks Community
Databricks
1,001-5,000 employees
Unified, open platform for enterprise data
Company Overview
Databricks is on a mission to simplify and democratize data and AI, helping data teams solve the world’s toughest problems. As the world’s first and only lakehouse platform in the cloud, Databricks combines the best of data warehouses and data lakes to offer an open and unified platform for data and AI.
Benefits
  • Extended health care including dental and vision
  • Life/AD&D and disability coverage
  • Equity awards
  • Flexible Vacation
  • Gym reimbursement
  • Annual personal development fund
  • Work headphones reimbursement
  • Employee Assistance Program (EAP)
  • Business travel accident insurance
  • Paid Parental Leave