Senior Data Operations Engineer
Locations
Remote • United States
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
AWS
Apache Kafka
Data Analysis
Data Science
Development Operations (DevOps)
Google Cloud Platform
C/C++/C#
Java
Airflow
Segment
Snowflake
SQL
Terraform
Python
Sentry
Go
Looker
Ansible
Chef
Datadog
Requirements
- A leader for your teammates and a driver of large cross-functional projects within your organization
- Familiarity or expertise using and maintaining modern data platform technologies and services like Kafka, Airflow, Snowflake, Segment, Stitch, Fivetran, dbt, Looker, etc
- Familiarity or expertise using and maintaining ML tooling and platforms like AWS SageMaker, GCP Vertex AI, BentoML, MLflow, Kubeflow, etc
- Experience doing infrastructure-as-code using tools like Terraform, Ansible, Chef, etc., and a pathological inclination towards automation and CI/CD
- Full lifecycle ownership up through production, and experience with observability and monitoring tools like Datadog, Honeycomb, Sentry, etc
- Experience architecting and implementing data governance processes and tooling (such as data catalogs, lineage tools, role-based access control, PII handling)
- Strong coding ability in Python (preferred) or other languages like Java, C#, Go, etc., and a solid grasp of SQL fundamentals
Responsibilities
- Data evangelist: Bring, build, and drive data culture and best practices, enabling the product and engineering org to build better, more reliable, and more secure data pipelines and data-driven products, powering use cases spanning internal and customer-facing analytics, data science / ML needs, and in-app experiences
- DevEx delighter: Use tooling and automation to deliver a developer experience that enables teams to quickly and easily build out data products following mature SDLC principles
- System designer: Passion for building systems, platforms, and tools that people use. You'll draw on your expertise in the broader data ecosystem and the modern architectures, approaches, and emerging technologies in this space, built on a strong foundation in the fundamentals of building distributed systems in the cloud
- Act as an owner: It may start with a proof of concept, but it's not done until it's in production. Adept at moving projects forward and able to unblock them regardless of where we are in the development lifecycle
- Do less, deliver more: Familiar with the terms YAGNI and yak shaving? Focus your efforts on high-impact initiatives that really move the needle
- Impress yourself: We hold ourselves to quality above and beyond something that “just gets it done.” Each system or line of code is an opportunity to demonstrate craftspersonship
- Collaborate without ego: Work together with teams to drive cross-team and cross-functional technical roadmaps, and be willing to take on roles, small or large, to further the mission at hand
Desired Qualifications
- 4+ years of relevant data engineering, data infrastructure, DataOps / MLOps, DevOps, SRE, or general systems engineering experience (high growth startup experience is a plus)
Company Overview
BetterUp's mission is to help people everywhere live their lives with greater clarity, purpose, and passion. The BetterUp experience brings together world-class coaching, AI technology, and behavioral science experts to deliver change at scale — improving individual resilience, adaptability, and effectiveness.
Benefits
- Medical, dental, & vision benefits
- Flexible Time Off
- Paid parental leave
- Unlimited coaching
- Wellness programs
- Education & learning stipend
- Volunteer days
Company Core Values
- Do less, deliver more
- Extreme ownership
- Work to learn
- Bias toward action