Full-Time

Data Ops Senior Data Engineer

Confirmed live in the last 24 hours

Fetch

11-50 employees

On-demand delivery service for local products

Automotive & Transportation
Consumer Goods

Senior

Remote in USA

Category
Data Management
Operations & Logistics
Data Engineering
Data & Analytics
Required Skills
Rust
Python
Apache Spark
SQL
Apache Kafka
Java
CloudFormation
AWS
Go
Terraform
Development Operations (DevOps)
Data Analysis
Requirements
  • Self-starter who can take a project from architecture to adoption.
  • Experience with Infrastructure as Code tools such as Terraform or CloudFormation. Ability to automate the deployment and management of data infrastructure.
  • Familiarity with Continuous Integration and Continuous Deployment (CI/CD) processes. Experience setting up and maintaining CI/CD pipelines for data applications.
  • Proficiency in the software development lifecycle: release fast and improve incrementally.
  • Experience with tools and frameworks for ensuring data quality, such as data validation, anomaly detection, and monitoring. Ability to design systems to track and enforce data quality standards.
  • Proven experience in designing, building, and maintaining scalable data pipelines capable of processing terabytes of data daily using modern data processing frameworks (e.g., Apache Spark, Apache Kafka, Flink, Open Table Formats, modern OLAP databases).
  • Strong foundation in data architecture principles and the ability to evaluate emerging technologies.
  • Proficient in at least one modern programming language (Go, Python, Java, Rust) and SQL.
  • Comfortable presenting and challenging technical decisions in a peer review environment.
  • Undergraduate or graduate degree in a relevant field such as Computer Science, Data Science, or Business Analytics.
Responsibilities
  • Design and implement both real-time and batch data processing pipelines, leveraging technologies like Apache Kafka, Apache Flink, or managed cloud streaming services to ensure scalability and resilience.
  • Create data pipelines that efficiently process terabytes of data daily, leveraging data lakes and data warehouses within the AWS cloud and using technologies like Apache Spark to handle large-scale data processing.
  • Implement robust schema management practices and lay the groundwork for future data contracts. Ensure pipeline integrity by establishing and enforcing data quality checks, improving overall data reliability and consistency.
  • Develop tools to support rapid development of data products. Provide recommended patterns to support data pipeline deployments.
  • Design, implement, and maintain data governance frameworks and best practices to ensure data quality, security, compliance, and accessibility across the organization.
  • Mentor and guide junior engineers, fostering their growth in best practices and efficient development processes.
  • Collaborate with the DevOps team to integrate data needs into DevOps tooling.
  • Champion DataOps practices within the organization, promoting a culture of collaboration, automation, and continuous improvement in data engineering processes.
  • Stay abreast of emerging technologies, tools and trends in data processing and analytics, and evaluate their potential impact and relevance to Fetch’s strategy.

Fetch Delivery Inc. specializes in on-demand delivery services for individual consumers and local businesses in Santa Fe, NM, and Boulder, CO. Customers can order a variety of products, including groceries, pharmacy items, meals, pet supplies, home goods, and more, sourced from hundreds of local businesses. Once an order is placed, Fetch's drivers pick up the items and deliver them directly to the customer's doorstep. The company charges delivery fees for each order and offers promotional discounts to attract new users. Fetch also provides local businesses with a platform to reach more customers without managing their own delivery logistics. Additionally, Fetch offers flexible earning opportunities for drivers, allowing them to work on their own schedule. The goal of Fetch Delivery is to meet the growing consumer demand for convenience and time-saving solutions in the local delivery market.

Company Stage

N/A

Total Funding

$9.2M

Headquarters

Irvine, California

Founded

2013

Simplify Jobs

Simplify's Take

What believers are saying

  • Fetch's recent $50 million funding from Morgan Stanley indicates strong financial backing and potential for rapid expansion.
  • The company's focus on local markets and partnerships with local businesses can lead to strong community ties and customer loyalty.
  • Flexible job opportunities for drivers can attract a dedicated workforce, enhancing service reliability and customer satisfaction.

What critics are saying

  • Operating in a highly competitive market with giants like DoorDash and Instacart could limit Fetch's market share and growth potential.
  • The reliance on local businesses means that any economic downturns in these areas could directly impact Fetch's operations and revenue.

What makes Fetch unique

  • Fetch Delivery focuses on hyper-local markets like Santa Fe and Boulder, providing a personalized and community-centric service that larger competitors may overlook.
  • The company partners with a wide range of local businesses, offering a diverse product selection from groceries to hardware, which sets it apart from more specialized delivery services.
  • Fetch's flexible earning opportunities for drivers create a strong community engagement and local employment, unlike gig economy giants that often face criticism for their labor practices.

Benefits

Stock options

401k match

Medical, dental, & vision

Pet insurance

Education reimbursement

Flexible PTO

Parental leave

Flexible work schedule

Hybrid work environment