Full-Time

Frontend Engineer

Onehouse

51-200 employees

Fully managed data lakehouse service platform

No salary listed

Bengaluru, Karnataka, India

Hybrid

Category
Software Engineering
Required Skills
gRPC
React.js
TypeScript
Next.js
Firebase
Requirements
  • 3+ years of experience as a frontend engineer, developing and operating frontend applications
  • Experience with React and gRPC
  • Operational excellence in monitoring/deploying/testing frontend applications
  • Great problem-solving skills and a keen eye for detail
  • Quick decision-making; self-driven and able to balance feature development and tech debt
  • Clear communication skills
Responsibilities
  • You aren’t just building a UI; you are building the command center for the transactional data lake. Your work sits at the intersection of complex distributed systems and high-performance user interfaces
  • Architecting AI-Native Data Experiences: you will lead the development of generative AI interfaces that allow users to talk to their data infrastructure. Imagine building natural language interfaces that auto-generate schemas or debug complex data pipelines in real-time
  • Visualizing Massive Scale: you’ll solve the 'dense data' problem, crafting performant, intuitive visualizations for petabyte-scale data flows using React, Next.js and TypeScript, ensuring that managing a global data lake feels as simple as managing a spreadsheet
  • Bridging the gRPC Gap: you will help define how we stream complex infrastructure states to the frontend, leveraging gRPC-web to create a snappy, low-latency experience that feels "local" even when managing remote cloud clusters
  • Defining a Category: as an early hire, your DNA will be in the product. You’ll influence the UX patterns of a new industry category, moving the needle from "clunky enterprise software" to "elegant cloud-native platform"
  • Getting Started: you start with a sync with your lead engineers, perhaps discussing how to build a new feature into the product with the right approach
  • Deep Work Phase: you’re in the zone with Next.js and gRPC, building out sophisticated features. You’re likely writing new shared components to ensure our UI is as robust as our backend
  • Cross-Functional Collaboration: you will work with EM/PMs to discuss upcoming features and define next milestones
  • State Management & Logic: you spend time optimizing our Firebase integration for real-time notifications or wiring up gRPC services to ensure the frontend is perfectly synced with the underlying infrastructure
  • Direct Impact: you ship code that immediately improves how our customers interact with transactional data
  • On-call: you will build alerts, logging, and sophisticated tooling to make sure the customer experience never degrades
Desired Qualifications
  • Experience with Firebase or equivalent DB
  • Experience working on a SaaS product

Onehouse provides a fully managed data lakehouse service that runs on open storage and supports table formats like Apache Hudi, Apache Iceberg, and Delta Lake. It continuously ingests and optimizes data with automated tasks such as clustering, compaction, masking, and encryption, plus easy change data capture for up-to-date data. Pricing is usage-based, claiming data-management cost reductions of 50% or more versus traditional cloud warehouses and ETL tools; deployment is quick with minimal engineering. The goal is to simplify and reduce costs while enabling fast access for BI, real-time analytics, and AI/ML workloads.

Company Size

51-200

Company Stage

Series B

Total Funding

$68M

Headquarters

San Francisco, California

Founded

2021

Simplify Jobs

Simplify's Take

What believers are saying

  • $35M Series B in June 2024 funds sales expansion through 2026.
  • Vector embeddings generator targets AI/ML vector management at scale.
  • LakeView free tool converts users to paid via observability.

What critics are saying

  • Databricks Delta Lake dominance erodes multi-format strategy in 12 months.
  • Snowflake Iceberg integration commoditizes managed lakehouse in 6 months.
  • Databricks acquires Onehouse for $200-500M in 12-36 months.

What makes Onehouse unique

  • Onehouse builds managed lakehouse on Apache Hudi, Iceberg, and Delta Lake.
  • Apache XTable gains Microsoft and Google support for format interoperability.
  • Compute Runtime accelerates queries 30x across cloud engines.

Benefits

Health Insurance

Dental Insurance

Vision Insurance

401(k) Retirement Plan

Company Equity

Unlimited Paid Time Off

Paid Sick Leave

Paid Holidays

Remote Work Options

Meal Benefits

Parental Leave

Growth & Insights and Company News

Headcount

6 month growth

4%

1 year growth

4%

2 year growth

4%
Newswire
Jan 17th, 2025
Onehouse Announces Compute Runtime To Accelerate Workloads Across All Leading Cloud Query Engines

Advanced runtime optimizations - delivered centrally on top of open formats - accelerate queries 2x to 30x and slash customer cloud infrastructure bills by 20 to 80 percent

VentureBeat
Jan 16th, 2025
Apache Hudi Creator Onehouse Debuts Specialized Runtime Promising 30X Faster Data Lakehouse Queries

As organizations store increasing volumes of information in data lakehouses, queries can become slower and more costly. That is a challenge Onehouse is looking to help solve. The data lakehouse technology vendor is a leading contributor to the open-source Apache Hudi and Apache XTable data lake table formats. Today, the company is advancing its vision of a universal data lakehouse with its new Onehouse Compute Runtime (OCR), which offers the promise of queries accelerated up to 30x. That speed can potentially lead to dramatic cost savings of up to 80%, according to Onehouse. There are multiple open data lake table formats in use today, including Apache Hudi, Apache Iceberg and Delta Lake

Newswire
Aug 22nd, 2024
Onehouse Launches Vector Embeddings Generator for Managing Vectors at Scale on the Data Lakehouse

Onehouse is launching a vector embeddings generator to automate embeddings pipelines as a part of its managed ELT cloud service.

VentureBeat
Jun 26th, 2024
Onehouse secures $35M to advance open data lakehouse technology

Onehouse is continuing to build out open source Apache Hudi data lake technology, data lake table format interoperability with Apache XTable, alongside the company’s own Universal Data Lakehouse.

SiliconANGLE Media
Jun 26th, 2024
Onehouse Raises $35M, Launches New Products

Managed data lakehouse company Onehouse Inc. raised $35 million in a Series B funding round, bringing its total funding to $68 million. The company also launched two new products to enhance lakehouse performance and reduce cloud costs. Onehouse, which supports Apache Hudi, Apache Iceberg, and Delta Lake, focuses on openness and interoperability. CEO Vinoth Chandar highlighted the platform's ability to optimize data across Snowflake and Databricks, emphasizing its open and interoperable framework.