Midlevel Backend Engineer
Updated on 2/1/2024
Nightfall

51-200 employees

Cloud data loss prevention for AI tools and apps
Company Overview
Nightfall AI is a leading cloud data loss prevention company, offering a comprehensive suite of tools that protect sensitive data across generative AI tools, SaaS apps, and custom apps. The company promotes strong data security hygiene, providing custom security notifications and coaching that train employees on best-practice security policies. Nightfall's competitive advantage lies in its high-accuracy AI detection, automated remediation of sensitive data, and industry-leading secrets detection, which together ensure robust protection of company and customer information while minimizing false positives and manual security tasks.
AI & Machine Learning
Data & Analytics
B2B

Company Stage

Series B

Total Funding

$60.3M

Founded

2018

Headquarters

San Francisco, California

Growth & Insights
Headcount

6 month growth

-2%

1 year growth

0%

2 year growth

50%
Locations
San Francisco, CA, USA
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Kubernetes
Python
React.js
MySQL
NoSQL
Node.js
SQL
Apache Kafka
Java
Docker
AWS
Terraform
Redis
Apache Hive
Hadoop
Data Analysis
Cassandra
Categories
DevOps & Infrastructure
Software Engineering
Requirements
  • Expertise in one or more systems or high-level programming languages (e.g. Python, Go, Java, C++) and the eagerness to learn more.
  • Experience running scalable (thousands of RPS) and reliable (three nines) systems.
  • Experience developing complex software systems that scale to substantial data volumes or millions of users, with production-quality deployment, monitoring, and reliability.
  • Experience with large-scale distributed storage and database systems (SQL or NoSQL, e.g. MySQL, Cassandra).
  • Ability to decompose complex business problems and lead a team in solving them.
  • Data processing: experience building and maintaining large-scale and/or real-time complex data processing pipelines using Kafka, Hadoop, Hive, Storm, or ZooKeeper.
  • 3+ years of experience.
Responsibilities
  • Building highly-available and secure authentication and API services
  • Maintaining and evolving mission-critical internal databases and services
  • Optimizing and operating high-volume, auto-scaling streaming data services
  • Instrumenting streaming data services for visibility into utilization per customer
  • Writing and maintaining documentation about internal and public services
Desired Qualifications
  • Experience with Go, Node.js, React, Python, Cassandra, Redis, Terraform, Docker, Kubernetes, AWS, Kafka, Envoy
  • Experience in cybersecurity or data protection
  • Experience with machine learning and data classification