Software Engineer
Data, Remote
Updated on 11/15/2023
Risk detection platform
Company Overview
Reality Defender is on a mission to help companies identify deepfake audio, video, and images. Reality Defender's API and web app provide real-time scanning, risk scoring, and PDF report cards.
AI & Machine Learning
Cybersecurity
B2B
Company Stage
N/A
Total Funding
$20.6M
Founded
2018
Headquarters
New York, New York
Growth & Insights
Headcount
6 month growth: ↑ 28%
1 year growth: ↑ 68%
2 year growth: ↑ 285%
Locations
Remote
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
AWS
Data Analysis
Data Science
Microsoft Azure
MongoDB
Pandas
PyTorch
Python
TypeScript
NoSQL
Categories
AI & Machine Learning
Software Engineering
Requirements
- We encourage candidates who may not meet all the specified requirements to still apply. We value diverse perspectives and skills, and believe that unique experiences can contribute significantly to our team. If you are passionate about the role and confident in your ability to make a meaningful impact, we welcome your application. Your enthusiasm, adaptability, and potential for growth are equally important to us. Please use your cover letter to elaborate on how your background and experience make you an ideal fit for this role!
- Required:
- BS in Computer Science and at least 3 years of work experience in software/data science
- Proficient with Python, Node.js, TypeScript, and NoSQL databases (MongoDB, DynamoDB, etc.)
- Interest in data exploration, visualization, cleaning, and analytics for real-world data modeling
- Solid understanding of linear algebra, statistics, and deep learning concepts
- Nice to have:
- Experience working with very large databases and data/deep learning libraries such as Pandas, PyTorch, and PySpark
- Experience working with audio, visual, and/or text datasets and models
- Experience with AWS, Google Cloud, and Azure
- Highly organized and detail-oriented, with a proven ability to thrive under deadline pressure
Responsibilities
- Build scalable datasets and their delivery pipelines
- Develop at-scale data extraction, cleaning, and labeling tools
- Interface closely with the AI team on deep learning model training and evaluation
- Automate data augmentation, quality control, and content moderation