Updated on 11/30/2023
Automated platform for modern, scalable TV advertising
Madhive is a leading force in modern TV advertising technology, offering a self-service platform that streamlines ad buying and gives advertisers greater simplicity, accountability, and control. Its robust infrastructure processes 260 billion ad opportunities daily, ensuring precise, brand-safe audience targeting at scale, and is trusted by major content owners, creators, and distributors, including FOX and TEGNA's Premion. With a customizable, full-stack platform that integrates with existing data and systems, Madhive is uniquely positioned to help partners capitalize on the shift from linear to digital advertising.
Data & Analytics
New York, New York
Growth & Insights
6 month growth: ↑13%
1 year growth: ↑29%
2 year growth: ↑104%
Remote in USA
Data Structures & Algorithms
Development Operations (DevOps)
Google Cloud Platform
Natural Language Processing (NLP)
AI & Machine Learning
Requirements
- 10+ years of experience building robust, internet-scale backend systems, including around 5 years specializing in machine learning model development and deployment.
- Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or a related field. Ph.D. is a plus.
- Experience with core ML technologies such as TensorFlow, JAX, BigQuery, Bigtable, Dataflow, Kubernetes, and TFX Serving, and the ability to apply them in our backend systems.
- Strong understanding of machine learning algorithms, deep learning, and neural networks.
- Experience with data preprocessing, feature engineering, and model evaluation techniques.
- Ability to leverage Google Cloud Platform (GCP) as our exclusive infrastructure provider, while also collaborating with a language-agnostic engineering team.
- Strong communication and collaboration skills.
- Experience mentoring junior engineers and supporting the hiring and evaluation of candidates.
Responsibilities
- Lead the design, development, and optimization of machine learning models and algorithms for various facets of our platform, including sophisticated bidding strategies, search & data retrieval functionalities, and user experience enhancements.
- Work closely with data engineers and data scientists to gather and preprocess data, ensuring its quality and relevance for model training.
- Identify and create meaningful features from raw data to enhance model performance and accuracy.
- Develop robust evaluation metrics and methodologies to assess the performance of machine learning models and fine-tune them as needed.
- Collaborate with DevOps and software engineering teams to deploy machine learning models into production environments, ensuring scalability and reliability.
- Stay up-to-date with the latest advancements in machine learning and AI research, and apply innovative techniques to solve complex problems.
- Provide guidance and mentorship to junior machine learning engineers, helping them grow their skills and capabilities.
- Collaborate with cross-functional teams, including data scientists, software engineers, product managers, and domain experts, to understand business requirements and deliver effective solutions.
- Maintain clear and organized documentation of machine learning models, algorithms, and processes for knowledge sharing and future reference.
- Participate in an on-call rotation to provide timely response and support for engineering issues outside regular business hours, ensuring the continuous operation of critical systems and infrastructure.
Nice to Have
- Ph.D. in Computer Science, Machine Learning, Data Science, or a related field.
- Experience with distributed computing frameworks such as Apache Spark or Hadoop.
- Experience with natural language processing (NLP) and computer vision.
- Experience with cloud-based machine learning services such as AWS SageMaker or Azure Machine Learning.
- Experience with containerization technologies such as Docker and Kubernetes.
- Experience with version control systems such as Git.
- Experience with agile development methodologies.
- Experience with streaming data processing frameworks such as Apache Kafka or Apache Flink.