Kargo creates breakthrough cross-screen ad experiences for the world’s leading brands and publishers. Every day, our 600+ employees bring the power of their creativity and diversity to radically raising the bar on what mobile, CTV, AI, social, and eCommerce can do to wow consumers and build businesses. Now 20 years strong, Kargo has offices in NYC, Chicago, Austin, LA, Dallas, Sydney, Auckland, London and Waterford, Ireland. Humble brag: In 2024, Kargo was recognized as a Best Place to Work by Ad Age and Built In.
Who We Hire
Success takes all kinds. Diversity describes our workforce. Inclusion defines our culture. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, marital status, age, national origin, protected veteran status, disability or other legally protected status. Individuals with disabilities are provided reasonable accommodation to participate in the job application process, perform essential job functions, and receive other benefits and privileges of employment.
Title: Lead Data Engineer
Job Type: Full-time; 3 Days In Office
Job Location: New York, NY
Salary Range: $110,000.00 - $170,000.00 (On-Target Earnings)
The Opportunity
At Kargo, we are at a pivotal juncture. As we navigate international expansion and protect ourselves from AdTech’s over-reliance on cookies, we are intensifying our focus on audience and contextual targeting solutions. We seek a proactive Lead Data Engineer who thrives on transforming ambiguity into clarity. Your primary role will be to architect, design, and optimize the data structures and processes integral to enabling fast and accurate targeting. While maintaining a hands-on approach, you will also lead and mentor a dedicated team of data engineers, ensuring the delivery of high-quality, scalable solutions.
The Daily To-Do
- Oversee a team of data engineers, ensuring adherence to high standards, project delivery, and continuous skill development
- Engage in deep-dive sessions with a diverse range of technical and non-technical stakeholders, translating business needs into actionable technical requirements in Jira/Confluence
- Collaborate with data vendors and our data partnership team to evaluate and onboard new datasets that will enhance our targeting abilities. You and your team will be responsible for building and maintaining scalable and cost-efficient data ingestion pipelines (Python, Airflow, Snowflake, AWS, Docker, Kubernetes)
- Spearhead the architecture, design, and enhancement of our targeting solutions and workflows, ensuring quality, efficiency, and transparency of the processes
- Create comprehensive documentation detailing system architecture, interdependencies, data flow diagrams, and best practices for future reference and onboarding purposes
- Monitor data storage and processing costs, seeking ways to optimize without compromising on quality
Qualifications
- Proven expertise in owning, designing, and maintaining large-scale, interdependent systems
- Experience in planning deployment and leading a team to complete development on such systems
- Mastery of Python and experience with building complex pipelines in Airflow
- Proficiency in crafting SQL queries for Snowflake, with an understanding of cost and performance nuances
- Experience with Docker/Kubernetes and the AWS ecosystem
- Familiarity interacting with third-party APIs within ETL pipelines
- Familiarity with Agile methodologies and experience working with Jira/Confluence
- Comfortable communicating and presenting complex topics and plans to people in the organization, regardless of their technical skill level
- Comfortable with requirements gathering, cross-team design brainstorming sessions, and ensuring consensus among stakeholders
- An innate drive to take initiative and ownership of projects, the ability to transform uncertainty into actionable strategies, and deep-seated pride in delivering meticulous, high-quality work
Follow Our Lead
- Big Picture: kargo.com
- The Latest: Instagram (@kargomobile) and LinkedIn (Kargo)