Data Engineer III
ZoomInfo

1,001-5,000 employees

Platform providing real-time business data and insights
Company Overview
ZoomInfo, a leading go-to-market platform, gives businesses a competitive edge by delivering accurate, real-time data and insights to over 35,000 companies globally, improving efficiency and aligning sales and marketing teams. The company stands out for its strict adherence to data privacy, with industry-leading GDPR and CCPA compliance and multiple data security and privacy certifications. Its all-in-one platform enables businesses to identify and engage potential customers ahead of competitors, uniting sales and marketing teams around a single source of truth and supporting rapid scaling through task automation across all outreach channels.
Data & Analytics

Company Stage: Series A
Total Funding: $7M
Founded: 2007
Headquarters: Vancouver, Washington

Growth & Insights
Headcount growth
  • 6 months: 3%
  • 1 year: 5%
  • 2 years: 40%
Locations
Bethesda, MD, USA
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Airflow
Data Science
Apache Beam
Git
SQL
Java
AWS
Hadoop
Snowflake
Google Cloud Platform
Categories
Data & Analytics
Requirements
  • Degree in Computer Science, Information Systems, Data Science, or related field
  • 3-5 years of experience in data engineering, or equivalent combination of education and experience
  • Expert, hands-on development experience in distributed computing environments using Hadoop, Java/Scala, Apache Beam (Dataflow), and Apache Airflow
  • Advanced SQL knowledge
  • Experience architecting solutions in collaboration with development and data science teams
  • Experience working with third party APIs for data collection
  • Ability to communicate effectively with stakeholders at all levels of the organization
  • Experience mentoring/coaching business end users and junior analysts
  • Self-motivated; able to work independently and collaborate with others
  • Expert, hands-on working knowledge of AWS and GCP
  • Experience with Git/Github or other version control systems
  • Familiarity with Snowflake or Databricks is a plus
Responsibilities
  • Develop data pipelines using Java/Scala and Apache Beam (Dataflow) to move and transform data
  • Consolidate/join datasets to create easily consumable, performant, and consistent information
  • Look for ways to improve processes and take initiative to implement them
  • Evaluate new technology and advise on our data lake ecosystem
  • Work with stakeholders, including data, design, product, and executive teams, and assist them with data-related technical issues
  • Create and maintain documentation for business end users and other data analysts
  • Determine where in the infrastructure to house data based on the use case and data model
  • Collaborate with Data Scientists to design scalable implementations of their models
Desired Qualifications
  • Familiarity with Snowflake or Databricks