Senior Data Ops Engineer
Updated on 11/30/2023
Business intelligence products for companies
Dun & Bradstreet seeks to create a global network of trust enabling clients to turn uncertainty into confidence, risk into opportunity, and potential into prosperity. The company is building on its world-class data and analytics—Dun & Bradstreet Data Cloud—to deliver more data and deeper insights.
Data & Analytics
Austin, TX, USA
Development Operations (DevOps)
Google Cloud Platform
Quality Assurance (QA)
Requirements
- Extensive experience working with GCP services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Run, Cloud Functions, and related technologies.
- Extensive experience with SQL and relational databases, including query optimization and schema design.
- Expertise in containerized infrastructure and CI/CD systems, including Cloud Build, Docker, Kubernetes, and GitHub Actions.
- Ability to write testable, efficient Python code for data processing and analysis.
- Experience with Amazon Web Services (EC2, RDS, S3, Redshift, EMR, and more).
- Experience with OS-level scripting (bash, sed, awk, grep, etc.).
- Experience in AdTech, web cookies, and online advertising technologies.
- Familiarity with parallelizing applications both on a single machine and across a network of machines.
- Experience with version control (Git/GitHub/Bitbucket) and Agile project management tools (ClickUp/Jira/Confluence).
- Experience with object-oriented programming; functional programming a plus.
- Experience with analytic tools and ETL/ELT/data pipeline frameworks a plus.
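The posting asks for testable, efficient Python for data processing; a minimal sketch of what that might look like (the function name and sample data are hypothetical, not taken from the posting):

```python
from collections import Counter

def top_domains(emails, n=3):
    """Count the most common email domains in a record set.

    Hypothetical example of small, testable data-processing code;
    not part of the job posting itself.
    """
    domains = (addr.rsplit("@", 1)[-1].lower() for addr in emails if "@" in addr)
    return Counter(domains).most_common(n)

# A pure function like this is trivial to unit-test:
assert top_domains(["a@x.com", "b@X.com", "c@y.org"], 2) == [("x.com", 2), ("y.org", 1)]
```

Keeping the logic in small, pure functions like this is what makes data pipelines easy to test and parallelize.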
Responsibilities
- Collaborate with the data, platform, QA, and DevOps teams to design and construct advanced systems for processing, analyzing, searching, and visualizing vast datasets.
- Architect resilient systems and write highly fault-tolerant software to consistently deliver high-quality results.
- Take initiative to become familiar with existing application code and achieve a complete understanding of how the applications function.
- Pioneer novel methods for extracting intelligence from a wide array of unique data sources.
- Generate fresh insights for our clients and provide novel perspectives on their markets.
- Co-create and document data processing systems that are easy to maintain, fostering a collaborative and supportive team environment.
- Help maintain existing systems, including troubleshooting and resolving alerts.
- Collaborate well with your peers: be easy to reach and attend all required meetings.
- Share ideas across teams to spread awareness and use of frameworks and tooling.
- Bring a friendly, supportive, and reliable attitude to a great team that holds each other accountable.