Data Engineer
Measurement and Attribution
Locations
Toronto, ON, Canada • California, USA • Remote
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
BigQuery
Data Science
Git
Airflow
Linux/Unix
Redshift
Sales
Snowflake
SQL
Python
Requirements
- At least 4 years of experience building and supporting analytical data marts and warehouses on column-oriented RDBMSs (e.g., Snowflake, BigQuery, Redshift, Vertica)
- Demonstrated technical accomplishments with SQL and ETL scheduling technologies (e.g., Apache Airflow, Prefect), plus experience working in Python (a minimal scheduling sketch follows this list)
- Expert knowledge of data warehouse architecture and hands-on experience with data model design and building complex, scalable ETLs
- Experience with the Linux/macOS command line, version control software (Git), and general software development
- Proven ability to lead cross-functional data engineering solutions that depend on the contributions of others in a variety of disciplines
- Strong written and verbal communication skills and the ability to build relationships and influence across the organization
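
To illustrate the kind of scheduled ETL work described above, here is a minimal sketch of a daily Airflow pipeline. The DAG name, task names, schedule, and SLA are hypothetical assumptions for illustration, not details taken from this posting.

```python
# Hypothetical sketch of a daily warehouse ETL scheduled with Apache Airflow.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_touchpoints(**context):
    # Placeholder extract/load step; a real pipeline would pull from the
    # warehouse (e.g. Snowflake, BigQuery, Redshift) via a provider hook.
    print(f"Loading touchpoints for {context['ds']}")


def build_attribution_mart(**context):
    # Placeholder transform step that would materialize the reporting mart.
    print(f"Building attribution mart for {context['ds']}")


default_args = {
    "owner": "data-eng",                  # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
    "sla": timedelta(hours=2),            # surfaces SLA misses to stakeholders
}

with DAG(
    dag_id="mta_daily_pipeline",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",        # daily, after source data lands
    default_args=default_args,
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="load_touchpoints", python_callable=load_touchpoints)
    transform = PythonOperator(task_id="build_attribution_mart", python_callable=build_attribution_mart)

    extract >> transform
```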
Responsibilities
- Be the technical owner of Square's multi-touch attribution (MTA) pipelines, maintaining and improving this business-critical system (an illustrative attribution sketch follows this list)
- Make data model and ETL code changes that improve pipeline efficiency and data quality
- Monitor daily execution, diagnose and log issues, and fix business-critical pipelines to ensure SLAs with internal stakeholders are met
- Analyze new data sources and work with stakeholders to understand the impact of integrating new data into existing pipelines and models
- Own, coordinate, and tackle cross-functional problems across Marketing, Sales, Finance, Business Intelligence, Data Science, and more
- Develop tools, features, and reports using our data tech stack to empower self-service data access and augment intelligence
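
As context for the MTA responsibilities above, here is a minimal, self-contained sketch of linear multi-touch attribution, one common model in which each touchpoint on a converting path receives an equal share of the conversion credit. The channel names and values are illustrative assumptions; the posting does not specify which attribution model Square uses.

```python
# Hypothetical sketch of linear multi-touch attribution in plain Python.
from collections import defaultdict


def linear_attribution(paths):
    """Split each conversion's value evenly across its touchpoints.

    paths: list of (touchpoints, conversion_value) tuples, where touchpoints
    is the ordered list of channels in one converting customer journey.
    """
    credit = defaultdict(float)
    for touchpoints, value in paths:
        if not touchpoints:
            continue
        share = value / len(touchpoints)
        for channel in touchpoints:
            credit[channel] += share
    return dict(credit)


if __name__ == "__main__":
    sample_paths = [
        (["paid_search", "email", "direct"], 300.0),
        (["social", "direct"], 100.0),
    ]
    # -> paid_search: 100.0, email: 100.0, direct: 150.0, social: 50.0
    print(linear_attribution(sample_paths))
```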