Senior Data Engineer
P707
Updated on 5/19/2023
Locations
Northbrook, IL, USA • Remote • Chicago, IL, USA • Portland, OR...
Experience Level
Senior
Desired Skills
Agile
Apache Hive
Apache Spark
Apache Kafka
Data Science
Hadoop
Business Analytics
Git
Java
Microsoft Azure
Snowflake
SQL
Tableau
Python
Power BI
Requirements
  • 4+ years of proven professional data development experience
  • 3+ years of proven experience developing with Databricks or Hadoop/HDFS
  • 3+ years of experience with PySpark/Spark
  • 3+ years of experience with SQL
  • 3+ years of experience developing with either Python, Java, or Scala
  • Full understanding of ETL concepts and Data Warehousing concepts
  • Experience with CI/CD
  • Experience with version control software
  • Strong understanding of Agile Principles (Scrum)
  • Bachelor's Degree (Computer Science, Management Information Systems, Mathematics, Business Analytics, or STEM)
Responsibilities
  • 84.51° is a retail data science, insights, and media company. We help The Kroger Co., consumer packaged goods companies, agencies, publishers, and affiliated partners create more personalized and valuable experiences for shoppers across the path to purchase
  • Powered by cutting-edge science, we leverage first-party retail data from nearly one in two US households and more than 2 billion transactions to fuel a more customer-centric journey, using 84.51° Insights, 84.51° Loyalty Marketing, and our retail media advertising solution, Kroger Precision Marketing
  • Join us at 84.51°!
  • As a Senior Data Engineer, you will have the opportunity to build solutions that ingest, transform, store, and distribute our big data to be consumed by data scientists and our products
  • Our data engineers use PySpark/Python, Databricks, Hadoop, Hive, and other data engineering technologies and visualization tools to deliver data capabilities and services to our scientists, products, and tools (a minimal sketch of this kind of pipeline follows this list)
  • Take ownership of features and drive them to completion through all phases of the 84.51° SDLC, including internal and external facing applications as well as process improvement activities
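For illustration, a minimal PySpark sketch of the ingest-transform-store pattern described above, assuming a Databricks/Spark environment; the paths, columns, and table names are hypothetical, not taken from the posting.

    # Read raw transaction data, roll it up per household per day, and write a curated table.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("transactions_daily_rollup").getOrCreate()

    raw = spark.read.parquet("/mnt/raw/transactions")  # hypothetical raw landing path

    daily = (
        raw
        .withColumn("txn_date", F.to_date("txn_timestamp"))
        .groupBy("household_id", "txn_date")
        .agg(
            F.sum("basket_total").alias("daily_spend"),
            F.count("*").alias("txn_count"),
        )
    )

    # Persist the aggregate for downstream data scientists and products.
    daily.write.mode("overwrite").partitionBy("txn_date").saveAsTable("curated.household_daily_spend")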
Desired Qualifications
  • Experience with Azure
  • Experience with Databricks Delta Tables, Delta Lake, Delta Live Tables
  • Proficient with Relational Data Modeling
  • Experience with CI/CD pipelines
  • Experience with Python Library Development
  • Experience with Structured Streaming (Spark or otherwise); see the sketch after this list
  • Experience with Kafka and/or Azure Event Hub
  • Experience with GitHub SaaS / GitHub Actions
  • Experience with Snowflake
  • Exposure to BI Tooling (Tableau, Power BI, Cognos, etc.)
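As a companion to the Structured Streaming, Kafka/Event Hub, and Delta items above, a minimal Spark Structured Streaming sketch that reads from Kafka and appends to a Delta table. It assumes a Databricks runtime where the Kafka connector and Delta Lake are available; the broker, topic, and paths are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events_stream_to_delta").getOrCreate()

    # Subscribe to a Kafka topic and keep the message payload as a string.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "shopper-events")              # hypothetical topic
        .load()
        .select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
    )

    # Continuously append the stream to a Delta table, with checkpointing for recovery.
    query = (
        events.writeStream
        .format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/shopper_events")  # hypothetical path
        .outputMode("append")
        .start("/mnt/delta/shopper_events")                               # hypothetical Delta path
    )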
84.51°
Retail data science, insights, and media platform