Senior/Staff Analytics Engineer
Locations
Remote • San Diego, CA, USA
Desired Skills
AWS
Data Analysis
Data Science
Docker
Airflow
Postman
Snowflake
SQL
Tableau
Terraform
Kubernetes
Looker
Requirements
  • 5+ years of experience in related roles such as data engineer, analytics engineer, business intelligence engineer, or analyst
  • Advanced SQL skills including multiple-table joins, unions, subqueries, CTEs, aggregations, temporary tables, and window (analytic) functions
  • 3+ years developing and maintaining ETL/ELT pipelines and dimensional data models
  • 3+ years working with relational databases
  • Experience with dbt (data build tool)
  • Experience with Snowflake or other columnar databases
  • Technical expertise across the data lifecycle, including mining, modeling, transforming, cleansing, and validating data
  • Ability to take vague requests and transform them into concise deliverables
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Excellent communication skills, both verbal and written
  • BS in Mathematics, Economics, Computer Science, Information Management, Statistics or equivalent experience
  • Experience with workflow orchestration systems such as Airflow, Luigi, or Prefect
  • Experience with containerization and orchestration tools such as Docker, Kubernetes, or ECS
  • 3+ years of experience working with AWS services such as EC2, Lambda, Kinesis, S3, RDS, ECR, EKS, etc
  • 3+ years of experience with reporting and visualization tools such as Looker, Tableau, Domo, etc
  • Experience assisting inter-organizational customers to promote a data-driven culture
  • Experience creating and maintaining documentation and definitions for pipelines, models, and core business objects
  • Experience with Snowflake RBAC and administration including data masking and retention
  • Familiarity with traversing unstructured data
  • Experience with test-driven development, CI/CD, and source control
  • Understanding of working with APIs and related tools such as Postman
  • Experience deploying IaC via Terraform
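As a rough illustration of the "advanced SQL" bullet above (CTEs, aggregations, and window functions together), here is a minimal, self-contained sketch using Python's built-in sqlite3. The `orders` table, its columns, and the data are hypothetical, chosen only to exercise the query shapes the posting names:

```python
import sqlite3

# Hypothetical table and data, for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', '2024-01-05', 120.0),
        ('acme', '2024-02-10',  80.0),
        ('bolt', '2024-01-20', 200.0);
""")

# One CTE aggregates lifetime spend per customer; a second uses the
# ROW_NUMBER() window function to find each customer's latest order.
query = """
WITH totals AS (
    SELECT customer, SUM(amount) AS lifetime_total
    FROM orders
    GROUP BY customer
),
ranked AS (
    SELECT customer, order_date, amount,
           ROW_NUMBER() OVER (
               PARTITION BY customer ORDER BY order_date DESC
           ) AS rn
    FROM orders
)
SELECT r.customer, r.order_date, r.amount, t.lifetime_total
FROM ranked r
JOIN totals t ON t.customer = r.customer
WHERE r.rn = 1
ORDER BY r.customer;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same pattern (staged CTEs feeding a final select) is how dbt models are typically structured, with each CTE a candidate for promotion to its own model.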
Responsibilities
  • Work with business stakeholders, product owners, engineers, analysts, and data scientists to design and develop high-quality data pipelines
  • Leverage our data stack (Snowflake/dbt/Looker) and collaborate with product and engineering teams to define and build data models and transformations that enable complex analysis, visualization, and data science
  • Utilize best practices to document, test, and cost-optimize our data warehouse
  • Filter, clean, and transform data from a variety of sources in order to enable analysis
  • Mentor other data team members and advocate for data-driven decision-making
  • Take ownership of all new and existing models and pipelines in our data warehouse
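The "filter, clean, and transform" responsibility above can be sketched as a small, plain-Python staging step. The record shape and field names (`customer_id`, `amount`, `order_date`) are assumptions for the sketch, not part of the posting:

```python
from datetime import date

def clean_records(raw_rows):
    """Filter and normalize raw rows before loading.

    Drops rows missing a customer id, coerces amounts to float, and
    parses ISO date strings. Field names are hypothetical.
    """
    cleaned = []
    for row in raw_rows:
        if not row.get("customer_id"):
            continue  # filter: a row without a key is unusable downstream
        cleaned.append({
            "customer_id": row["customer_id"].strip().lower(),
            "amount": float(row.get("amount") or 0.0),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return cleaned

rows = clean_records([
    {"customer_id": " ACME ", "amount": "19.99", "order_date": "2024-03-01"},
    {"customer_id": "", "amount": "5.00", "order_date": "2024-03-02"},  # dropped
])
print(rows)
```

In a Snowflake/dbt stack this logic would usually live in SQL staging models instead, but the shape of the work (filter, type-coerce, normalize) is the same.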
Platform Science

201-500 employees

Telematics & fleet management for trucking
Company Overview
Platform Science's mission is to make transportation smarter.
Company Core Values
  • One Team: We proudly welcome people of various backgrounds, abilities, and perspectives to join us in creating a better road for the future
  • Resiliency: Work with a team that thinks outside the box to come up with innovative solutions
  • Empathy: Know your feelings are valid and valued by the people on your team and within the company
  • Thinking: Feel confident sharing your ideas with others and encourage your teammates to openly express their thoughts
  • Transparency: Strive for open, honest communication with your fellow team members to establish a mutual sense of trust