Senior Analytics Engineer
Apixio

201-500 employees

AI-Powered Technology for Value-Based Care
Company Overview
Apixio is using Artificial Intelligence to change the way healthcare is measured, care is delivered, and discoveries are made.
AI & Machine Learning
Data & Analytics

Company Stage: N/A
Total Funding: $342.9M
Founded: 2009
Headquarters: San Mateo, California

Growth & Insights
Headcount growth: 14% (6 months), 22% (1 year), 22% (2 years)
Locations
Los Angeles, CA, USA
Experience Level: Entry, Junior, Mid, Senior, Expert
Desired Skills
Python
Airflow
Apache Spark
SQL
Apache Kafka
Tableau
AWS
Marketing
Linux/Unix
Looker
Data Analysis
Categories
Data & Analytics
Requirements
  • 5+ years of data engineering experience
  • 3+ years of hands-on experience working with large structured and unstructured datasets on partitioned cloud storage, using query engines such as Spark and Delta Lake (see the sketch after this list)
  • 3+ years of experience designing, developing, deploying, and testing in Databricks
  • 3+ years of hands-on experience with Python, PySpark, and Spark SQL
  • 2+ years of experience with big data pipeline/DAG tools such as Airflow and dbt (required)
  • 2+ years of SQL experience, specifically writing complex, highly optimized queries across large volumes of data
  • Experience with the AWS computing environment and storage services such as S3 and Glacier (required)
  • Experience with conceptual, logical, and/or physical database design (required)
  • Good knowledge of Linux and shell scripting (highly desired)
  • Prior experience with healthcare data extraction, transformation, and normalization (highly desired)
  • Hands-on experience with Kafka or other live streaming technologies (highly desired)
  • Experience with data visualization tools such as Looker or Tableau (desired)
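For illustration, here is a minimal PySpark sketch of the kind of partitioned Delta Lake work described above, assuming a Delta-enabled Spark session (e.g., on Databricks); the bucket, paths, and column names are hypothetical, not from the posting:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims-aggregation").getOrCreate()

    # Read a Delta table from partitioned S3 storage; filtering on the
    # (assumed) partition column lets Spark prune partitions instead of
    # scanning the whole dataset.
    claims = (
        spark.read.format("delta")
        .load("s3://example-bucket/delta/claims")
        .where(F.col("service_year") == 2024)
    )

    # A typical query optimization: project only the needed columns
    # before grouping to reduce shuffle volume.
    monthly = (
        claims.select("provider_id", "service_month", "allowed_amount")
        .groupBy("provider_id", "service_month")
        .agg(F.sum("allowed_amount").alias("total_allowed"))
    )

    monthly.write.format("delta").mode("overwrite").save(
        "s3://example-bucket/delta/claims_monthly"
    )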
Responsibilities
  • Conduct thorough data discovery to identify and understand the organization's data sources, adhering to data quality and data governance policies
  • Collaborate with stakeholders across Apixio (Ops, Product, Marketing, G&A) and within the team to understand business requirements and translate them into data models that capture the underlying data and turn it into metrics
  • Design and implement data models using Databricks Delta Lake, ensuring data integrity, consistency, and scalability
  • Develop and implement data pipelines using Databricks and dbt to collect, process, and transform data from various sources (see the orchestration sketch after this list)
  • Utilize dbt to automate and streamline ELT processes, ensuring data quality and consistency
  • Develop and maintain data visualizations and dashboards using tools like Looker
  • Create interactive dashboards that provide business users with self-service access to data insights
  • Communicate data insights effectively to stakeholders, translating technical concepts into clear and actionable recommendations
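As a sketch of how such a pipeline might be orchestrated, here is a minimal Airflow DAG that runs and then tests a dbt project, assuming a recent Airflow 2.x install and a dbt project checked out at /opt/dbt/analytics; the DAG id, schedule, and paths are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="elt_dbt_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # assumed cadence, not from the posting
        catchup=False,
    ) as dag:
        # Build the dbt models that materialize the warehouse layer.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/analytics && dbt run",
        )

        # dbt tests enforce the data-quality checks called out above.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt/analytics && dbt test",
        )

        dbt_run >> dbt_test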
Desired Qualifications
  • Ability to lead and complete projects independently once given an objective and direction
  • Strong communication skills to partner with data integration, engineering, DevOps, customer success, and product teams and collect technical and business requirements
  • High agency, integrity, ownership, and curiosity