Full-Time

Data Engineer

Updated on 11/16/2024

Better Debt Solutions

Compensation Overview

$130k - $150k Annually

Senior

Irvine, CA, USA

Position is onsite in Irvine, CA.

Category
Data Engineering
Data & Analytics
Required Skills
Datadog
Power BI
Redshift
Python
Airflow
Data Science
SQL
Java
Kinesis
Tableau
AWS
Elasticsearch
Kibana
Looker
Data Analysis
Google Cloud Platform
Requirements
  • Bachelor's degree in Computer Science, Data Science, or a related field.
  • At least 5 years of progressive technical experience in data engineering, including work with big data ecosystems and large-scale data migrations.
  • Advanced proficiency in Python, Java, and SQL (5+ years) for efficient ETL processes and data pipeline development.
  • Experience in big data ecosystems and cloud environments, particularly with AWS (Redshift, S3, Glue, Kinesis, Lambda).
  • Skilled in schema design, dimensional modeling, and data mart creation.
  • Proficient in Airflow for managing ETL workflows.
  • Strong skills in Tableau and Looker are a plus.
  • Experience with both relational (SQL Server) and non-relational databases (DynamoDB, Elasticsearch).
  • Knowledge of Kibana, CloudWatch, and Datadog for monitoring and issue resolution.
  • Strong understanding of data quality, cleansing, and governance for reliable ETL processes.
  • Proven ability to identify, decompose, and solve complex data challenges.
  • Excellent communicator, able to work with technical and non-technical stakeholders alike.
  • Demonstrated ability to lead projects in a fast-paced environment, showing initiative and adaptability.
  • Familiarity with additional cloud platforms, such as Google Cloud Platform (GCP).
  • Database administration experience, including tasks for security, redundancy, and high availability.
  • Advanced visualization skills with tools like Tableau, Looker, and Power BI.
Responsibilities
  • Design and maintain scalable data infrastructure and automation pipelines for ETL processes, ensuring quality and reliability.
  • Develop efficient data models, workflows, and pipelines to move data into data marts within the data warehouse.
  • Partner with Account Executives, Data & Analytics teams, IT Ops Engineers, and senior stakeholders to understand business needs and develop innovative data solutions that drive success.
  • Support data processing and reporting on a cloud-based platform, leveraging AWS technologies such as Redshift, S3, AWS Glue, and Kinesis (a Redshift load sketch appears after this list).
  • Lead efforts to normalize and govern data across the organization, implementing standards and protocols so that all data is accurate, consistent, and aligned with best practices.
  • Serve as the go-to problem solver for mission-critical production issues, troubleshooting and resolving system challenges swiftly to keep things running smoothly.
  • Create and optimize custom queries for Tableau and Looker, making sure teams have the data they need to make informed, data-driven decisions.
  • Build and maintain Python-based ETL workflows with tools such as Airflow, streamlining data ingestion, transformation, and exporting to keep everything running at peak efficiency (a minimal DAG sketch follows this list).
  • Spearhead the migration from legacy systems to a modern cloud-based data lake, making strategic architectural decisions that ensure future scalability and long-term success.
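
For illustration only, the sketch below shows the general shape of an Airflow-orchestrated, Python-based ETL workflow like the one described above, assuming Airflow 2.4+ with the TaskFlow API. The DAG name, schedule, sample records, and transform logic are hypothetical placeholders, not the company's actual pipeline.

```python
# Minimal sketch of an Airflow-orchestrated ETL workflow (assumes Airflow 2.4+).
# The DAG name, sample data, and transform rules are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["etl"])
def daily_lead_etl():
    """Ingest raw records, clean them, and hand them off for loading."""

    @task
    def extract() -> list[dict]:
        # In practice this would pull from an API, S3, or an operational database.
        return [{"lead_id": 1, "amount": "1250.00"}, {"lead_id": 2, "amount": "980.50"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalize types and drop malformed rows before loading.
        cleaned = []
        for row in rows:
            try:
                cleaned.append({"lead_id": int(row["lead_id"]), "amount": float(row["amount"])})
            except (KeyError, ValueError):
                continue
        return cleaned

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would write to Redshift, e.g. via S3 staging plus COPY.
        print(f"Loading {len(rows)} cleaned rows into the data mart")

    load(transform(extract()))


daily_lead_etl()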
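
And a minimal sketch of the cloud-side load step referenced above, assuming boto3 and the Amazon Redshift Data API: a cleaned extract staged in S3 is copied into a data-mart table. The bucket, IAM role, cluster, database, user, and table names are all hypothetical.

```python
# Minimal sketch: load a staged S3 extract into Redshift via the Redshift Data API.
# All resource names below are hypothetical placeholders.
import boto3

S3_PATH = "s3://example-staging-bucket/leads/leads.csv"            # hypothetical bucket/key
IAM_ROLE = "arn:aws:iam::123456789012:role/example-redshift-copy"  # hypothetical role

redshift_data = boto3.client("redshift-data", region_name="us-west-2")

# COPY pulls the staged file from S3 directly into the target table.
copy_sql = f"""
    COPY analytics.leads_daily
    FROM '{S3_PATH}'
    IAM_ROLE '{IAM_ROLE}'
    FORMAT AS CSV
    IGNOREHEADER 1;
"""

response = redshift_data.execute_statement(
    ClusterIdentifier="example-cluster",  # hypothetical cluster
    Database="analytics",
    DbUser="etl_user",
    Sql=copy_sql,
)

# The Data API is asynchronous; a real pipeline would poll describe_statement
# until the COPY reports FINISHED (or FAILED) before moving on.
status = redshift_data.describe_statement(Id=response["Id"])["Status"]
print(f"COPY statement status: {status}")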