Full-Time

Senior Data Engineer

Posted on 10/29/2025

Walden University

No salary listed

Downers Grove, IL, USA

Hybrid

Category
Data & Analytics
Requirements
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Two (2) plus years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, Dataflow, BQML, and Vertex AI.
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
  • Hands-on experience working with real-time, unstructured, and synthetic data; this role will be instrumental in advancing our data platform capabilities.
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar.
  • Expert knowledge of Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services.
  • Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed.
  • Familiarity with synthetic data generation and unstructured data processing.
  • Experience with AI/ML data pipelines and frameworks.
  • Excellent organizational, prioritization, and analytical abilities.
  • Proven experience with incremental execution through successful launches.
  • Excellent problem-solving and critical-thinking skills to recognize and comprehend complex data issues affecting the business environment.
  • Experience working in an agile environment.
Responsibilities
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets.
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound.
  • Model the data platform by applying business logic and building objects in its semantic layer.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics.
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
  • Document the design and support strategy of the data pipelines.
  • Capture, store, and socialize data lineage and operational metadata.
  • Troubleshoot and resolve data engineering issues as they arise.
  • Develop REST APIs to expose data to other teams within the company.
  • Stay current with emerging technologies and industry trends related to big data, streaming data, and synthetic data generation.
  • Mentor and guide junior data engineers.
