Full-Time

Data Engineer

Posted on 3/3/2026

ShyftLabs

11-50 employees

Data-driven decision-making platform for organizations

No salary listed

Noida, Uttar Pradesh, India

In Person

Category
Data & Analytics
Required Skills
Python
Airflow
BigQuery
SQL
Machine Learning
Java
ETL
Looker
Requirements
  • 3-5+ years of hands-on experience in a Data Engineering, Software Engineering, or similar role.
  • Strong proficiency in a programming language such as Python or Java for data processing and automation.
  • Mastery of SQL for complex data manipulation, DDL/DML operations, and query optimization.
  • Proven expertise in using Google BigQuery as a data warehouse, including data modeling, performance tuning, and cost management.
  • Hands-on experience building data pipelines using the GCP ecosystem (e.g., Dataflow, Pub/Sub, Cloud Storage, Cloud Composer/Airflow).
  • Deep understanding of ETL/ELT principles and data warehousing architecture (e.g., Star Schema, Data Lakes).
  • Strong problem-solving and troubleshooting skills with a focus on building scalable, maintainable, and automated systems.
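As a hedged illustration of the ETL/ELT principles listed above, here is a minimal Python sketch of a "transform" step that turns raw records into a curated, analysis-ready shape. All field names, types, and the schema itself are hypothetical examples, not taken from this posting:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class CuratedEvent:
    # Hypothetical curated schema for illustration only.
    user_id: str
    event_type: str
    occurred_at: datetime

def transform(raw_events):
    """Toy ETL 'transform' step: drop malformed records and
    normalize field names and types into the curated schema."""
    curated = []
    for rec in raw_events:
        if not rec.get("uid") or not rec.get("type"):
            continue  # skip malformed rows rather than failing the batch
        curated.append(CuratedEvent(
            user_id=str(rec["uid"]).strip(),
            event_type=rec["type"].lower(),
            occurred_at=datetime.fromtimestamp(rec["ts"], tz=timezone.utc),
        ))
    return curated

raw = [
    {"uid": " 42 ", "type": "Click", "ts": 1700000000},
    {"uid": None, "type": "view", "ts": 1700000001},  # malformed: dropped
]
rows = transform(raw)  # one curated row survives
```

In a production pipeline this logic would typically live in SQL inside the warehouse (ELT) or in a Dataflow/Composer task (ETL), but the principle — validate, normalize, and emit a stable schema — is the same.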
Responsibilities
  • Data Architecture & Pipeline Development: Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines using GCP services like Dataflow, Cloud Functions, Pub/Sub, and Cloud Composer.
  • Data Warehousing: Develop and manage our central data warehouse in Google BigQuery. Implement data models, schemas, and table structures optimized for performance and scalability.
  • Data Processing & Transformation: Write clean, efficient, and robust code (primarily in SQL and Python) to transform raw data into curated, analysis-ready datasets.
  • Infrastructure Optimization & Scalability: Monitor, troubleshoot, and optimize our data infrastructure for performance, reliability, and cost-effectiveness. Implement BigQuery best practices, including partitioning, clustering, and materialized views.
  • Enable Data Accessibility & BI: Build and maintain curated data models that serve as the "source of truth" for business intelligence and reporting, ensuring data is ready for consumption by BI tools like Looker.
  • Data Governance & Quality: Implement automated data quality checks, validation rules, and monitoring to ensure the accuracy and integrity of our data pipelines and warehouse.
  • Collaboration: Work closely with software engineers, data analysts, and data scientists to understand their data requirements and provide the necessary infrastructure and data products.
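A minimal sketch of the kind of automated data quality check described in the responsibilities above, in plain Python. The field names and threshold are hypothetical; in practice such checks would typically run as an Airflow task or a dbt test rather than standalone code:

```python
def run_quality_checks(rows, required_fields, max_null_rate=0.01):
    """Toy data quality check: for each required field, measure the
    null rate across the batch and flag fields that exceed the
    threshold. Returns a dict of failing field -> observed null rate."""
    failures = {}
    total = len(rows)
    if total == 0:
        return {"__empty_batch__": 1.0}  # an empty batch is itself a failure
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        rate = nulls / total
        if rate > max_null_rate:
            failures[field] = rate
    return failures

# Hypothetical batch: one of three rows is missing `amount`.
batch = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": 4.50},
]
bad = run_quality_checks(batch, ["order_id", "amount"], max_null_rate=0.1)
```

Here `bad` reports only `amount`, since its null rate (one in three) exceeds the 10% threshold while `order_id` is fully populated. Wiring the result into pipeline monitoring (failing the task, paging on-call) is where the "automated" part comes in.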
Desired Qualifications
  • BI Tool Integration: Experience building data models that power BI tools like Looker (knowledge of LookML is a strong plus), Tableau, or Power BI.
  • Modern Data Stack Tools: Experience with tools like dbt, Dataform, or Fivetran for data transformation and integration.
  • Infrastructure as Code (IaC): Familiarity with tools like Terraform or Deployment Manager for managing cloud infrastructure.
  • Containerization: Knowledge of Docker and Kubernetes is a plus.
  • Certifications: Google Cloud Professional Data Engineer certification is highly desirable.
  • Version Control: Proficiency with Git for code management and CI/CD pipelines.

ShyftLabs helps organizations adopt a data-first approach to decision making by designing and implementing processes that turn data into actionable insights. Its solution builds structured analytics workflows and governance, so teams access trustworthy data, follow defined steps, and act on results with clarity. Unlike tools that only show dashboards, ShyftLabs focuses on repeatable data practices and governance that speed up decisions and reduce ad hoc analysis. The goal is to help organizations stay ahead of the competition by enabling faster, more informed decisions across the business.

Company Size

11-50

Company Stage

N/A

Total Funding

N/A

Headquarters

Canada

Founded

2018

Simplify Jobs

Simplify's Take

What believers are saying

  • ShyftLabs hires Apache Druid Engineers and Data Architects in Gurugram and Toronto.
  • ShyftLabs powers 200+ experts streamlining operations for major public agencies.
  • ShyftLabs modernizes legacy systems with cloud migration for scalable citizen services.

What critics are saying

  • Databricks Lakehouse erodes consulting margins as clients build in-house pipelines.
  • Snowflake Cortex AI bypasses ShyftLabs BI with native serverless ML functions.
  • Talent exodus to AWS drains ShyftLabs engineers amid 30% higher salaries.

What makes ShyftLabs unique

  • ShyftLabs delivers privacy-first Carter platform for secure public sector AI.
  • ShyftLabs unlocked $500 million value via data and AI for retailers and health.
  • ShyftLabs builds custom low-latency pipelines embedding intelligence for real-time decisions.


Benefits

Health Insurance

Hybrid Work Options

Professional Development Budget

INACTIVE