Full-Time

Data Engineer

Particle41

51-200 employees

Provides expert application development, DevOps, and ML teams

No salary listed

Remote in India

Category
Data & Analytics
Required Skills
Scikit-learn
Bash
Microsoft Azure
Python
MySQL
Git
Apache Spark
SQL
Postgres
ETL
AWS
Pandas
Elasticsearch
Redis
MongoDB
Flask
OpenCV
Linux/Unix
Databricks
Google Cloud Platform
Requirements
  • Bachelor's degree in Computer Science, Engineering, or a related field
  • At least 3 years of proven experience as a Data Engineer
  • Proficiency in Python programming language
  • Experience with database technologies such as SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB) databases
  • Strong understanding of programming libraries/frameworks and technologies such as Flask and other API frameworks, data warehousing/lakehouse principles, databases and ORMs, and data analysis tooling including Databricks, Pandas, Spark/PySpark, machine learning, OpenCV, and scikit-learn
  • Experience with common utilities and tools such as logging, requests, subprocess, regex, and pytest (a brief illustrative sketch follows this list)
  • Familiarity with the ELK stack, Redis, and distributed task queues
  • Strong understanding of data warehousing/lakehouse principles and concurrent/parallel processing concepts
  • Familiarity with at least one cloud data engineering stack (Azure, AWS, or GCP) and the ability to quickly learn and adapt to new ETL/ELT tools across various cloud providers
  • Familiarity with version control systems like Git and collaborative development workflows
  • Competence in working on Linux OS and creating shell scripts
  • Solid understanding of software engineering principles, design patterns, and best practices
  • Excellent problem-solving and analytical skills, with a keen attention to detail
  • Effective communication skills, both written and verbal, and the ability to collaborate in a team environment
  • Adaptability and willingness to learn new technologies and tools as needed
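For illustration only (not part of the posting): a minimal sketch of the kind of Python tooling this list names, combining Pandas, logging, and a pytest-style unit test. The function, column names, and data here are hypothetical.

    # Illustrative sketch only -- hypothetical names and data.
    import logging

    import pandas as pd

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger(__name__)


    def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
        """Drop rows with a missing order ID and normalise amounts to floats."""
        cleaned = df.dropna(subset=["order_id"]).copy()
        cleaned["amount"] = cleaned["amount"].astype(float)
        logger.info("Cleaned %d rows down to %d", len(df), len(cleaned))
        return cleaned


    def test_clean_orders_drops_missing_ids():
        # pytest discovers and runs this test automatically.
        raw = pd.DataFrame({"order_id": [1, None], "amount": ["10.5", "3.0"]})
        result = clean_orders(raw)
        assert len(result) == 1
        assert result["amount"].iloc[0] == 10.5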
Responsibilities
  • Design, develop, and maintain scalable ETL pipelines to process large volumes of data from diverse sources (an illustrative sketch follows this list)
  • Build and optimize data storage solutions, such as data lakes and data warehouses, to ensure efficient data retrieval and processing
  • Integrate structured and unstructured data from various internal and external systems to create a unified view for analysis
  • Ensure data accuracy, consistency, and completeness through rigorous validation, cleansing, and transformation processes
  • Maintain comprehensive documentation for data processes, tools, and systems while promoting best practices for efficient workflows
  • Collaborate with product managers and other stakeholders to gather requirements and translate them into technical solutions
  • Participate in requirement analysis sessions to understand business needs and user requirements
  • Provide technical insights and recommendations during the requirements-gathering process
  • Participate in Agile development processes, including sprint planning, daily stand-ups, and sprint reviews
  • Work closely with Agile teams to deliver software solutions on time and within scope
  • Adapt to changing priorities and requirements in a fast-paced Agile environment
  • Conduct thorough testing and debugging to ensure the reliability, security, and performance of applications
  • Write unit tests and validate the functionality of developed features and individual elements
  • Write integration tests to ensure different elements within a given application function as intended and meet desired requirements
  • Identify and resolve software defects, code smells, and performance bottlenecks
  • Stay updated with the latest technologies and trends in full-stack development
  • Propose innovative solutions to improve the performance, security, scalability, and maintainability of applications
  • Continuously seek opportunities to optimize and refactor existing codebase for better efficiency
  • Stay up to date with cloud platforms such as AWS, Azure, or Google Cloud Platform
  • Collaborate effectively with cross-functional teams, including testers and product managers
  • Foster a collaborative and inclusive work environment where ideas are shared and valued
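For illustration only (not part of the posting): a minimal sketch of the kind of ETL step described above, written with PySpark. The file paths and column names are hypothetical; a real pipeline would add scheduling, data quality checks, and error handling.

    # Illustrative sketch only -- hypothetical paths and columns.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

    # Extract: read raw CSV data from a landing area.
    raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

    # Transform: validate and cleanse -- drop rows missing the key, cast types, filter bad values.
    clean = (
        raw.dropna(subset=["order_id"])
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") >= 0)
    )

    # Load: write the cleaned data to a Parquet location in the lake/warehouse.
    clean.write.mode("overwrite").parquet("/data/curated/orders/")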
Desired Qualifications
  • Experience with cloud data engineering stacks beyond basic familiarity, and the ability to learn new ETL/ELT tools across cloud providers, is preferred
  • Experience with Databricks and Apache Spark ecosystem beyond basic familiarity is a plus
  • Exposure to data governance, security and compliance practices in data engineering would be beneficial

Particle41 builds expert teams in application development, DevOps, and machine learning to strengthen a client’s capabilities. They assemble cross-functional teams that integrate into a client’s projects, guiding work with velocity, visibility, and vision to accelerate delivery. They differentiate themselves by providing specialized, high-caliber teams tailored to client needs and emphasizing clear communication and ongoing capability building rather than generic consulting. The goal is to amplify a client’s business by delivering dependable, expert teams that quickly advance critical development, operations, and ML initiatives.

Company Size

51-200

Company Stage

N/A

Total Funding

N/A

Headquarters

Frisco, Texas

Founded

2015

Simplify's Take

What believers are saying

  • AI-native toolchains enable faster timelines and lower costs, capturing market share from traditional consultancies.
  • Fractional CTO demand grows as mid-market companies need technology leadership without full-time hires.
  • Data engineering and DevOps optimization services expand TAM across healthcare, fintech, and enterprise sectors.

What critics are saying

  • AI coding tools like Devin and GitHub Copilot commoditize code generation, eroding premium pricing.
  • Thoughtworks' 5,000+ experts and $500M+ scale undercut Particle41 in enterprise DevOps and AI consulting.
  • Revenue discrepancy between ZoomInfo ($23.7M) and Bitscale ($14.3M) signals investor skepticism and funding risk.

What makes Particle41 unique

  • AI-augmented teams pair senior engineers with AI agents for accelerated delivery and cost reduction.
  • Fractional CTO services provide C-suite aligned technology leadership for enterprise clients.
  • Legacy modernization expertise delivers 60% latency reduction and 40% cost savings via cloud migration.

Benefits

Health Insurance

401(k) Retirement Plan

Remote Work Options

Flexible Work Hours

Paid Vacation

Paid Holidays

Professional Development Budget

Conference Attendance Budget

Wellness Program

Mental Health Support

Gym Membership

Phone/Internet Stipend

Home Office Stipend

Family Planning Benefits

Fertility Treatment Support

Stock Options

Company Equity