About Us
At Dematic, we are gearing up to revolutionize our data landscape by building a cutting-edge Enterprise Data Lakehouse Platform. We are forming two pivotal platform teams that will spearhead the creation of the platform’s foundational components. These teams go beyond traditional data ingestion; they are architects of a microservices-driven platform, providing abstractions that empower other teams to seamlessly extend the platform.
Role Overview
We are seeking a dynamic and highly skilled Principal Data Engineer to lead these foundational efforts. This role demands someone who not only possesses a profound understanding of the data engineering landscape but is also at the top of their game. The ideal candidate will contribute significantly to platform development, leading a team of contractors while actively shaping the future of our data ecosystem.
What we offer:
• Career Development
• Competitive Compensation and Benefits
• Pay Transparency
• Global Opportunities
Learn More Here: https://www.dematic.com/en-us/about/careers/what-we-offer
Dematic provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
The base pay range for this role is estimated to be $110,500-$138,500 at the time of posting. Final compensation will be determined by various factors such as work location, education, experience, knowledge, and skills.
#LI-SJ1
What We Are Looking For:
- Architect, design, and develop core data platform components with a microservices architecture, abstracting platform and infrastructure intricacies.
- Create and maintain essential data platform SDKs and libraries, adhering to industry best practices.
- Design and develop connector frameworks and modern connectors to source data from disparate applications, both on-prem and in the cloud.
- Design and optimize data storage, processing, and querying performance for large-scale datasets using industry best practices.
- Design and develop data quality frameworks and processes to ensure the accuracy and reliability of data.
- Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.
- Design and develop advanced analytics and machine learning capabilities on the data platform.
- Design and develop observability and data governance frameworks and practices.
- Stay up to date with the latest data engineering trends, technologies, and best practices.
- Drive the deployment and release cycles, ensuring a robust and scalable platform.
Tasks and Qualifications:
Requirements:
- 10+ years (for senior) or 15+ years (for principal) of proven experience in modern cloud data engineering, data architectures, data warehousing, and software engineering.
- Expertise in architecting, designing, and building end-to-end data platforms in the GCP environment using BigQuery and other services, adhering to best-practice guidelines such as open standards, cost efficiency, performance, time to market, and minimal vendor lock-in.
- Solid experience building data platforms in the GCP environment.
- Solid experience designing and developing modular, distributed data platform components with a microservices architecture. Strong experience with Docker, Kubernetes, and APIs is required.
- Proficiency in data engineering tools and technologies - SQL, Python, Spark, DBT, Airflow, Kafka.
- Solid experience implementing data lineage, data quality and data observability for big data workflows.
- Strong experience with modern data modeling, data architecture, and data governance principles.
- Excellent experience with DataOps principles and test automation.
- Excellent experience with observability tools - Grafana and Datadog.
Nice to have:
- Experience with Data Mesh architecture.
- Experience building Semantic layers for data platforms.
- Experience building scalable IoT architectures.