Senior Data Developer
Machine Learning
PTW

1,001-5,000 employees

Global game development & support services
Company Overview
PTW distinguishes itself in the global games services industry through its comprehensive suite of services, including quality assurance, localization, and audio production, all enhanced by a commitment to cultural authenticity and linguistic precision. Its in-house art team, 1518 Studios, is recognized for delivering high-quality assets across various platforms and formats, while its post-launch support extends a game's lifecycle through co-development and live operations expertise. Moreover, PTW's dedication to player support is exemplified by 24/7 assistance from teams who are not only familiar with the games they support but are also players themselves, ensuring a deep understanding of complex issues.

Company Stage
N/A

Total Funding
N/A

Founded
1994

Headquarters
Marina del Rey, California

Growth & Insights
Headcount
6 month growth: 0%
1 year growth: 0%
2 year growth: 1%
Locations
Montreal, QC, Canada
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Microsoft Azure
Python
Airflow
Apache Spark
SQL
Apache Kafka
AWS
Apache Hive
Data Analysis
Google Cloud Platform
Categories
Data Engineering
Data Management
Data & Analytics
Requirements
  • Bachelor’s degree in computer science or related field, or equivalent work experience
  • 5+ years of relevant work experience supporting data and machine learning teams
  • Deep expertise in SQL and similar query languages
  • Experience building and maintaining software and services in Python or similar languages
  • Experience with data warehousing, processing, pipelines, and data quality monitoring at scale
  • Experience with a wide variety of data processing and storage systems (S3, Spark, Hive, Kafka, Metastores, etc)
  • Experience with data pipelining and orchestration tools (e.g. Airflow); see the DAG sketch after this list
  • Experience deploying cloud services (AWS / GCP / Azure) as code (Terraform, Ansible) using container technologies (Docker / Kubernetes)
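To give a concrete sense of the orchestration experience described above, here is a minimal Airflow DAG sketch, assuming Airflow 2.4+ and Python; the DAG id, schedule, task names, and callables are hypothetical placeholders, not details specified in this posting.

    # Minimal Airflow DAG sketch (assumes Airflow 2.4+). Ids, schedule, and
    # callables below are hypothetical placeholders, not details from the posting.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_events():
        # Placeholder: in practice this might pull game telemetry from S3 or Kafka.
        print("extracting events")

    def load_to_warehouse():
        # Placeholder: in practice this might write curated tables to the warehouse.
        print("loading to warehouse")

    with DAG(
        dag_id="events_pipeline_sketch",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
        load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
        extract >> load  # run extract before load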
Responsibilities
  • Lead the design of warehouse data models to support data scientists and analysts
  • Ensure data quality and cleanliness by building reliable, sustainable pipelines with robust quality controls (a minimal example follows this list)
  • Support infrastructure engineers in designing ingestion pipelines for a wide variety of downstream use cases
  • Partner with service engineers to design and instrument telemetry
  • Collaborate with product engineers and data scientists to deploy and integrate services powering machine learning and AI models
  • Define MLOps and development best practices for the organization
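As a rough illustration of the pipeline quality controls mentioned in the responsibilities, here is a minimal PySpark data quality gate; the storage path, column name, and 1% threshold are assumptions for the sketch, not details from this posting.

    # Minimal data quality gate sketch in PySpark. The path, column, and 1%
    # threshold are assumptions, not details from the posting.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("quality_check_sketch").getOrCreate()

    # Hypothetical raw events table produced by an upstream ingestion pipeline.
    events = spark.read.parquet("s3://example-bucket/events/")

    total_rows = events.count()
    null_player_ids = events.filter(F.col("player_id").isNull()).count()

    # Fail the run if too many rows are missing a key identifier, so bad data
    # never reaches the warehouse models used by analysts and data scientists.
    if total_rows > 0 and null_player_ids / total_rows > 0.01:
        raise ValueError(
            f"player_id null rate {null_player_ids / total_rows:.2%} exceeds 1% threshold"
        )

In practice a check like this would typically run as a task in the orchestration layer (e.g. Airflow) so that downstream loads are blocked when the gate fails.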