Senior Software Engineer
Core Serving, Streaming
Posted on 3/15/2023
Locations
Remote
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Apache Spark
Apache Kafka
Data Analysis
Docker
Elasticsearch
Java
Airflow
Postgres
Redis
SQL
Terraform
Apache Beam
Apache Flink
Kubernetes
Blockchain
Datadog
Requirements
  • Build highly reliable data services to integrate with dozens of blockchains
  • Develop ETL pipelines that transform and process petabytes of structured and unstructured data in real-time
  • Design data models for optimal storage and retrieval to support sub-second latency for querying blockchain data
  • Deploy and monitor large database clusters that are performant and highly available
  • Work cross-functionally with data scientists, backend engineers, and product managers to design and implement new data models to support TRM's products
  • Bachelor's degree (or equivalent) in Computer Science or a related field
  • 5+ years of experience building distributed system architecture, from whiteboard to production
  • Strong programming skills in Java, plus SQL or SparkSQL
  • Versatility. Experience across the entire spectrum of data engineering, including:
  • Data stores (e.g., ClickHouse, ElasticSearch, Postgres, Redis, and Neo4j)
  • Data pipeline and workflow orchestration tools (e.g., Airflow, DBT, Luigi, Azkaban, Storm)
  • Data processing technologies and streaming workflows (e.g., Spark, Kafka, Flink, Beam)
  • Deployment and monitoring infrastructure in public cloud platforms (e.g., Docker, Terraform, Kubernetes, Datadog)
  • Loading, querying, and transforming large data sets
TRM Labs

51-200 employees