Data Infrastructure Engineer
Posted on 8/13/2023
Wave Financial

201-500 employees

Comprehensive financial management solution for small businesses
Company Overview
Wave HQ is a leading provider of financial services software, recognized for its commitment to empowering small business owners with comprehensive, user-friendly, and cost-effective financial management tools. The company's culture is deeply rooted in growth and support, fostering an environment that encourages direct challenges and collaborative problem-solving. With its award-winning free accounting, invoicing, and US-only banking software, alongside optional paid features such as online payment processing and personalized bookkeeping services, Wave HQ offers a competitive edge in the industry.
Data & Analytics

Toronto, ON, Canada
Desired Skills
Apache Kafka
Data Analysis
DevOps & Infrastructure
Software Engineering
  • You're self-motivated and able to work autonomously. No one's going to be peering over your shoulder here. We count on you to get your work done in ambiguous conditions and under tight deadlines, while still producing high-quality work
  • You are all about collaboration. You enjoy working with different teams across Wave. We follow Scrum practices within an agile framework
  • You value personal and team development. You enjoy mentoring junior engineers as they hone new skills, while helping your team identify engineering priorities and best practices
  • You are a stellar communicator. This means you know how to translate technical terms into non-technical language that your grandma could understand
  • You enjoy the challenge of helping us build and manage a fault-tolerant data platform that scales
  • At least 3 years of experience in data engineering, specifically in building data pipelines and data infrastructure. This is important because this is what you'll be doing most of the time, and we need someone who's done this a lot
  • At least 3 years of experience working with cloud infrastructure, including container development with Kubernetes and Docker, infrastructure as code (IaC) using Terraform and GitOps, or other infrastructure automation on AWS
  • Experience building messaging and stream processing capabilities using Confluent or Amazon MSK (Managed Streaming for Apache Kafka) and its related components
  • Experience working with multi-stage workflows using serverless services
  • Previous experience building data lakes using Delta Lake or Apache Hudi
  • Experience performing hands-on development, leading code reviews and testing, and leveraging automated frameworks
  • Experience developing and deploying solutions using CI/CD processes to orchestrate automated batch and near-real-time (NRT) pipelines running AWS Glue and dbt data transformations
  • Experience using Python, SQL and dbt
  • Experience working with cloud integration tools such as AWS Glue or AWS EMR
  • Working knowledge of data integration tools such as Fivetran, Stitch and Census
  • You're a builder. You'll be responsible for the design, build and deployment of our data pipelines - batch, incremental and stream-based
  • You'll make things better. You will collaborate within a cross-functional team in the planning and roll-out of data infrastructure services
  • You'll build relationships. As a strong software engineer who works with data, you'll have people coming to you for technical assistance, and your outstanding communication skills will help them succeed
  • We love our customers at Wave. Your customers are both internal and external. You can look at existing structures and systems and know how to help our internal customers surface the data they need to excel in serving our external customers
  • You'll drive process and tool improvements to enable data-driven decisions across Wave. Your work will mean something and have an impact on the company - our team relies on data, analytics and ML insights being delivered reliably so we can make smarter business decisions
Desired Qualifications
  • Knowledge and practical experience with Data Vault 2.0 on Redshift or another data warehouse is a definite asset!