
Analytics Engineer - Contract

Wave HQ

As an Analytics Engineer reporting to the Senior Manager of Data Platform and Operations, you will build tools and infrastructure that support the analytics and data team as a whole.

We’re looking for a talented, curious self-starter who is driven to solve complex problems and can juggle multiple domains and stakeholders. This highly technical individual will collaborate with all levels of the Data team as well as the various operations teams (e.g. Risk, Finance, Compliance) to provide data solutions, improve our data models as the business scales, and advance Wave to the next stage in our transformation as a data-centric organization.

This role is for someone with proven experience in complicated product environments. Strong communication skills are a must to bridge the gap between technical and non-technical audiences across a spectrum of data maturity.

Here's how you'll make a difference:

  • You’re a builder. You’ll contribute to the design, build, and deployment of a modern Data Vault 2.0 data warehouse, automated with dbt on Amazon Redshift.
  • We love our customers at Wave. Yours are both internal and external: you can look at existing structures and systems and see how to help our internal customers surface the data they need to excel at serving our external customers.
  • You’ll drive process and tool improvements to enable data-driven decisions across Wave. Your work will matter and have an impact on the company: our team relies on reliably delivered insights to make smarter business decisions.

These will help you succeed:

  • At least 3 years of experience in Analytics Engineering (or Business Intelligence), particularly data modeling, transformation, and data engineering. This matters because it’s what you’ll be doing most of the time, and we need someone experienced to do it.
  • Proficiency in dbt is a must; it’s our transformation tool for SQL models.
  • At least 2 years of experience working with cloud infrastructure, including container development with Kubernetes, GitOps or other infrastructure automation. We work on AWS.
  • Experience using dbt to implement an automated data transformation layer in a modern data warehouse running on Redshift, ideally using the Data Vault 2.0 methodology (see the sketch after this list).
  • Comfortable coding in Python. You’re able to read, update, and maintain code that powers our data pipelines.
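
To give you a flavor of this work, here’s a minimal sketch of the kind of dbt model involved: a Data Vault 2.0 style hub of customer business keys, written for Redshift. The model, source, and column names (hub_customer, billing_app, customer_id) are illustrative assumptions, not Wave’s actual schema.

    -- models/raw_vault/hub_customer.sql (hypothetical example, not Wave's actual schema)
    -- A Data Vault 2.0 "hub" holds one row per unique business key,
    -- plus a hash key, load timestamp, and record source.
    {{ config(materialized='incremental', unique_key='customer_hk') }}

    with source_rows as (
        select
            md5(upper(trim(customer_id))) as customer_hk,    -- hash key
            customer_id                   as customer_bk,    -- business key
            getdate()                     as load_dts,       -- load timestamp (Redshift)
            'billing_app'                 as record_source
        from {{ source('billing_app', 'customers') }}        -- assumed source definition
        where customer_id is not null
    )

    select distinct customer_hk, customer_bk, load_dts, record_source
    from source_rows
    {% if is_incremental() %}
      -- on incremental runs, only add business keys not already in the hub
      where customer_hk not in (select customer_hk from {{ this }})
    {% endif %}

In day-to-day work you would write, test, and maintain many models like this one, with dbt automating how they are built and deployed.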

You'll thrive here if:

  • You’re self-motivated and able to work autonomously. We count on you to deliver high-quality work under ambiguous conditions and tight deadlines. It’s fun, promise!
  • You are all about collaboration. You’ll be working with different teams across Wave and prioritizing competing requests with your manager and colleagues using Scrum-based Agile sprints.
  • You value personal and team development. You enjoy helping your team identify what matters most in engineering and adopt best practices.
  • You are a stellar communicator. This means you know how to translate technical terms into non-technical language that your grandma could understand.