Senior Analytics Engineer

Wave HQ

We believe small businesses are at the heart of our communities, and championing them is worth fighting for. We empower small business owners to manage their finances fearlessly by offering the simplest, all-in-one financial management solution they can't live without.

Wavers are a special breed. We could write a novel on our award-winning culture and what makes us, us. However, we’d rather just show you. Take one minute and forty-eight seconds out of your day to get a quick glimpse of why you should join us.

About the Role:

Reporting to the Senior Manager of Data Platform and Operations, as a Senior Analytics Engineer you will build tools and infrastructure that support the analytics and data team as a whole.

We’re looking for a talented, curious self-starter who is driven to solve complex problems and can juggle multiple domains and stakeholders. This highly technical individual will collaborate with all levels of the Data team, as well as the various operations teams (e.g., Risk, Finance, Compliance), to provide data solutions, improve our data models as the business scales, and advance Wave to the next stage of its transformation into a data-centric organization.

This role is for someone with proven experience in complex product environments. Strong communication skills are a must: you’ll bridge the gap between technical and non-technical audiences across a spectrum of data maturity.

Here's how you'll make a difference:

  • You’re a builder. You’ll be responsible for the design, build, and deployment of a modern Data Vault 2.0 data warehouse automated using dbt on Amazon Redshift (see the sketch after this list).
  • You’ll make things better. You will collaborate with a cross-functional team in the planning and roll-out of our Segment Customer Data Platform.
  • Build relationships. As a subject matter expert in all things BI, you’ll have people coming to you for help, and your outstanding ability to communicate will help them succeed.
  • We love our customers at Wave. Your customers are internal as well as external. You can look at existing structures and systems and see how to help our internal customers surface the data they need to excel at serving our external customers.
  • You’ll drive process and tool improvements that enable data-driven decisions across Wave. Your work will have a real impact: our team relies on reliably delivered insights to make smarter business decisions.
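
For a flavor of the Data Vault 2.0 build, here’s a minimal Python sketch of the hash-key derivation that keys a hub table. It’s illustrative only: the function name and the sample business key are made up for this posting, and in a dbt project this logic would typically live in a macro rather than in standalone Python.

    import hashlib

    def hub_hash_key(*business_keys: str, delimiter: str = "||") -> str:
        # Data Vault 2.0 convention: trim and uppercase each business
        # key, join with a fixed delimiter, then hash, so the same
        # business key always resolves to the same hub key.
        normalized = delimiter.join(k.strip().upper() for k in business_keys)
        return hashlib.md5(normalized.encode("utf-8")).hexdigest()

    # Illustrative usage: key a customer hub row on its business key.
    print(hub_hash_key("cust-00042"))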

You'll thrive here if:

  • You’re self-motivated and able to work autonomously. No one’s going to be peering over your shoulder here. We count on you to deliver high-quality work under ambiguous conditions and tight deadlines. It’s fun, promise!
  • You are all about collaboration. You’ll be working with different teams across Wave and prioritizing competing requests with your manager and colleagues using Scrum-based Agile sprints.
  • You value personal and team development. You enjoy mentoring junior engineers as they hone new skills, while helping your team identify engineering priorities and best practices.
  • You are a stellar communicator. This means you know how to translate technical terms into non-technical language that your grandma could understand.

These will help you succeed:

  • At least 5 years of experience in Analytics Engineering (or Business Intelligence), particularly data modeling, transformation, and data engineering. This is what you’ll be doing most of the time, so we need someone experienced at it.
  • At least 3 years of experience working with cloud infrastructure, including container development with Kubernetes, Kafka data streams, and IaC with Terraform and GitOps or other infrastructure automation. We’re on AWS, but Azure and GCP experience is fine too.
  • Experience using dbt to implement an automated data transformation layer in a modern data warehouse, ideally using the Data Vault 2.0 methodology running on Redshift. Experience with other MPP data stores like Snowflake and BigQuery is also fine.
  • Experience developing models, Explores, and dashboards in Looker would be ideal, but Tableau or Power BI experience is also fine.
  • Comfortable coding in Python. You’re able to read, update, and maintain code that powers our data pipelines (see the sketch after this list).
  • Experience working with cloud integration tools such as AWS Glue, Amazon EMR, GCP Dataflow, or Azure Data Factory is a plus.
  • Familiarity with financial and US state-level regulatory requirements, along with knowledge of KYC/KYB and UAR compliance processes, is a welcome bonus.
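
To give a concrete sense of the Python comfort we mean, here’s a minimal sketch of the kind of transformation step you might read and maintain in one of our pipelines. The file layout and column names are hypothetical, not taken from our codebase.

    import csv
    from pathlib import Path

    def normalize_invoices(src: Path, dst: Path) -> None:
        # Hypothetical cleanup step: coerce dollar amounts to integer
        # cents and drop rows that are missing a customer id.
        with src.open(newline="") as fin, dst.open("w", newline="") as fout:
            reader = csv.DictReader(fin)
            writer = csv.DictWriter(fout, fieldnames=["customer_id", "amount_cents"])
            writer.writeheader()
            for row in reader:
                if not row.get("customer_id"):
                    continue  # skip unkeyed rows rather than guessing
                writer.writerow({
                    "customer_id": row["customer_id"],
                    "amount_cents": int(round(float(row["amount"]) * 100)),
                })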