Senior Analytics Engineer

Location

Remote

As a Senior Analytics Engineer, you will build systems that model data in a clean, clear way. These systems consolidate multiple data sources and enable both internal and external stakeholders to answer questions on an ongoing basis. You will work on a small, cross-functional team that may include other engineers, a product manager, a data scientist, and other roles. Success in this role requires the ability to take on ambiguous, complex problems and design and develop innovative solutions.

In this role, you will build the tools which will allow some of the largest brands in the world to better understand their customers and reimagine the shopping experience. Join us in transforming the way that brands reach their customers and empowering consumers to Live Rewarded through the power of Fetch Points!

What you’ll do at Fetch:

  • Model and analyze data utilizing SQL best practices for OLAP / OLTP query and database performance
  • Leverage Data Build Tool (DBT), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices to accomplish your work
  • Generate innovative approaches to datasets with millions of daily active users and terabytes of data
  • Translate business requirements for near-real-time actionable insights into data models and artifacts
  • Communicate findings clearly both verbally and in writing to a broad range of stakeholders
  • Perform administrative duties for Snowflake, Tableau, and DBT/Airflow infrastructure
  • Test, monitor, and report on data health and data quality
  • Lead the charge on data documentation and data discovery initiatives
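The data health and quality work above typically means codifying expectations as automated tests. A minimal sketch of that idea, using Python's built-in `sqlite3` (the table, columns, and check names here are illustrative, not Fetch's actual schema or tooling):

```python
import sqlite3

# Hypothetical example of dbt-style "not_null" and "unique" data quality
# checks. In practice these would run against Snowflake via DBT/Airflow.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE receipts (receipt_id TEXT, user_id TEXT, total_cents INTEGER);
    INSERT INTO receipts VALUES
        ('r1', 'u1', 1299),
        ('r2', 'u2', NULL),   -- null metric: should be flagged
        ('r2', 'u3', 450);    -- duplicated key: should be flagged
""")

def check_not_null(conn, table, column):
    """Count rows where `column` is NULL (analogous to a dbt not_null test)."""
    (n,) = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()
    return n

def check_unique(conn, table, column):
    """Count key values appearing more than once (analogous to a dbt unique test)."""
    (n,) = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()
    return n

null_failures = check_not_null(conn, "receipts", "total_cents")
dup_failures = check_unique(conn, "receipts", "receipt_id")
print(null_failures, dup_failures)  # one NULL metric, one duplicated key
```

Each check returns a failure count; a scheduler such as Airflow would alert when either is nonzero.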

In your Toolbox (Minimum Requirements):

  • 3+ years of professional experience in a technical role requiring advanced knowledge of SQL
  • Understand the difference between SQL that works and SQL that performs
  • Experience with data modeling and orchestration tools
  • Experience with relational (SQL) and non-relational (NoSQL) databases
  • Experience with a variety of data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB)
  • Understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools
  • Experience clearly communicating about data with internal and external stakeholders from both technical and nontechnical backgrounds
  • Ability to thrive in a highly autonomous, matrixed organization and manage multiple, concurrent work streams
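The distinction above between SQL that works and SQL that performs often comes down to whether a predicate can use an index. A minimal sketch using SQLite's query planner (table, column, and index names are illustrative):

```python
import sqlite3

# Two queries that return the same rows; only the sargable one can use the
# index. EXPLAIN QUERY PLAN shows which access path the planner chooses.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, created_at TEXT);
    CREATE INDEX idx_events_created ON events (created_at);
""")

def plan(sql):
    # The fourth column of EXPLAIN QUERY PLAN output is the plan detail text.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

# Works, but wrapping the indexed column in a function forces a full scan.
slow = plan("SELECT * FROM events WHERE date(created_at) = '2024-01-01'")

# Performs: an equivalent range predicate on the bare column is sargable.
fast = plan("SELECT * FROM events WHERE created_at >= '2024-01-01' "
            "AND created_at < '2024-01-02'")

print(slow)  # a table scan
print(fast)  # a search using idx_events_created
```

The same principle applies in Snowflake or Postgres, though each engine reports its plan differently.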

Bonus Points (Preferred Requirements):

  • Proclivity for building and experimenting with different tools and tech, and sharing your learnings with the broader organization
  • Experience developing and maintaining DBT or Airflow in production environments
  • Experience programmatically deploying cloud resources on AWS, Azure, or GCP
  • Experience implementing data quality, data governance, or disaster recovery initiatives
  • Proficiency in at least one imperative programming language (e.g., Python)