Senior Analytics Engineer

GameChanger

Salary

The target salary range for this position is between $160,000 and $200,000.

Location

Remote

GameChanger is looking to add a Senior Analytics Engineer to our growing Analytics Hub team within the Data & Analytics organization (DNA). As part of this centralized team, the Senior Analytics Engineer’s goal is to deliver extensible models and frameworks, enabling faster & deeper analytics across the company. Together with the Data Engineering and Embedded Analytics teams, DNA’s overall mission is to empower everyone at GameChanger to drive meaningful outcomes using data.

In this role, you will wrangle both first-party and third-party data, own the architecture of our data foundation for reporting, analysis, and experimentation, and build foundational models to unlock insights for teams across the company, including product, finance, operations, and engineering. Using Python, SQL, and DBT, you will optimize and transform data from warehouse tables into critical self-serve data artifacts (e.g. metrics, dashboards) that power impactful analytic use cases and empower data consumers across the company.

What You’ll Do:

  • Own the architecture, optimization, and transformation of finance- and product-centric data models in DBT for flexible analysis by Data Analysts / Scientists and self-service by business stakeholders.
  • Design data validations to ensure data integrity throughout our pipelines, and assure the accuracy of our reporting.
  • Create transparency throughout the full data pipeline by establishing foundational processes with upstream data producers and downstream data-consuming tools (BI, Reverse ETL, experimentation).
  • Advance automation efforts that help the team spend less time investigating data issues, and more time doing analyses.
  • Apply software engineering best practices like version control and continuous integration to the analytics code base.
  • Work closely with our Data Engineering, Analytics, Data Science, Diamond (Baseball/Softball) Sports, and Finance teams to build foundational models around new sport-specific features, monthly recurring revenue, and subscription forecasting.
  • Support cross-functional core metrics and data dependencies with leadership visibility.
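The data-validation work described above can be sketched in plain Python. This is an illustrative example only, not GameChanger's actual checks: the column names and rules are hypothetical.

```python
# Illustrative pipeline-validation sketch. Column names ("user_id",
# "event_ts", "revenue") and the rules are hypothetical, chosen only
# to show the shape of a data-integrity check before reporting.

def validate_rows(rows, required_cols=("user_id", "event_ts", "revenue")):
    """Return a list of human-readable issues found in `rows`."""
    issues = []
    for i, row in enumerate(rows):
        # Completeness check: every required column must be populated.
        for col in required_cols:
            if row.get(col) is None:
                issues.append(f"row {i}: missing {col}")
        # Domain check: revenue should never be negative.
        revenue = row.get("revenue")
        if revenue is not None and revenue < 0:
            issues.append(f"row {i}: negative revenue {revenue}")
    return issues

rows = [
    {"user_id": 1, "event_ts": "2024-01-01", "revenue": 9.99},
    {"user_id": 2, "event_ts": None, "revenue": -1.0},
]
print(validate_rows(rows))
```

In practice checks like these would typically live as DBT tests or pipeline assertions rather than ad-hoc scripts, so failures surface before bad data reaches dashboards.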

Who You Are:

  • 5+ years of experience in the Product, Finance, or Operations space as an analytics engineer, data analyst, data engineer, or equivalent.
  • Ability to write complex SQL and ad-hoc data pipelines, plus experience using Python for data analysis.
  • Hands-on experience implementing modern data modeling strategies with DBT, including validating data, building macros, and selecting optimal materializations.
  • Proven record of proactively driving improvements to data warehouse architecture.
  • Comfort with event tracking data and product analytics tools, including familiarity with common event data analyses like funnels and user paths.
  • Experience scaling a data platform’s capabilities while balancing warehouse costs.
  • Complete understanding of dimensional modeling techniques and when to use them.
  • Comfort working in an agile, iterative environment and ability to thrive in a remote-first organization.
  • Demonstrated capacity to clearly and concisely communicate complex business activities, technical requirements, and recommendations among cross-functional stakeholders.
  • Fluent in version control using Git.
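As a rough illustration of the dimensional-modeling and SQL skills listed above, here is a minimal star-schema sketch using Python's stdlib sqlite3. The table and column names are invented for this example and do not reflect GameChanger's warehouse.

```python
import sqlite3

# Minimal star schema: one fact table joined to one dimension.
# Table/column names are invented for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_team (team_id INTEGER PRIMARY KEY, sport TEXT);
CREATE TABLE fct_subscription (
    sub_id INTEGER PRIMARY KEY,
    team_id INTEGER REFERENCES dim_team(team_id),
    monthly_revenue REAL
);
INSERT INTO dim_team VALUES (1, 'baseball'), (2, 'softball');
INSERT INTO fct_subscription VALUES (10, 1, 9.99), (11, 1, 9.99), (12, 2, 4.99);
""")

# Monthly recurring revenue by sport: the kind of self-serve rollup a
# dimensional model makes trivial for downstream consumers.
mrr_by_sport = con.execute("""
    SELECT t.sport, ROUND(SUM(f.monthly_revenue), 2) AS mrr
    FROM fct_subscription f
    JOIN dim_team t USING (team_id)
    GROUP BY t.sport
    ORDER BY t.sport
""").fetchall()
print(mrr_by_sport)
```

Separating descriptive attributes (the dimension) from measurable events (the fact) is what lets one model serve many rollups without reshaping the underlying data.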

Bonus Points:

  • Direct work with the tools in our data tech stack: Airflow, GitHub Actions, AWS, Snowplow, Braze, Fivetran, DBT, Snowflake, BigQuery, Looker, Hex, Kubit, Statsig (or equivalents).
  • Experience with building and maintaining semantic layers and self-service technologies (e.g. metrics layers and LookML) for non-technical end-users.
  • Experience modeling data for statistical inference, machine learning or computer vision driven product features.
  • Experience optimizing query performance within a data lakehouse architecture and/or with materialized views.
  • Familiarity with creating common Software as a Service (SaaS) metrics.
  • Familiarity with modeling data from low-latency and real-time data pipelines.