Analytics Engineer

Fetch

Scope of Responsibilities:

  • Model and analyze data using SQL best practices for OLAP/OLTP query and database performance
  • Leverage Data Build Tool (DBT), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices to accomplish your work
  • Generate innovative approaches to datasets with millions of daily active users and terabytes of data
  • Translate business requirements for near-real-time actionable insights into data models and artifacts
  • Communicate findings clearly both verbally and in writing to a broad range of stakeholders
  • Perform administrative duties for Snowflake, Tableau, and DBT/Airflow infrastructure
  • Test, monitor, and report on data health and data quality
  • Lead the charge on data documentation and data discovery initiatives

The ideal candidate:

  • Is proficient in SQL and understands the difference between SQL that works and SQL that performs
  • Has worked with data modeling and orchestration tools
  • Has experience with relational (SQL), non-relational (NoSQL), and/or object data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB)
  • Has a solid understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools
  • Has prior experience communicating clearly about data with internal and external customers
  • Is highly motivated to work autonomously, with the ability to manage multiple work streams
  • Is interested in building and experimenting with different tools and technologies, and in sharing learnings with the broader organization
  • Has developed and maintained DBT or Airflow in production environments
  • Has experience programmatically deploying cloud resources on AWS, Azure, or GCP
  • Has successfully implemented data quality, data governance, or disaster recovery initiatives
  • Is proficient in at least one imperative programming language (e.g., Python)