Senior Analytics Engineer

Thrive Market

Salary

The base salary range for this position is $150,000 - $190,000 per year.

Thrive Market’s Data Engineering team is expanding, and this time we need an Analytics Engineer to be responsible for the design, development, and maintenance of the data models that power Thrive Market's analytics platforms. As a data professional, your retail domain expertise will be pivotal in driving systems that generate first-party data, and you will provide data solutions that address our challenges and data needs. You will help us execute our Data Engineering & Analytics initiatives and turn them into products that provide great value to our members. In this role, we are hoping to bring in someone who is equally excited about our mission and about learning the tech behind the company, and who has the ability to mentor, lead, and share their expertise with other team members.

Responsibilities:

Data Modeling

  • Create and sustain an Enterprise Data Model (EDM) that acts as a comprehensive framework for both strategic and tactical planning for the management of the enterprise data warehouse
  • Design, implement, and enforce data modeling best practices, standards, and guidelines to ensure consistency across different data sources and projects. This includes applying Thrive Market data naming standards and documenting data model translation decisions
  • Apply a forward engineering approach to design and build databases for OLAP models, coupled with a solid grasp of data warehousing design patterns and modeling techniques
  • Collaborate with business stakeholders and subject matter experts to identify tangible metrics and data entities, and create data models for self-service data and reporting needs
  • Lead data governance activities - metadata management, data lineage, and data dictionary maintenance
  • Work closely with data engineers to validate data model designs and provide guidance during the implementation phase
  • Design and implement data models that support business requirements while ensuring they are optimized for performance and scalability
  • Document data models and data architecture artifacts; develop and maintain a data dictionary and metadata repository
  • Create source-to-target mapping documents to enable data engineers to build data pipelines
  • Conduct codebase reviews of data pipelines & reporting datasets to verify data lineage and validate conclusions drawn from data insights

Data Architecture

  • Facilitate joint Architecture Design sessions to determine data rules and hold logical data model and physical data model reviews with Data stakeholders
  • Identify strategic opportunities to rationalize data sets and views into logical hierarchies that reduce redundancy and data movement
  • Deliver solutions that leverage robust data pipelines and data models to support reporting use cases for the enterprise
  • Troubleshoot and resolve data-related issues and failures in the production environment
  • Guide, educate, and mentor Data Engineers and Data Analysts on Data Architecture Strategy directives, principles, and standards, emphasizing methodology, modeling, and governance to establish best practices across the data team
  • Capture and maintain metadata, creating business rules for the use of data
  • Provide direction, guidance, and oversight for data quality controls by performing data profiling and analysis to understand data quality and completeness
  • Identify opportunities for standardizing data descriptions, integrating and archiving data, and eliminating unnecessary redundancy

Qualifications:

  • Bachelor's degree in Computer Science, Information Systems, or a related field. A master's degree is a plus
  • At least 5 years of experience as a Data Engineer, Analytics Engineer, or Data Architect, designing and implementing data models for enterprise-scale applications
  • Proficient hands-on experience developing and managing data pipelines with Airflow, working with cloud data warehouses such as Snowflake, transforming data with dbt, and building applications and automation in Python
  • Expert in writing, optimizing, and troubleshooting advanced SQL queries
  • Experience with major cloud platforms (AWS, GCP, Azure) and container technologies like Docker for development, deployment, and orchestration
  • Experience with data integration techniques, including ETL/ELT processes, data warehousing, and data lakes
  • Strong understanding of star and snowflake schemas, and related concepts such as data marts and cubes
  • Strong understanding of the analytics development lifecycle, from requirements and logical and physical data model design through implementation, testing, deployment, and support
  • Knowledge of data governance principles and practices, including data quality management and metadata management
  • Excellent analytical thinking and problem-solving skills with the ability to translate complex business needs into actionable data solutions
  • Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams and present technical concepts to non-technical stakeholders
  • Proven ability to thrive in fast-paced, dynamic work environments and effectively manage multiple priorities