
Analytics Engineer

Remitly

Salary

$128,000.00 – $160,000.00

Location

Seattle, WA, United States of America. Telecommuting is available from anywhere in the United States based on manager approval.

The Analytics Engineer will work with cloud-based application deployment platforms, including Forge, and various Python applications, and will report directly to the Manager of Data Analytics.

You Will:

  • Build the data models that underpin our critical operational and strategic data-driven decisions.
  • Create and support analytical models that solve complex business problems.
  • Design new data architecture and modify existing data architecture to meet future needs.
  • Improve the reliability, efficiency, and quality of the data management required for building business intelligence applications and reports.
  • Develop data dictionaries, data models, and other documentation to support the data infrastructure for the Reconciliation cash flow dataset.
  • Ensure the consistency and accuracy of the Reconciliation cash flow dataset across all reporting systems, resolving data quality issues as they arise.
  • Collaborate with data engineering teams to develop data pipelines and ETL processes that ensure efficient data ingestion and processing.
  • Analyze data to identify trends, patterns, and anomalies, providing recommendations to inform decisions.
  • Communicate complex data insights and technical information to non-technical partners, both verbally and through data visualizations.
  • Help the team set up alarms in the data architecture for faster resolution of issues.

You Have:

  • A Master’s degree in Business Analytics, Computer Engineering, Information Systems, or a related field, plus 2 years of experience in analytics engineering.
  • 1 year of experience with developing and maintaining data warehouse solutions, incorporating star and snowflake schema designs.
  • 1 year of experience with designing, building, and maintaining automated reporting dashboards, and conducting ongoing analysis to enable data-driven decisions across teams.
  • 1 year of experience with source code repositories, including GitHub.
  • 1 year of experience with query and code optimization, improving pipeline performance by reducing runtime and complexity.
  • 1 year of experience working with cross-functional teams, engineers, and product managers to define and manage roadmaps and KPIs.
  • 1 year of experience with creating robust data standards for internal and cross-functional teams.
  • 1 year of experience with non-relational databases and data stores.
  • 1 year of experience with CI/CD, DataOps best practices, DynamoDB, Git, Glue, Lambda, QuickSight, Pandas, PySpark, Python, Scala, Redshift, S3, SQL, and Tableau.