Analytics Engineer
Location
Remote
The Analytics Experience team is committed to empowering other Zepz teams to solve analytical problems with data. We are a team of analytics and data engineers who work together from ideation to production. We own the core semantic layer for the data organization, creating a unified Zepz view of WorldRemit and Sendwave brand data. As a member of this hub-and-spoke data team, you'll be at the forefront of the Zepz data transformation as we continue to expand our core models and assist in the development of individual domain models. By translating complex data into actionable insights, you will empower our brands to make strategic decisions and extend their positive impact on a global scale.
This role reports to the Senior Engineering Manager and collaborates with teams across the Zepz organization. Together you will build new data models and optimize existing ones to drive insights and recommendations from our data. You will be responsible for developing, maintaining, and optimizing data pipelines and analytical solutions that drive business insights and decision-making, working closely with data analysts, data scientists, and other stakeholders to ensure data accuracy, availability, and usability.
As an Analytics Engineer, you will own:
- Developing, testing, and implementing data models in dbt to ensure data integrity and performance.
- Collaborating with cross-functional teams to understand data needs and develop solutions to meet those needs.
- Contributing to and following best practices for data consumption, including educating data consumers on data quality, availability, and interpretation.
- Optimizing existing data processes and pipelines to improve efficiency and reduce latency.
- Identifying opportunities to reduce complexity and increase efficiency across data models and our data warehouse.
- Troubleshooting and resolving data issues, ensuring data quality and reliability.
- Ensuring data and outputs are of high quality: tested, automated, scalable, and documented.
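As one illustration of the responsibilities above, a tested, documented dbt model might look like the following sketch. The model, source, and column names here are hypothetical, not actual Zepz schema:

```sql
-- models/staging/stg_transfers.sql (hypothetical model name)
-- Standardizes raw transfer records into a clean, typed staging model.
select
    id as transfer_id,
    sender_country,
    receiver_country,
    cast(amount as numeric(18, 2)) as amount_usd,
    created_at
from {{ source('raw', 'transfers') }}  -- source declared in a sources .yml file
where id is not null
```

In dbt, the model's description and tests (for example, `unique` and `not_null` on `transfer_id`) would be declared in an accompanying `.yml` schema file, and `dbt build` would run the model and its tests together.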
What we’re looking for from you:
- Experience using dbt to design and implement data models.
- Experience using a modern data warehouse (e.g., Databricks, Snowflake, BigQuery) for scalable and efficient data processing.
- Proficiency in SQL and experience with database technologies (e.g., PostgreSQL, MySQL, Redshift).
- You can work confidently with the tools of modern software engineering: the command line, version control, testing, and code reviews.
- You have experience with orchestration or pipeline tools such as Airflow, dbt Cloud, or Fivetran.
- Familiarity with data visualization tools (e.g., Mode, Tableau, Looker).
- Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
- You're comfortable with, or interested in, reading Python scripts to understand and extract data transformation logic.
- You see yourself as a problem-solver who wants to understand the business problem and communicate the commercial impact alongside the technical solution.
- Strong communication skills and the ability to work effectively in a distributed team environment.
- You are comfortable collaborating across multiple time zones.
- You have an open mind with respect to diversity and inclusivity. Our team (and customers) come from all over the world.