Senior Analytics Engineer
Salary
The base salary range for candidates within the San Francisco Bay Area is $170,000 - $230,000 USD.
The Data team builds and runs distributed systems and tools that support Intercom by empowering people with information. As the company grows, so do the volume and velocity of our data, along with the appetite for increasingly sophisticated and specialized data solutions.
Our team builds, maintains, evolves, and extends the data platform, enabling our partners to self-serve by creating their own end-to-end data workflows: from ingestion, through data transformation and experiment evaluation, to usage analysis and predictive modeling. We provide the data foundation for many highly impactful business and product-focused projects.
We’re looking for a Senior Analytics Engineer who is passionate about making quality data available to our stakeholders to join us and collaborate on data-related initiatives.
We currently operate in a hybrid working model, and this role is open to candidates based in our San Francisco or Dublin, Ireland offices.
What will I be doing?
- Be a “Curator” of the Data Warehouse. Provide clean data sets to end users, modeling data in a way that empowers them to answer their own questions. This includes working on Intercom’s Enterprise Data Model, metrics framework, and other frameworks.
- Optimize dbt models.
- Be a thought leader in refining the ways we develop, test, deploy, organize, and document dbt models and our data warehouse.
- Help evolve the Data Platform by contributing to the design and implementation of the next generation of the stack, from both an infrastructure and a data modeling point of view.
- Collaborate with product managers, go-to-market teams, analysts, and data scientists to build automation and tooling that supports their needs in an environment where dozens of changes can ship daily. This includes quality-of-life tooling that automates away daily toil and helps everyone focus on high-value work.
- Implement systems that monitor what we have built, detecting and surfacing both infrastructure bottlenecks and data quality issues.
What skills do I need?
- You have a very strong understanding of SQL and significant experience in data modeling and warehouse design.
- You have strong professional experience with, or a solid understanding of, the tools and technologies in our stack, such as Snowflake and dbt, or equivalent technologies.
- You have worked with Apache Airflow; we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating Airflow at scale would be helpful.
- You are aware of the importance of data security and are passionate about privacy.
- You can demonstrate the impact of your work.
Bonus skills & attributes
- You have years of full-time, professional experience using a modern programming language on a daily basis, with a strong preference for Python.