Senior Data Analytics Engineer
About Vouch:
Insurance... sounds slow, old-fashioned, and unexciting. Exactly. Insurance is broken, and it's failing fast-moving, innovative startups.
Vouch is a new, technology-first insurance company backed by $160M in funding from world-class investors. Like Stripe for payments or Brex for credit cards, Vouch is creating the go-to business insurance for high-growth companies.
We're doing this by making insurance fast, responsive, and focused on our customers: high-growth, innovative companies. Instead of printed PDF applications and week-long waits, Vouch is building new technology to solve real problems, writing policies that actually cover relevant startup scenarios, and designing simple experiences in an otherwise frustrating industry.
What does the work environment look like at Vouch?
Vouch is a Virtual First Workplace with office locations in SF, Chicago, and NYC. This role can be based anywhere in the U.S. as long as you can work our Vouch core collaboration hours (8:30 am-2:30 pm Pacific Time).
The Job:
Vouch is looking for a Senior Data Analytics Engineer to join our team. We’re looking for someone who lives and breathes data and gets excited about data warehouses, building infrastructure to manage fast-changing data sets, and maintaining data quality. You will have the opportunity to contribute to Vouch’s data assets and help shape fundamental aspects of our data-driven culture.
Role Responsibilities:
- Working with our business operations, finance, sales, marketing, product, and insurance teams to design and build data sets to drive Vouch’s business
- Setting up and maintaining timely and reliable ingestion of external data sources via our data loading platforms
- Providing clean, transformed data that is ready to be piped to our product, CRM, finance, underwriting, and other relevant systems
- Clearly documenting data models with source, description, and field definitions for better collaboration, maintainability, and usability
- Establishing engineering best practices and methodologies to ensure data transformations and computations are accurate, efficient, and tested
- Working with our current stack of dbt, Snowflake, Airflow, Python, Stitch, and git (we welcome new ideas; see the illustrative sketch below)
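To give a purely illustrative sense of how these tools fit together, here is a minimal sketch of an Airflow DAG that runs dbt transformations and tests once upstream loads (for example, from Stitch) are expected to have landed in the warehouse. The DAG name, schedule, and project path are hypothetical assumptions for the example, not a description of Vouch's actual pipelines.

```python
# Minimal, illustrative Airflow DAG: build dbt models, then test them.
# All names (dag_id, schedule, project path) are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_dbt_daily",          # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="0 7 * * *",       # daily, after overnight loads are expected to finish
    catchup=False,
) as dag:
    # Build/refresh the warehouse models defined in the dbt project.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",   # hypothetical project path
    )

    # Run dbt tests so bad data fails loudly instead of flowing downstream.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test",
    )

    dbt_run >> dbt_test
```

Running the tests immediately after the build step is one way to keep inaccurate data from reaching the downstream CRM, finance, and underwriting systems mentioned above.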
About You:
- 3+ years of experience developing ETL workflows as a data analytics engineer, data engineer, or data analyst
- Expert in SQL, capable in Python, and experienced with business intelligence tools such as Looker, Mode Analytics, or Tableau
- Experience building business reporting processes (e.g. for finance, sales, or business operations)
- You use software development best practices, with a focus on testing, reliability, and maintainability
- Experience working with cloud-based data warehouses like Snowflake, Redshift, or BigQuery
- Experience with data transformation tooling (dbt is preferred), data pipeline services like Stitch and Fivetran, and orchestration tools like Airflow
- You can effectively talk (and listen) to engineers, designers, executives, and other stakeholders
Nice to Have:
- Exposure to and passion for early-stage startups and/or high-growth environments
- Previous experience working in user segmentation, attribution, funnel conversion, or similar business analytics problems
- A background in insurance or other regulated categories