Analytics Engineer
We are looking for an Analytics Engineer to join our Data team. The Data team is responsible for building a trusted Data Practice at DroneDeploy: self-service analytics and reporting for non-experts on a single-source-of-truth data warehouse covering Product, Sales, and Marketing data. We also build and maintain customer-facing embedded dashboards. We do this by applying software engineering best practices to the production and maintenance of analytics code that cleans and transforms raw data into consumable information and business logic on our modern data stack (Fivetran, BigQuery, dbt, GitHub, Datafold, Tableau, Hightouch).
As a key contributor to our growing team, you will play a pivotal role in shaping the Data Culture and Processes at DroneDeploy. You will take ownership of Data Modeling for our Data Warehouse and Data Marts, build and maintain Data Pipelines, and have opportunities to contribute to building Dashboards and analyzing results. We are seeking someone with aspirations to lead the team in the future, overseeing our Agile SCRUM processes and playing a key role in the team's growth and management over time.
You'll be successful in this role if you're a strong analytical thinker who enjoys developing scalable data models for end consumers. You will need to be comfortable in a startup environment, juggling requests from both technical and non-technical stakeholders. You’ll also need to be an effective communicator, sharing results in a clear and concise way, and pushing for clear ownership and next steps on key projects.
We value initiative and taking ownership of problems.
Responsibilities
- Partner with Sales, Marketing, Product and Finance stakeholders to translate questions into scalable data models in our Data Warehouse/Marts (we use dbt)
- Write well-documented, human readable and optimized SQL (we use BigQuery)
- Manage our Development Process (Agile SCRUM in 2-week sprints)
- Manage projects through clear role assignments, following up on work in play, and facilitating Roadmap Planning (we use RICE scoring and OKRs)
- Develop in local environments with Version Control (we use GitHub, Docker and VS Code)
- Write and champion documentation for complex models, business logic, and style guides (dbt Docs, Confluence)
- Run queries to answer ad-hoc questions and create reports and other visualizations for stakeholders (we use connected G-Sheets, Data Studio, and Tableau, and are open to Hex/Jupyter Notebooks, Mode, etc.); see the sketch after this list
- Manage ETL/ELT Pipeline (we use Fivetran/Segment)
- Assist in the building of reports and dashboards using data visualization and BI tools (we use Tableau)
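
For illustration, here is a minimal sketch of the kind of ad-hoc query work described above, assuming local Google Cloud credentials are configured; the project, dataset, table, and column names are placeholders, not our actual schema:

```python
# Hypothetical ad-hoc query against a dbt-built mart in BigQuery.
# Assumes credentials are set up locally (e.g. `gcloud auth application-default login`).
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")  # project ID is a placeholder

# Example stakeholder question: weekly active organizations by plan tier.
# The mart and columns below are illustrative only.
sql = """
    select
        date_trunc(activity_date, week) as activity_week,
        plan_tier,
        count(distinct organization_id) as active_orgs
    from analytics.fct_product_activity
    where activity_date >= date_sub(current_date(), interval 90 day)
    group by 1, 2
    order by 1, 2
"""

# Pull results into pandas for a quick share-out (requires pandas and db-dtypes).
df = client.query(sql).to_dataframe()
print(df.head())
```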
Requirements
- A degree in a quantitative/analytical discipline: statistics, operations research, computer science, informatics, engineering, applied mathematics, economics, etc.
- 3+ years of experience as a Data/Analytics Engineer
- Expert in SQL (ideally BigQuery syntax) and dbt
- Experience collaborating with Product and/or Go-To-Market teams.
- Excellent written and verbal communication skills.
- Experience successfully managing complex projects from understanding requirements through to delivery and documentation.
- Familiarity with Agile Methodologies and processes (we use Agile/SCRUM in JIRA)
- Able to work core team hours (Monday - Friday, 9am to 4pm PT)
- Able to travel domestically for work (training, meetings, company events)
Bonus Skills, Experience, Interests
- Familiarity with NoSQL (we use MongoDB for our primary Product Database)
- Understanding of CI/CD best practices (we have GitHub webhooks -> dbt Cloud, and Datafold for observability)
- Experience working with data visualization tools such as Tableau, Data Studio, Sisense, Looker, Mode, Metabase, Superset, etc.
- Solid programming skills with Python data stack libraries and tools such as pandas, NumPy, and Matplotlib, and familiarity with Jupyter Notebooks for exploratory data analysis (a brief sketch follows this list)
- Proficient in Git
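
As a small illustration of the bonus Python skills, a minimal exploratory-analysis sketch in a Jupyter Notebook might look like the following; the CSV export and column names are hypothetical (e.g. an export of the weekly-active-orgs query shown earlier):

```python
# Minimal exploratory data analysis sketch; file and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

# Load a hypothetical export of weekly active organizations by plan tier.
df = pd.read_csv("weekly_active_orgs.csv", parse_dates=["activity_week"])

# Quick summary checks before any reporting.
print(df.describe(include="all"))

# Pivot to one column per plan tier and plot the trend.
pivot = df.pivot_table(
    index="activity_week", columns="plan_tier", values="active_orgs", aggfunc="sum"
)
pivot.plot(figsize=(10, 4), title="Weekly active organizations by plan tier")
plt.tight_layout()
plt.show()
```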