Analytics Engineer
Company Description
Do you get excited about bringing data together for new insights and improved decisions? How about contributing to making San Francisco a better place to live, work, and play? Looking to use your skills to make a positive social impact? Then come join DataSF to empower the use of data in government!
DataSF is a small, growing team working across the City and County of San Francisco. We are a national leader in open data, and our vision is to empower data-driven, impactful decision making and service delivery by making it easy for city departments and the public to share and consume high-quality, trustworthy data and derive actionable insights. We work to streamline data access through light, agile data infrastructure; improve data management and governance; boost capacity to use data through training and data science; and connect it all together empathetically and ethically for the greater good of San Franciscans.
The City is flush with data. Your mission will be to connect, transform, and automate data sharing to support City departments and the people they serve. DataSF offers data/analytics engineering services to departments to ensure the timely and efficient publication of data to the City’s open data platform as well as to support data science. Our data platform is used by department staff and the public to support transparency and equity in services and programs, automate reporting, and develop applications. You’ll also help provide data science services to departments through the development of sustainable analysis pipelines. You can learn more about the work that goes into open data in the four-part blog series on open data operations and on operating a data science program.
Analytics engineering is a critical part of keeping data fresh, standardizing datasets, and offering value-added data transformations to City departments that improve services to the residents of San Francisco. Along with our lead analytics engineer, you will support our engineering services for City departments and help continually improve practices. You will take a supporting role in developing and executing modern analytics engineering patterns for the City. We seek someone who is excited to empower the use of data, enthusiastic about open data, and a continuous learner.
Removing barriers and making it easier for all people to access services or knowledge is a core part of any role at DataSF. Beyond any technical skill set or prior work history, accomplishing this ambitious task requires an empathetic understanding of the diverse array of experiences embodied in San Francisco. Your own life experience is a critical contribution to this effort. DataSF is committed to building a team whose diversity reflects the residents we serve.
This is an exciting position for someone eager to harness the power of data to improve transparency, citizen engagement, and government performance, and who is energized by DataSF’s mission of empowering the use of data in decision making and service delivery.
Job Description
The 1042 Analytics Engineer is responsible for maintaining, developing, and coordinating engineering services to support City data sharing via the City’s data platform and to assist our data science work. You will:
Improve the data services DataSF offers to departments
- Help implement new automation patterns that leverage cloud analytics platforms (including Snowflake and dbt) to publish new datasets to the open data portal more efficiently and reliably (see the sketch after this list)
- Update and improve documentation to support both our own internal operations as well as self-service data automation for other departments
- Continuously assess and help improve our suite of data automation services by evaluating new and emerging technologies, streamlining existing business processes, and identifying opportunities for automation and self-service tooling
- Support the building and deployment of new data services for departments
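For a flavor of what such an automation pattern can look like, here is a minimal sketch. It assumes a dbt project already configured against Snowflake and a Socrata-style portal endpoint; the dataset ID, model name, and environment variable names are hypothetical placeholders, not DataSF’s actual configuration:

```python
# Hypothetical sketch: materialize a dbt model in the warehouse, then
# upsert the published rows to a Socrata-style open data portal.
import os
import subprocess

import requests

PORTAL_DOMAIN = "data.sfgov.org"       # the City's open data portal
DATASET_ID = "xxxx-xxxx"               # hypothetical dataset identifier
DBT_MODEL = "publish_example_dataset"  # hypothetical dbt model name


def build_model() -> None:
    """Run the transformation in the warehouse via the dbt CLI."""
    subprocess.run(["dbt", "run", "--select", DBT_MODEL], check=True)


def publish(rows: list[dict]) -> None:
    """Upsert prepared rows to the portal's row-level JSON endpoint."""
    resp = requests.post(
        f"https://{PORTAL_DOMAIN}/resource/{DATASET_ID}.json",
        json=rows,
        headers={"X-App-Token": os.environ["SOCRATA_APP_TOKEN"]},
        auth=(os.environ["SOCRATA_USER"], os.environ["SOCRATA_PASS"]),
        timeout=60,
    )
    resp.raise_for_status()
```

Keeping the warehouse build and the portal publish as separate steps means either half can be rerun or monitored on its own, which is one reason patterns like this tend to be more reliable than monolithic scripts.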
Build analytics pipelines to support data-driven work
- Work with the team to develop extract, transform, load (ETL) requirements for individual datasets and consult with departments on the best way to automate and publish datasets
- Apply an ethical lens to the appropriate use of data
- Create new analytics pipelines using ETL/ELT approaches according to standards and patterns you help develop and refine (see the ELT sketch after this list)
- Implement analytics pipelines and/or data models to support data science and data analytics work as needed
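As one illustration of the ELT pattern, here is a minimal sketch. It assumes a department exposes records over a JSON HTTP API and that raw rows land in a Snowflake staging table for dbt to transform downstream; the endpoint, table, warehouse, and credential names are hypothetical:

```python
# Hypothetical ELT sketch: extract raw records from a department API and
# land them in a Snowflake staging table; transformation happens in dbt.
import json
import os
from datetime import datetime, timezone

import requests
import snowflake.connector

SOURCE_URL = "https://example.sfgov.org/api/permits"  # hypothetical source


def extract() -> list[dict]:
    """Pull raw records from the department's API."""
    resp = requests.get(SOURCE_URL, timeout=60)
    resp.raise_for_status()
    return resp.json()


def load(rows: list[dict]) -> None:
    """Land raw JSON rows plus a load timestamp in a staging table."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="LOADING",  # hypothetical warehouse
        database="RAW",       # hypothetical database
        schema="PERMITS",     # hypothetical schema
    )
    try:
        conn.cursor().executemany(
            "INSERT INTO stg_permits_raw (payload, loaded_at) VALUES (%s, %s)",
            [(json.dumps(r), loaded_at) for r in rows],
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load(extract())
```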
Maintain existing data pipelines
- Monitor existing data automations developed on our legacy infrastructure (Safe Software’s Feature Manipulation Engine (FME) Server), respond to incidents, and manage updates
- Migrate existing data automations to leverage cloud analytics tools (dbt, Snowflake, etc.)
- Analyze pipeline throughput, issues, and other metrics to inform improvements to the automation platform (see the run-log sketch after this list)
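The metrics analysis this involves can be as simple as the following sketch, which assumes job history has been exported to a CSV; the file name, column names, and status values are hypothetical placeholders:

```python
# Hypothetical run-log analysis: summarize success rate and runtime per
# pipeline to decide which automations to fix or migrate first.
import pandas as pd

runs = pd.read_csv("pipeline_runs.csv", parse_dates=["started_at", "ended_at"])
runs["runtime_min"] = (
    (runs["ended_at"] - runs["started_at"]).dt.total_seconds() / 60
)

summary = runs.groupby("pipeline").agg(
    runs=("status", "size"),
    success_rate=("status", lambda s: (s == "SUCCESS").mean()),
    median_runtime_min=("runtime_min", "median"),
)

# Surface the least reliable pipelines first.
print(summary.sort_values("success_rate").head(10))
```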
Qualifications
1. Possession of an associate degree in computer science or a closely related field from an accredited college or university OR its equivalent in terms of total course credits/units [i.e., at least sixty (60) semester or ninety (90) quarter credits/units with a minimum of twenty (20) semester or thirty (30) quarter credits/units in computer science or a closely related field]; AND
2. One (1) year of experience analyzing, installing, configuring, enhancing, and/or maintaining the components of an enterprise network.
Substitution: Additional experience as described above may be substituted for the required degree on a year-for-year basis (up to a maximum of two (2) years). One (1) year is equivalent to thirty (30) semester or forty-five (45) quarter units, with a minimum of ten (10) semester or fifteen (15) quarter units in computer science or a closely related field.
Desirable Qualifications
Personal Skills
- Excellent oral and written communication skills
- Investigative ability and intellectual curiosity
- Ability to learn and embrace new technologies
- Familiarity with the principles and concepts of open data
- Comfort with risk and trying new things
- Ability to work independently and as part of a small team
- Enjoys collaborative processes and developing shared understanding
- Strong organizational skills
Technical/Knowledge Skills
- Experience in data manipulation and analytical thinking
- Experience writing and maintaining ETL/ELT code, especially creating and deploying through a framework
- Programming proficiency in Python and SQL
Bonus points if you have:
- Experience with Snowflake/dbt
- Familiarity with Microsoft Azure cloud tools and/or Power BI
- Experience configuring, loading data into, and extracting data and insights from customer relationship management (CRM) tools (e.g., Salesforce)
- Experience training non-technical users to use technology to support their work
- Strong quantitative analysis skills
- Strong familiarity with geospatial data and best practices
- Experience translating business needs into technical implementations, including mapping out business processes and data models
- Experience working with a variety of databases, APIs, and formats to extract and transform data