Analytics Engineer, Housing Team
The DAHLIA Affordable Housing project allows residents to learn about and apply for affordable housing opportunities in one place. Before DAHLIA debuted, San Franciscans had to monitor multiple sources of information for listings and fill out a different lengthy paper application for each new housing opportunity.
DAHLIA has also freed staff at the Mayor's Office of Housing and Community Development (MOHCD) from much of the manual work previously required to input applications, run the housing lottery, and place applicants in housing units.
Job Description
Over the next 36 months, the Housing Team Analytics Engineer will develop data models and pipelines to support MOHCD's reporting needs related to DAHLIA. They will work with MOHCD staff to understand those needs and build the reporting data infrastructure that underpins reports and other data products. They will also help automate reporting processes and produce documentation that ensures proper use of data and encourages self-service use of data and reporting products.
The Analytics Engineer will be responsible for maintaining, developing, and coordinating data and analytics engineering services in support of MOHCD. They will also help create new datasets and data products to meet MOHCD's reporting needs, starting with affordable rental housing.
Build analytics pipelines and develop data visualizations
- Work with MOHCD staff to understand data and reporting needs.
- Architect, build, and launch efficient and reliable data models and pipelines to create ‘source of truth’ tables and help MOHCD make better use of their data.
- Build tools for auditing, error logging, and validating data (an illustrative sketch follows this list).
- Design and develop data visualizations that meet reporting needs and enable self-serve data consumption.
- Apply an ethical lens to the appropriate use of data.
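By way of illustration only, a minimal sketch of the kind of auditing and validation tooling this role might build before publishing a "source of truth" table could look like the following. The column names and rules here (application_id, listing_id, submitted_at) are hypothetical placeholders, not MOHCD's actual schema.

```python
# Illustrative sketch only: a minimal validation/audit pass over application
# records before they are loaded into a reporting table. Column names and
# rules are hypothetical placeholders, not MOHCD's actual schema.
import logging

import pandas as pd

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dahlia_audit")

REQUIRED_COLUMNS = ["application_id", "listing_id", "submitted_at"]


def validate_applications(df: pd.DataFrame) -> dict:
    """Run basic audit checks and return a summary of issues found."""
    issues = {}

    # 1. Required columns must be present.
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        issues["missing_columns"] = missing
        log.error("Missing columns: %s", missing)
        return issues

    # 2. Application IDs must be unique (one row per application).
    dupes = int(df["application_id"].duplicated().sum())
    if dupes:
        issues["duplicate_application_ids"] = dupes
        log.warning("%d duplicate application_id values", dupes)

    # 3. Key fields must not be null.
    nulls = df[REQUIRED_COLUMNS].isna().sum()
    null_counts = {c: int(n) for c, n in nulls.items() if n > 0}
    if null_counts:
        issues["null_counts"] = null_counts
        log.warning("Null values found: %s", null_counts)

    return issues


if __name__ == "__main__":
    sample = pd.DataFrame({
        "application_id": ["A-1", "A-2", "A-2"],
        "listing_id": ["L-10", "L-10", None],
        "submitted_at": ["2023-01-05", "2023-01-06", "2023-01-06"],
    })
    print(validate_applications(sample))
```

In practice, checks like these would run automatically as part of the pipeline and surface issues to MOHCD staff before they reach reports.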
Improve data services
- Continuously assess and improve our suite of data automation services, including identifying opportunities for self-service, assessing new and emerging technologies, and streamlining existing business processes.
- Work in collaboration with DataSF to evaluate and implement new tools and approaches to improve analytics pipelines.
- Help identify the need for new data services and support the creation and deployment of future services.
- Develop automation patterns that enable existing data systems to leverage cloud-scale analytics platforms safely and securely (the City's open data portal, Snowflake, and dbt); a rough sketch of one such pattern follows this list.
- Provide clear thinking on the tradeoffs of adopting new technologies and approaches, balancing long-term vision with day-to-day operations.
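As a rough, hypothetical sketch of the kind of automation pattern described above, the snippet below pages through a dataset on the City's open data portal (a Socrata/SODA endpoint) and stages it as a CSV file that a downstream Snowflake load and dbt models could consume. The dataset identifier and output path are placeholders, not real resources.

```python
# Rough sketch of an extract-and-stage step in an automated pipeline.
# The dataset ID and output path are hypothetical placeholders; a downstream
# job (e.g., a Snowflake load plus dbt models) would consume the staged file.
import csv
import logging

import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("open_data_extract")

DATASET_ID = "xxxx-xxxx"  # placeholder SODA dataset identifier
ENDPOINT = f"https://data.sfgov.org/resource/{DATASET_ID}.json"
PAGE_SIZE = 1000


def extract(endpoint: str) -> list[dict]:
    """Page through a SODA endpoint and return all rows as dicts."""
    rows, offset = [], 0
    while True:
        resp = requests.get(
            endpoint,
            params={"$limit": PAGE_SIZE, "$offset": offset},
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()
        if not page:
            break
        rows.extend(page)
        offset += PAGE_SIZE
        log.info("Fetched %d rows so far", len(rows))
    return rows


def stage_to_csv(rows: list[dict], path: str) -> None:
    """Write extracted rows to a CSV file for downstream loading."""
    if not rows:
        log.warning("No rows extracted; nothing staged.")
        return
    fieldnames = sorted({key for row in rows for key in row})
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
    log.info("Staged %d rows to %s", len(rows), path)


if __name__ == "__main__":
    stage_to_csv(extract(ENDPOINT), "staged_listings.csv")
```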
Minimum Qualifications
These minimum qualifications establish the education, training, experience, special skills and/or license(s) which are required for employment in the classification. Please note, additional qualifications may apply to a particular position and will be stated on the exam/job announcement.
1. An associate degree in computer science or a closely related field from an accredited college or university OR its equivalent in terms of total course credits/units [i.e., at least sixty (60) semester or ninety (90) quarter credits/units with a minimum of twenty (20) semester or thirty (30) quarter credits/units in computer science or a closely related field]; AND
2. Three (3) years of experience analyzing, installing, configuring, enhancing, and/or maintaining the components of an enterprise network.
Substitution: Additional experience as described above may be substituted for the required degree on a year-for-year basis (up to a maximum of two (2) years). One (1) year is equivalent to thirty (30) semester units / forty-five (45) quarter units, with a minimum of ten (10) semester / fifteen (15) quarter units in computer science or a closely related field.
Desirable Qualifications
Personal Skills
- Enjoys collaborative processes and developing shared understanding.
- Ability to communicate with technical and non-technical audiences.
- Investigative ability and intellectual curiosity.
- Excellent oral and written communication skills.
- Ability to learn and embrace new technologies.
- Demonstrated ability to work with diverse groups of stakeholders.
- Comfort with risk and trying new things.
- Ability to work independently and as part of a small team.
- Familiarity with Agile development methodology and experience working in iterative development cycles.
- Commitment to equity and the use of data to meet the needs of all San Franciscans.
Technical/Knowledge Skills
- Experience in data analytics and data engineering.
- Expertise in at least one programming language for data analysis (e.g. Python, R).
- Strong SQL skills.
- Experience in schema design and dimensional data modeling.
- Experience working with a variety of databases, APIs, and formats to extract and transform data.
- Experience deploying code on cloud-based infrastructure (Azure or AWS preferred).
- Experience training non-technical users to use technology to support their work.
Analytics
- Experience in data cleaning and manipulation for analytics use.
- Strong quantitative analysis skills.
- Strong data visualization skills.
- Experience using business intelligence tools (e.g. Power BI, Tableau) to develop visualizations.
Desirable but not required:
- Experience with Snowflake/dbt.
- Familiarity with Microsoft Azure cloud tools and/or Power BI.
- Experience configuring, loading data into, and extracting data and insights from customer relationship management (CRM) tools (e.g. Salesforce).
Additional Information
Compensation: $132,574 - $166,790 annually
Applicants are encouraged to apply immediately as this recruitment may close at any time but not before January 27, 2023.