Analytics Engineer

Children's Hospital of Philadelphia

The analytics engineer acts as a bridge between a data engineer and a data analyst. This position is primarily responsible for modeling raw data sets into curated, reusable, trusted data sets that power analytics across the enterprise. These data sets serve as the single source of truth and enable self-service analytics. In addition to developing data models, this role maintains data quality within these data sets through monitoring, testing, and automation. The role also improves the effectiveness of data analysts and data scientists, whether by providing technical expertise in query development, extending data models with new metrics, or consulting on software development practices. The analytics engineer owns the entire data workflow for their domain: data pipeline development, ELT performance, timely loading of data sets, and maintenance.

This role will work within various business units and partner with data analysts and data scientists to obtain a deep understanding of operational data and develop scalable data products that empower data-driven decision making across the enterprise.

Job Responsibilities

  1. Collaborate with business subject matter experts, data analysts, and data scientists to identify opportunities to develop well-defined, integrated, reusable data sets that power analytics.
  2. Codify reusable data access patterns to shorten time to insight.
  3. Perform logical and physical data modeling with an agile mindset.
  4. Build automated, scalable, test-driven ELT pipelines (see the sketch after this list).
  5. Utilize software development practices such as version control via Git, CI/CD, and release management.
  6. Build data products using various visualization, BI, and data science tools.
  7. Collaborate with data engineers, DevOps engineers, and architects on improvement opportunities for DataOps tools and frameworks.
  8. Implement data quality frameworks and data quality checks.
  9. Help define the analytical product roadmap to drive business goals and superior quality outcomes.
  10. Work with data scientists, statisticians, and machine learning engineers to implement and scale advanced algorithms that solve health care, operational, and quality challenges.
  11. Work independently and effectively manage one's time across multiple priorities and projects.
  12. Make recommendations about platform adoption, including technology integrations, application servers, libraries, and frameworks.
  13. Participate in a shared production on-call support model.
  14. Be a critical part of a scrum team in an agile environment, ensuring the team successfully meets its deliverables each sprint.
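
To make responsibilities 4 and 8 concrete, here is a minimal sketch of a test-driven ELT transformation, assuming a dbt-style SQL workflow (dbt appears below under the technical requirements); the model, table, and column names are hypothetical:

    -- models/marts/encounters_daily.sql (hypothetical model)
    -- Curates raw encounter records into a reusable daily summary
    -- that downstream analysts can treat as a single source of truth.
    with encounters as (
        select * from {{ ref('stg_encounters') }}  -- hypothetical staging model
    )
    select
        encounter_date,
        department_id,
        count(*)                   as encounter_count,
        count(distinct patient_id) as unique_patients
    from encounters
    group by encounter_date, department_id

    -- tests/assert_no_negative_counts.sql (hypothetical singular test)
    -- dbt fails the build if this query returns any rows,
    -- codifying a data quality check alongside the model.
    select *
    from {{ ref('encounters_daily') }}
    where encounter_count < 0

In a workflow like this, the tests run on every pipeline execution, keeping data quality checks in version control next to the models themselves.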

Job Responsibilities (Continued)

The department works 80% remotely and 20% on site in our Philadelphia offices on an as-needed basis.

Required Education and Experience

Required Education:

  • Bachelor’s Degree

Required Experience:

  • At least three (3) years of experience in the data and analytics landscape

Preferred Education, Experience & Certifications/Licenses

Preferred Education:

  • Bachelor’s Degree in Computer Science, Informatics, Information Systems, or another quantitative field

Preferred Experience:

  • Six (6) years of experience in the data and analytics landscape
  • One (1) year of experience working with at least one public cloud platform such as AWS, Azure, or GCP

Additional Technical Requirements

  • Strong SQL, data modeling, and data warehousing fundamentals (see the sketch after this list)
  • Experience with software development practices: version control, code review, CI/CD
  • Experience with data integration tools: dbt, Informatica, Microsoft SQL Server Integration Services, etc.
  • Experience with big data tools: Hadoop, Spark, Kafka, Hive, Sqoop, etc.
  • Experience working with business intelligence tools (e.g., Business Objects) or visualization tools such as Qlik, Tableau, Power BI, etc.
  • Experience with stream-processing systems: IBM Streams, Flume, Storm, Spark Streaming, etc.
  • Hands-on experience with the Linux (RHEL/Debian) operating system
  • Ability to code in scripting languages such as Python, Bash, Groovy, etc.
  • Experience consuming and building APIs
  • Experience utilizing Agile methodology for development
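
As a small illustration of the SQL and data modeling fundamentals listed above, one common warehouse pattern is deduplicating an append-only change log down to the latest record per key; the table and column names here are hypothetical:

    -- Keep only the most recent version of each patient record
    -- from an append-only change log (hypothetical table and columns).
    with ranked as (
        select
            *,
            row_number() over (
                partition by patient_id
                order by updated_at desc
            ) as rn
        from raw_patient_updates
    )
    select *
    from ranked
    where rn = 1

Patterns like this are typically codified once as a reusable model or view so that every analyst queries the same deduplicated data set.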