
Analytics Engineer

Okta

We are looking for an Analytics Engineer to join our team in Business Technology (BT) as part of a core, centralized Business Intelligence function serving the entire organization. In this role, you will be responsible for modeling data and applying transformations that make it easier for business analysts and other stakeholders to view and understand the data. You will be part of a team doing detailed design, development, and implementation of applications using cutting-edge technology stacks.

Our focus is on building platforms and data models that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering and analytical background, with the ability to tie analytical and engineering initiatives to business impact.

Responsibilities:

  • Configure scalable and reliable data pipelines to consume, integrate, and analyze large volumes of complex data from different sources to support the growing needs of our business
  • Build a data access layer to provide data services to internal and external stakeholders
  • Analyze a variety of data sources, structures, and metadata, and develop mappings, transformation rules, aggregations, and ETL specifications
  • Help define and manage next-generation data lake and master data architectures to enable information transparency for our clients and data quality across systems
  • Proactively develop data architectural patterns to improve efficiency
  • Interface with stakeholders to gather requirements and build functionality
  • Support and enhance existing data infrastructure
  • Build data expertise and data quality for areas of ownership
  • Experiment with different tools and technologies, and share learnings with the team

Qualifications:

  • BS in statistics, mathematics, computer science, software engineering, or a related IT field
  • 3+ years in a data engineering or analytical role
  • 2+ years in the data warehouse and data lake space
  • 3+ years’ experience working with SQL
  • Expertise in data-oriented programming languages such as R and Python
  • Expertise in analytics tools like dbt and data observability platforms like Monte Carlo
  • 2+ years’ experience with relational and columnar MPP databases like Snowflake, Athena, or Redshift
  • 2+ years of experience with database and application performance tuning
  • Experience with ETL tools such as Airflow, Oozie, Luigi, or Informatica
  • Experience with CI/CD tools and procedures such as Jenkins, Git, Chef, and Ansible
  • Experience with cloud infrastructure/platforms (AWS, Azure or Google Cloud Platform)
  • Excellent oral and written communication skills with both technical and non-technical audiences