Analytics Engineer II
Who We Are…
When we say “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next…
From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.
Your New Role…
As an Analytics Engineer, you will contribute to data pipeline, data modeling, and data visualization efforts for the Business Intelligence & Analytics team at Warner Bros. Discovery Streaming. You’re an engineer who understands not only how to use big data to answer complex business questions but also how to design semantic layers that best support self-service tools. You will work closely with cross-functional partners to ensure that business logic is properly represented in the semantic layer and production environments, where it can be used by the wider Warner Bros. Discovery Streaming team to drive business strategy.
Your Role Accountabilities…
- Design and implement data models that support flexible querying and data visualization
- Advance automation efforts that help the team spend less time manipulating & validating data and more time analyzing it
- Contribute innovative ideas to the Analytics Engineering roadmap
- Rapidly deliver on concepts through prototypes that can be presented for feedback
Qualifications & Experience…
- Bachelor's degree or greater in a quantitative field of study (Computer Science, Engineering, Mathematics, Statistics, Finance, etc.) from a top-tier institution
- 3+ years of relevant experience in business intelligence, analytics, and/or data engineering
- Proficiency in writing SQL (clean, fast code is a must) and in data-warehousing concepts such as star schemas, slowly changing dimensions, ELT/ETL, and MPP databases
- Experience in transforming flawed/changing data into consistent, trustworthy datasets, and in developing DAGs to batch-process millions of records
- Experience with general-purpose programming (e.g., Python, Go, Java) and with a variety of data structures, algorithms, and serialization formats
- Advanced ability to build reports and dashboards with BI tools (such as Looker and Tableau)
- Proficiency with Git (or similar version control) and CI/CD best practices
- Ability to write clear, concise documentation, and to communicate generally with a high degree of precision
- Ability to solve ambiguous problems independently
- Ability to manage multiple projects and competing deadlines simultaneously
- Care for the quality of the input data and how the processed data is ultimately interpreted and used