Software Data Engineer - Payments Analytics Engineering
Summary
The Apple Media Products Engineering team is one of the most exciting examples of Apple’s long-held passion for combining art and technology. These are the people who power the App Store, Apple TV, Apple Music, Apple Podcasts, and Apple Books. And they do it on a massive scale, meeting Apple’s high expectations with dedication to deliver a huge variety of entertainment in over 35 languages to more than 150 countries. These engineers build secure, end-to-end solutions. They develop the custom software used to process all the creative work, the tools that providers use to deliver that media, all the server-side systems, and the APIs for many Apple services. Thanks to Apple’s unique integration of hardware, software, and services, engineers here partner to get behind a single unified vision. That vision always includes a deep commitment to strengthening Apple’s privacy policy, one of Apple’s core values. Although services are a bigger part of Apple’s business than ever before, these teams remain small, nimble, and multi-functional, offering greater exposure to the array of opportunities here.
Are you interested in a career in data? Data drives the direction and strategy for Apple’s growing Services business. As the Payments Analytics Engineering team, we collect, curate, and provide insights into payments data from Apple's services, Online Store, and Retail organizations. This data plays a critical role in enabling business growth. We build data pipelines with maximum efficiency, scalability, and reliability, allowing domain-specific engineers to focus on their specialties. Are you passionate about communicating effectively with data? Curious how data is manipulated in real time? This is the team to join. You’ll be exposed to modern, open-source technologies that are standard in the big data industry, and will work with data at a scale that few organizations in the world have access to. Our data are used to provide the best customer experience while buying Apple products and to optimize various payment methods for Apple. We’ve even built data products and visualizations consumed by Apple’s external payments partners.
Key Qualifications
- Demonstrated experience in data management and automation in Spark, Hadoop, and HDFS environments
- Experience designing and developing large-scale real-time streaming pipelines using Kafka and Spark Streaming
- Experience with Cloud Kubernetes offerings like AWS EKS
- Experience building and deploying large-scale applications in cloud-based environments
- Experience handling data in relational databases and developing ETL pipelines
- Competency in Java, Scala or similar object-oriented language
- Confidence with SQL databases like Oracle and NoSQL databases like Cassandra
- Experience developing product features, functional specifications, and development schedules, and representing the team and its technology
- Prior experience designing and implementing large distributed systems
- Advocate for performance optimization, automation, and unit testing
- Ability to pick up new technologies quickly
- Excellent debugging, critical thinking, and communication skills
- Attention to detail
- Solid documentation and technical writing skills
Description
The Payments Analytics team is responsible for collecting, analyzing, and reporting on Payments, Apple Pay/Card, and Gift Cards data. From this data we generate insights into how customers interact with Payments products and services, and we use these insights to drive improvements to user-facing features. You will work with a diverse team that values cooperation and brainstorming, with an emphasis on optimized design. You will be responsible for developing systems, tools, and visualizations to make sense of the data. We are looking for a sharp engineer who also has a keen sense of how to build quality, scalable products. You are also a teammate, ready to engage in lively design discussions and able to give and receive constructive code reviews. Your curiosity drives you to explore new technologies and apply creative solutions to problems. The ideal candidate pays close attention to details but also keeps sight of the bigger picture.
Education & Experience
MS or BS degree in Computer Science or a related field
Additional Requirements
- Proficiency with source control systems (SVN, Git) and build tools such as Gradle, Maven, etc.
- Proven hands-on experience with the big data ecosystem (Spark, Hadoop, Scala, Hive, Pig, Kafka, etc.)
- Experience building and deploying large-scale data pipelines
- Experience building AI/ML models and training pipelines is a big plus
- Understanding of different data storage solutions and when to use them (e.g., RDBMS, Cassandra, Solr, Redis)
- Experience implementing and administering logging, telemetry, and monitoring tools like Splunk is a plus
- Experience with cluster management/orchestration software like Aurora or Ansible, and with tools such as Docker, is a plus
- Passion for being part of a tight-knit, fast-moving big data team