The Role
We are seeking expert, hardworking engineers to join our collaborative and innovative team. Zynga’s mission is to “Connect the World through Games” by building truly social experiences that make the world a better place. The ideal candidate has a strong focus on building high-quality, maintainable software with global impact.
The Analytics Engineering team is responsible for all things data at Zynga. We own the full game and player data pipeline, from ingestion to storage to driving insights and analytics. As a Software Engineer, you will own the design and development of high-quality services and products that support the analytics needs of our games. In this role, you will be part of our Analytics Engineering group, focused on building scalable data infrastructure and end-to-end services used across our games. We are a 120+ person organization serving 1,500 colleagues across 13 global locations.
Your responsibilities will include
- Build and operate a multi-PB-scale data platform.
- Design, develop, and ship new features, bug fixes, and enhancements to systems and data pipelines (ETLs) while adhering to SLAs.
- Identify anomalies and inconsistencies in data sets and algorithms, flag them to the relevant team, and/or fix bugs in the affected data workflows where applicable.
- Follow best engineering practices to ensure performance, reliability, scalability, and measurability.
- Collaborate effectively with teammates, contributing to an innovative environment of technical excellence.
You will be a perfect fit if you have
- Bachelor’s degree in Computer Science or a related technical discipline (or equivalent experience).
- 2+ years of data engineering design/development experience building large-scale, distributed data platforms/products.
- Advanced coding expertise in SQL and in Python or a JVM-based language.
- Exposure to heterogeneous data storage systems, such as relational, NoSQL, and in-memory stores.
- Knowledge of data modeling, lineage, data access, and data governance.
- Proficiency with AWS services such as Redshift, Kinesis, Lambda, RDS, and EKS/ECS.
- Exposure to open-source software and frameworks (e.g., Airflow, Kafka, DataHub).
- Demonstrated ability to deliver work on time with attention to quality.
- Excellent written and spoken communication skills, and the ability to work effectively in a geographically distributed team.
We encourage you to apply even if you don’t meet every single requirement. Your unique perspective and experience could be exactly what we’re looking for.