Become part of a team working on some of the most rewarding, large-scale creative projects to be found in any entertainment medium - all within an inclusive, highly motivated environment where you can learn and collaborate with some of the most talented people in the industry.
Rockstar Games is on the lookout for an Analytics Engineer with strong software development skills and a passion for both games and big data.
This is a full-time, in-office position based out of Rockstar’s NYC headquarters in Downtown Manhattan.
WHAT WE DO
• The Rockstar Analytics team provides insights and actionable results to a wide variety of stakeholders across the organization, leveraging data to inform decisions and to measure and improve the success and health of our games.
• We collaborate as a global team to develop cutting-edge data pipelines, data products, data models, reports, analyses, and machine learning applications.
• The Analytics Architecture vertical within the Analytics team is tasked with building high-impact data models and internal tooling to support the global Analytics team while maintaining the foundational infrastructure of Rockstar’s Player Analytics data platform.
RESPONSIBILITIES
• Build, lead, and mentor a team of analytics engineers, sharing technical expertise to support the successful execution of analytics projects.
• Partner with data scientists and business stakeholders to understand analytical needs and translate them into robust data infrastructure and workflows that the global analytics team can leverage to derive insights.
• Collaborate with the data engineering team to architect the overall infrastructure model.
• Design, develop, and maintain scalable data pipelines; monitor and optimize data systems for performance and reliability.
• Collaborate with the Data Integrity team to architect & implement a Data Quality Framework that ensures production data meets SLAs for key stakeholders and business processes.
• Manage timely Root Cause Analysis to troubleshoot data-related issues; assist in the implementation of code and process fixes.
• Provide guidance and collaborate with other team members to continue scaling our architecture to meet future needs. Establish and promote best practices, including code standards, version control, documentation, testing, and review processes.
• Develop and support CI/CD processes using Terraform, GitHub, TeamCity, Octopus, etc.
QUALIFICATIONS
• Bachelor’s degree or equivalent in an engineering or technical field such as Computer Science, Mathematics, or Statistics, or a strong quantitative and software background, preferred.
• 8+ years of experience in analytics engineering or data engineering, with at least 3 years in a management role.
• Proven track record in architecting, building, monitoring, and optimizing large-scale data systems and analytics infrastructure.
• 8+ years of hands-on experience in Python & 4+ years of hands-on experience in PySpark.
• 8+ years of hands-on experience with advanced SQL (including analytical functions), with experience writing and optimizing highly efficient queries.
• Experience working in a Databricks & Azure environment.
• Experience working with pipeline scheduling tools such as Airflow & Astronomer.
• Experience working with CI/CD tools such as TeamCity, Terraform, GitHub, and Octopus.
• Knowledge of software coding practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations.
SKILLS
• Proven ability to reconcile technical and business perspectives.
• Proven ability to develop and maintain good relations and communicate with people at all levels.
• Ability to push the frontier of technology and freely pursue better alternatives.
• Ability to maintain focus and develop proficiency in new skills rapidly.
• Ability to utilize problem-solving skills in a fast-paced environment.
PLUSES
Please note that these are desirable skills and are not required to apply for the position.
• Experience with Python & PySpark.
• Experience with SQL.
• Experience with GitHub.
• Experience with data modeling for data warehousing.