Senior Data Engineer
Quick Summary
Are you a passionate and ambitious Senior Data Engineer ready to dive into an environment that fosters innovation, continuous learning, and professional growth? We're seeking talented individuals who are eager to tackle complex problems, build scalable solutions, and collaborate with some of the finest engineers in the entertainment industry.
- Complex Projects, Creative Solutions: Dive into intricate projects that challenge and push boundaries. Solve complex technical puzzles and craft scalable solutions.
- Accelerate Your Growth: Access mentorship, training, and hands-on experiences to level up your skills. Learn from industry experts and gain expertise in scaling software.
- Collaborate with Industry Leaders: Work alongside exceptional engineers, exchanging ideas and driving innovation forward through collaboration.
- Caring Culture, Career Development: We deeply care about your career. Our culture prioritizes your growth with tailored learning programs and mentorship.
- Embrace Challenges, Celebrate Success: Take on challenges, learn from failures, and celebrate achievements together.
- Shape the Future: Your contributions will shape the future of entertainment.
You’ll be joining the Data Engineering team on an exciting mission to build a top-tier data platform that powers everything we do. We manage data from our in-house products like Stake and Kick, as well as third-party tools and services, centralizing it and ensuring it’s reliable, scalable, and ready to drive smarter decisions across the business.
We’re focused on advanced data tools and services that truly make a difference. Our goal is to help teams unlock the full potential of data, empowering them to create impactful outcomes for our customers and the business. If you’re someone who loves tackling big challenges and shaping the future of data, you’ll fit right in!
Responsibilities
- Architect, design, and optimise scalable pipelines, orchestrating complex workflows and enabling efficient data processing from multiple sources.
- Lead the development of secure, scalable, and cost-efficient data infrastructure using AWS services and Terraform, ensuring engineering best practices in infrastructure automation.
- Collaborate with stakeholders to define and drive strategic data initiatives, including data lakehouse architecture, real-time data processing, and analytics enablement.
- Own and enhance observability practices, implementing robust monitoring, alerting, and logging systems to minimise downtime and optimise performance.
- Develop and execute automated data governance processes to ensure compliance with data lineage, classification, access control, GDPR, and other regulatory standards. This includes masking, encryption, and policy enforcement to protect sensitive data, especially in regulated markets.
- Architect and optimise data pipelines for Change Data Capture (CDC), reconciliation workflows, and reliability enhancements.
- Drive improvements in engineering best practices, including code quality, system design, documentation, and operational efficiency, to foster team efficiency and high-quality outcomes with measurable business value.
- Identify opportunities and implement innovative solutions to enhance data platform capabilities, proactively solving performance bottlenecks and improving data reliability.
- Perform advanced code reviews, mentor engineers, and foster a collaborative, high-performance engineering culture.
Requirements
- A Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related field, or equivalent practical experience.
- 8+ years of experience in data engineering or a related role, with demonstrated expertise in data modelling, ETL/ELT design, database management, and real-time data pipelines.
- Advanced proficiency in SQL, Python, or PySpark, with deep experience in building data pipelines leveraging AWS services such as Glue, Redshift, Kinesis, Lambda, S3, and DMS.
- Experience with orchestration tools (e.g., Apache Airflow), version control systems (e.g., GitHub), and big data technologies such as Spark or Hadoop.
- Proven expertise in data governance, security, and compliance standards, particularly in regulated environments, including managing lineage, classification, access control, and PII.
- Experience designing and implementing modern cloud-based data platforms, preferably on AWS, using Infrastructure as Code (IaC) tools like Terraform.
- Proven ability to lead R&D efforts, exploring emerging technologies and evaluating their applications for system enhancement.
- Knowledge of machine learning concepts and their integration with data platforms.
- Strong problem-solving, analytical, and communication skills for engaging with cross-functional teams and addressing complex technical challenges.
- Demonstrated technical leadership, including mentoring teams, providing strategic guidance, and shaping organisational strategies.
What We Offer
Listing Details
- Posted: March 18, 2026
- First seen: March 26, 2026
- Last seen: April 14, 2026