Senior Data Engineer

Posted 41 days ago
€60,000 + Bonus
Job Reference: 36410
Our client, a leading company in the iGaming industry, is looking for a Senior Data Engineer to join their team in Tallinn on a hybrid basis.

The Senior Data Engineer role is at the heart of collaborative innovation. In this dynamic, agile environment, you'll join forces with developers, architects, and machine learning engineers to lead the charge in integrating cutting-edge third-party data platforms.

Your mission: to design and implement diverse data pipelines that power smarter, more impactful services for our products. As a key player within our scrum teams, you'll tackle complex challenges and manage vast data sets, all while harnessing the latest big data technologies on our advanced platform.

Our client is the digital powerhouse behind one of the world’s leading entertainment giants, with tens of millions of players globally and an unparalleled ecosystem of products that span sports betting, iGaming, and free-to-play experiences.

Main Responsibilities:

  • Design, build, and manage data pipelines.
  • Conduct peer reviews of team members’ code.
  • Participate in stand-ups with key stakeholders, providing updates on development progress and flagging potential risks early on.
  • Create clear and concise documentation as needed.
  • Motivate the team to contribute their unique value to projects while adhering to standards and best practices.
  • Identify and implement enhancements to existing solutions in alignment with the department's strategic goals.
  • Lead technical meetings with both stakeholders and engineers.
  • Regularly assess key processes to ensure they are used effectively, identifying areas for improvement in collaboration with TAs, BAs, POs, and SMEs.
  • Contribute to recruitment efforts and support the onboarding of new team members. 

Desired Experience & Skillset:

  • 4-7 years of experience in software and data engineering.
  • Expertise in Data Warehousing, ETL processes, data pipelines, and Lakehouse architectures.
  • Proficiency in Python; knowledge of Java or Scala is considered a plus.
  • Experience with RDBMS, columnar, and NoSQL databases such as MySQL, PostgreSQL, and Elasticsearch.
  • Practical experience with cloud platforms such as AWS, Microsoft Azure, or GCP.
  • Familiarity with streaming technologies such as Kafka or Pulsar is a valuable asset.