NeoGames is a leader in the iLottery and iGaming space, offering solutions that span game studios, game aggregation, lotteries, online casino, sportsbook, bingo, and managed services, all delivered through an industry-leading core platform.
The Data & BI team owns the group’s Data & Analytics platforms, spanning Data Engineering, Analytical Engineering, and Business Intelligence, and leads the group’s data-driven modernisation both internally and for its clients.
The Data Engineer will play a vital role in a cross-functional team, developing data pipelines to ingest, transform, distribute, and expose data from the group’s Core Data Lake for integration, reporting, analytics, and automation.
The successful candidate will be passionate about building scalable data models and architecture for consumption by other teams, making it easy for BI, Analytics, Product, and other data consumers to build data-driven solutions, features, and insights.
Responsibilities:
- Create data pipelines that ingest data from disparate sources, with attention to performance, reliability, and monitoring.
- Serve data models as a product to the entire organization, from implementation through debugging.
- Collaborate with other teams to address data sourcing and provisioning requirements.
- Coordinate with the Product & Technology teams to ensure all platforms collect and provide appropriate data.
- Liaise with the other Data & Analytics teams to ensure reporting and analytics needs can be addressed by the central data lake.
- Support the Data Quality and Security initiatives by building the necessary data access, integrity, and accuracy controls into the architecture.
Requirements:
- 3+ years of experience in Data Engineering
- 3+ years of experience in ETL/ELT development in a cloud environment
- Degree in Computer Science, Software Development, Engineering, or a related technical field.
- Proficient in Python and SQL
- Understanding of RDBMS, columnar, and NoSQL engines and their performance characteristics
- Experience with cloud architecture and tools: Microsoft Azure, AWS, or GCP
- Experience with orchestration tools such as Apache Airflow or UC4
- Understanding of distributed logging platforms
- Familiarity with DWH modeling will be considered an advantage.
- Familiarity with DevOps methodologies and concepts will be considered an advantage.
- Background in stream-processing technologies such as NiFi, Kinesis, or Kafka will be considered an advantage.
- Exposure to Java will be considered an advantage.
- Prior exposure to the Snowflake ecosystem will be considered an advantage.
We offer:
- High-level compensation and regular performance-based salary and career development reviews
- The opportunity to work in a large, successful company
- PE accounting and support
- Medical (health) insurance and an employee assistance program
- Paid vacation, holidays, and sick leave
- Sports compensation
- English classes with native speakers, training, and conference participation
- Referral program
- Team-building activities and corporate events