Data Engineer Middle/Senior
NeoGames

Wrocław
Type of work: Full-time
Experience: Mid
Employment Type: Permanent
Operating mode: Remote

Tech stack

  • Airflow – regular
  • Python – regular
  • SQL – regular
  • DBT – regular
  • Cloud Services – regular

Job description


NeoGames is a leader in the iLottery and iGaming space, offering solutions spanning game studios, game aggregation, lotteries, online casino, sportsbook, bingo, and managed services, all delivered through an industry-leading core platform.

The Data & BI team owns the group’s Data & Analytics platforms, spanning Data Engineering, Analytical Engineering, and Business Intelligence, to lead the group’s data-driven modernisation both internally and for its clients.

The Data Engineer will play a vital role as part of a cross-functional team, developing data pipelines to ingest, transform, distribute, and expose data from the group’s Core Data Lake for integration, reporting, analytics, and automation.

The chosen candidate should be passionate about building scalable data models and architecture for consumption by other teams, making it easy for BI, Analytics, Product, and other data consumers to build data-driven solutions, features, and insights.


Responsibilities:

  • Create data pipelines – both batch and real-time – to ingest data from disparate sources
  • Collaborate with the other teams to address data sourcing and provision requirements
  • Design and maintain robust, recoverable data pipelines following best practices, with attention to performance, reliability, and monitoring
  • Innovation drives us - carry out research and development and work on PoCs to propose, trial and adopt new processes and technologies
  • Coordinate with the Product & Technology teams to ensure all platforms collect and provide appropriate data
  • Liaise with the other teams to ensure reporting and analytics needs can be addressed by the central data lake
  • Support the Data Quality and Security initiatives by building into the architecture the necessary data access, integrity, and accuracy controls


Requirements:

  • 3+ years of experience in Data Engineering
  • Degree in Computer Science, Software Development or Engineering
  • Proficient in Python. Past exposure to Java will be considered an asset
  • Understanding of RDBMS, Columnar and NoSQL engines and their performance characteristics
  • Experience with cloud architecture and tools: Microsoft Azure, AWS, or GCP
  • Experience with orchestration and transformation tools such as Apache Airflow and dbt
  • Prior exposure to the Snowflake ecosystem will be considered an asset
  • Familiarity with Docker/Kubernetes and containerisation
  • Strong background in stream data processing technologies such as NiFi, Kinesis, Kafka
  • A grasp of DevOps concepts and tools, including Terraform and Ansible, is an advantage
  • Understanding of distributed logging platforms – ideally the ELK stack


Skills:

  • Fluency in spoken and written English is essential
  • Passionate about data and on the lookout for opportunities to optimise
  • Passionate about technology and eager to trial and recommend new tools or platforms


We offer:

  • High-level compensation with regular performance-based salary and career development reviews
  • The opportunity to work in a large, successful company
  • PE accounting and support
  • Medical insurance and an employee assistance program
  • Paid vacation, holidays, and sick leave
  • Sports compensation
  • English classes with native speakers, training, and conference participation
  • Referral program
  • Team-building activities and corporate events