We are a leading trading platform that is ambitiously expanding to the four corners of the globe. Our top-rated products have won prestigious industry awards for their cutting-edge technology and seamless client experience. We deliver only the best, so we are always in search of the best people to join our ever-growing talent team.
As a Data Engineer, you will play a crucial role in our data team, contributing to developing and maintaining data pipelines and systems. You will work closely with data scientists, analysts, and other stakeholders to ensure the availability and reliability of data for decision-making and analytics. Your responsibilities will encompass data ingestion, transformation, and delivery, as well as maintaining and optimizing data infrastructure.
The Mid-level Data Engineer is expected to contribute to developing and maintaining data infrastructure, ensuring data reliability and availability. You should be able to work independently and as part of a team, adapting to changing data needs and collaborating with data professionals to provide valuable insights to the organization. Staying current with data engineering best practices and emerging technologies is essential for success in this role.
Key responsibilities:
- Design, develop, and maintain data pipelines that ingest, process, and deliver data from various sources, ensuring data quality and reliability.
- Create and maintain data models to support reporting, analytics, and business intelligence needs, optimizing data structures for performance and efficiency.
- Implement ETL processes to transform raw data into meaningful insights, handling data transformation, aggregation, and enrichment.
- Monitor and address data quality issues, implement data validation processes, and establish data governance practices.
- Manage and optimize data storage, processing, and distribution systems, ensuring scalability and performance.
- Collaborate with data scientists, analysts, and cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Document data engineering processes, pipelines, and systems to maintain clear and accessible knowledge for team members.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
- Minimum of 3 years of experience in data engineering or a related field.
- Proficiency in Python for data pipeline development and scripting.
- Experience with Apache Airflow for workflow orchestration.
- Strong knowledge of Spark and Databricks for big data processing.
- Proficiency in AWS cloud services, including S3, EC2, and EMR.
- Experience with both SQL and NoSQL databases, particularly Postgres and Redshift.
- Familiarity with Kafka for real-time data streaming.
- Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI, or Looker).
- Project management skills using Jira or similar tools.
What you will get in return:
- You will join a company that cares about work-life balance
- Annual bonus based on personal performance
- Family Medical Insurance, Pension fund, and Multisport card for CoE
- Full annual performance assessment
- Modern and outstanding equipment
- Employee referral program
- Additional paid days off and the opportunity to work with one of the smartest teams on the market.
If this sounds interesting to you, feel free to apply, and we will reach out to you! :)