We are looking for a qualified Data Engineer to join our team. You will design and maintain ETL/ELT processes and work with large datasets to ensure their quality and consistency.
Key Responsibilities:
- Design, develop, and maintain ETL/ELT processes that integrate data from various sources and databases.
- Automate data processing workflows to increase efficiency.
- Ensure data consistency for accurate analysis.
- Monitor and resolve data quality issues.
- Collaborate with the analytics team to define data requirements.
- Explore and implement new tools and methods for data processing.
- Work closely with the DevOps team to implement reliable data infrastructure solutions.
Requirements:
- At least 3 years of experience as a Data Engineer.
- Experience with ETL tools for large-scale data processing.
- Experience working with large data volumes (100+ billion records).
- Knowledge of ClickHouse and SQL/NoSQL databases, including optimization techniques.
- Strong proficiency in Python.
- Experience with Git-based version control (e.g., GitLab).
- Experience with orchestration tools such as Apache Airflow, and with the dbt framework.
- A desire for professional development and learning.
- Ability to work independently and as part of a team.
- Initiative and flexibility in solving tasks.
Preferred Qualifications:
- Knowledge of streaming technologies (e.g., Apache Kafka).
- Experience integrating with external APIs for data collection.
- Experience with DataHub.
We Offer:
- A strong international team;
- Work on a product used by 3.5 million users monthly;
- Competitive salary, paid in euros, plus additional benefits;
- Hybrid work format;
- English language courses.
If you are ready for new challenges and want to be part of our team, please send us your resume. We look forward to hearing from you!