Data Engineer
Join us, and solve real-world challenges with smart data solutions!
Kraków- or Warsaw-based opportunity with a hybrid work model (8 days/month in the office).
As a Data Engineer, you will work for our client, a global financial services organization, on a high-impact data platform supporting business decision-making and regulatory compliance. You will contribute to the design and development of scalable data pipelines using big data technologies within an Agile environment. The project involves close collaboration with data analysts, engineers, and business stakeholders to ensure data integrity, system performance, and timely delivery of insights. You will also take part in architectural decisions, mentor peers, and ensure that data engineering best practices are followed.
Your main responsibilities:
- Building and optimizing scalable data pipelines using PySpark and Hadoop components (see the illustrative sketch after this list)
- Designing and developing data processing workflows using Spark, Hive, and SQL
- Collaborating with Business Analysts to interpret and implement data requirements
- Participating in Agile ceremonies including planning, sprint reviews, and retrospectives
- Conducting code reviews and promoting development standards across the team
- Monitoring and troubleshooting production data jobs and workflows
- Integrating scheduling tools such as Airflow to orchestrate workflows
- Implementing automated testing frameworks for data components
- Supporting DevOps processes including CI/CD using Jenkins and Ansible
- Contributing to architectural discussions and technical strategy
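To give a flavour of the day-to-day work, here is a minimal, purely illustrative PySpark sketch of the kind of pipeline described above. The job name, table names, and columns are hypothetical and are not taken from the client's platform.

```python
# Illustrative sketch only - all names (tables, columns, job name) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily_transactions_aggregation")  # hypothetical job name
    .enableHiveSupport()                        # read/write Hive tables on Hadoop/YARN
    .getOrCreate()
)

# Read a hypothetical Hive staging table of raw transactions
raw = spark.table("staging.transactions_raw")

# Basic cleansing and a daily aggregate per account
daily = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("txn_date", F.to_date("txn_timestamp"))
       .groupBy("account_id", "txn_date")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.count("*").alias("txn_count"),
       )
)

# Write partitioned output back to Hive for downstream reporting
(daily.write
      .mode("overwrite")
      .partitionBy("txn_date")
      .saveAsTable("analytics.transactions_daily"))

spark.stop()
```

In practice, a job like this would typically be scheduled as an Airflow task and promoted through CI/CD (e.g. Jenkins), as noted in the responsibilities above.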
You're ideal for this role if you have:
- 5+ years’ experience in data engineering within Agile and DevOps environments
- Strong proficiency in PySpark or Scala development
- Hands-on experience with Apache Spark, Hive, Hadoop, and YARN
- Solid knowledge of SQL and ETL frameworks
- Experience using version control tools like Git/GitHub
- Proficiency with workflow orchestration tools such as Airflow
- Strong understanding of big data modeling with relational and non-relational databases
- Familiarity with RESTful APIs and integration techniques
- Ability to work on Unix/Linux platforms
- Strong problem-solving skills with experience in debugging data pipelines
It is a strong plus if you have:
- Experience with Elasticsearch
- Knowledge of Java APIs and backend integration
- Exposure to data ingestion frameworks and practices
- Familiarity with cloud architecture and design patterns
- Understanding of Agile methodologies such as Scrum and Kanban
We offer you:
ITDS Business Consultants is involved in a wide range of innovative, professional IT projects for international companies in the financial industry across Europe. We offer an environment for professional, ambitious, and driven people. The offer includes:
- Stable, long-term cooperation on very good terms
- Opportunities to enhance your skills and develop expertise in the financial industry
- Work on the most strategic projects available on the market
- A clear career roadmap and fast professional development through strategic projects delivered for different ITDS clients over several years
- Participation in social events and training in an international work environment
- Access to an attractive medical package
- Access to the Multisport program
- Access to Pluralsight
- Flexible hours & remote work
Internal job number #6839
You can report violations in accordance with ITDS’s Whistleblower Procedure available here.