Data Engineer
Join us and create cutting-edge pipelines for seamless data transformation!
Kraków-based opportunity with a hybrid work model (2 days/week in the office).
As a Data Engineer, you will be working for our client, a global financial institution that is driving DevOps transformation through data analytics and engineering. You will be part of a team that provides key metrics and analytical products to enhance software engineering practices across the organization. Your role will focus on developing data transformation pipelines, ensuring data quality, and supporting a cloud data platform to improve the overall DevOps experience. You will collaborate with diverse global teams to deliver enriched datasets, dashboards, and insights that enable strategic decision-making.
Your main responsibilities:
- Designing, developing, testing, and deploying data ingest, quality, refinement, and presentation pipelines
- Operating and iterating on a cloud data platform to support internal goals
- Building and maintaining ETL processes and data transformation pipelines
- Ensuring data quality and implementing automated data validation solutions
- Developing data marts and optimizing schema designs for performance and usability
- Collaborating with business stakeholders to understand data needs and deliver actionable insights
- Working with cloud-based big data technologies, particularly Google Cloud Platform (GCP) and BigQuery
- Utilizing orchestration and scheduling tools such as Airflow and Cloud Composer
- Supporting continuous integration and continuous delivery (CI/CD) processes
- Following Agile methodologies and working within a product-oriented culture
You're ideal for this role if you have:
- At least 7 years of professional experience in SQL development
- Strong experience in data engineering and ETL processes
- Expertise in GCP, BigQuery, and dbt (data build tool)
- Hands-on experience with Apache Airflow and Cloud Composer
- Proficiency in data modeling and designing optimized data schemas
- Experience with data streaming technologies such as Kafka
- Familiarity with BI tools, especially Looker Studio
- Understanding of DevOps principles and working in a DevOps environment
- Experience with Continuous Integration and Continuous Delivery (CI/CD) practices
- Strong communication skills and ability to work with global teams
It is a strong plus if you have:
- Experience in building and operating a cloud data platform
- Knowledge of data architecture and data marts
- Proficiency in Git, Shell scripting, and Python
- Ability to quickly learn and adapt to new technologies
- Experience collaborating with technical staff and project managers for efficient delivery
- Proactive approach to identifying improvement opportunities and solving issues
- Comfort in working in fast-paced, changing, and ambiguous environments
We offer you:
ITDS Business Consultants is involved in a wide range of innovative, professional IT projects for international companies in the financial industry across Europe. We offer an environment for professional, ambitious, and driven people. The offer includes:
- Stable and long-term cooperation with very good conditions
- Opportunities to enhance your skills and develop your expertise in the financial industry
- Work on the most strategic projects available on the market
- A career roadmap you define yourself, with the fastest possible development through delivering strategic projects for different ITDS clients over several years
- Participate in Social Events, training, and work in an international environment
- Access to an attractive medical package
- Access to the Multisport program
- Access to Pluralsight
- Flexible hours & remote work
Internal job number: 6763
You can report violations in accordance with ITDS’s Whistleblower Procedure available here.