Regular Data Engineer GCP
Location and work model: Warsaw – hybrid, at the client’s office
Rate: 115–135 PLN/hour net (B2B contract)
Start date: ASAP
Engagement: Long-term
Form of cooperation: B2B, full-time
Working hours: Standard business hours
About the Client:
Our client is one of the leading telecommunications operators in Poland, providing mobile services, internet access, and modern digital solutions. The company is part of an international group and consistently invests in innovation, digital transformation, and technological development to maintain a strong market position.
Project Overview:
The project involves migrating a Data Warehouse from Oracle to the Google Cloud Platform (GCP) environment. The key objective is to build a modern cloud-based data architecture while ensuring high data quality, scalability, and security.
Requirements:
Commercial experience working with GCP,
Minimum 3 years of experience as a Data Analyst, Data Quality Analyst, or in a similar role within a data-driven organization,
Experience in Data Quality and Data Governance,
Very good knowledge of SQL and PL/SQL,
Strong proficiency in Python,
Experience working with Linux systems and writing Bash scripts,
Familiarity with the Cloudera Hadoop ecosystem (Apache Spark, Apache Kafka),
Knowledge of ETL processes and real-time data processing.
Nice to Have:
Experience with CI/CD and automation tools,
Knowledge of Scala,
Experience in metadata management and data quality frameworks,
Experience working with various data sources (Kafka, MQ, SFTP, databases, APIs, file shares),
Participation in international projects,
Ability to translate technical concepts into business language,
Independence, attention to detail, and a proactive mindset,
English proficiency at a minimum B2 level.
Responsibilities:
Building and maintaining Data Lake ingestion processes from multiple data sources,
Designing, developing, and optimizing complex data pipelines (batch and real-time),
Creating and enhancing frameworks supporting data pipeline development and maintenance,
Implementing comprehensive tests for data processing workflows,
Collaborating with analysts and data scientists to ensure high data quality,
Ensuring Data Governance standards, security, and regulatory compliance,
Evaluating and implementing new technologies to improve performance and stability,
Integrating data from systems such as Kafka, MQ, SFTP, databases, APIs, and file storage systems.
Technologies Used in the Project:
Cloudera Hadoop stack (Apache Kafka, Apache Spark),
Google Cloud Platform (GCP),
SQL / PL/SQL, Python, Scala,
Linux, Bash, ETL,
SFTP, MQ, APIs, file shares.
We Offer:
Up to 135 PLN/hour net on a B2B contract,
Flexible payment method,
Short 14-day invoice payment term,
Comprehensive private healthcare package,
Access to the MyBenefit cafeteria platform (including Multisport cards and prepaid cards for IKEA, Zalando, Notino, and many other retailers).