in4ge sp. z o.o.
We connect IT specialists with projects that make sense. We work with technologically aware companies and build teams that truly deliver. We focus on #RightPeople.
We are looking for an experienced Data Engineer to join the Data & Analytics team. The ideal candidate will have deep expertise in data engineering, with a focus on building Graph Databases using Neo4j, as well as hands-on experience with data ingestion, optimization techniques, and Google Cloud Platform (GCP) services.
Responsibilities
Design, develop, and optimize Graph Database solutions using Neo4j.
Build and maintain robust data pipelines for large-scale data ingestion and processing on GCP.
Design and optimize data models, schemas, and queries to ensure high performance and scalability.
Develop, maintain, and optimize data workflows using GCP data services such as BigQuery, Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
Implement best practices for data warehousing and data lake architectures.
Collaborate with cross-functional teams to translate business requirements into technical solutions.
Ensure data integrity, quality, and security in all data engineering processes.
Contribute to the continuous improvement of engineering standards, tools, and processes.
Requirements
4+ years of hands-on experience as a Data Engineer, with a minimum of 2 years working specifically with GCP data services.
Proven experience building and optimizing Graph Databases with Neo4j.
Strong proficiency in SQL with expertise in schema design, query optimization, and handling large datasets.
Advanced skills in BigQuery, including complex SQL, partitioning, clustering, and performance tuning.
Hands-on experience with at least one GCP data processing service: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
Proficiency in at least one programming or scripting language such as Python, Java, or Scala.
Solid understanding of data warehousing and data lake concepts and best practices.
Experience with version control systems such as Git.
English proficiency at C1 level (both written and spoken).
Google Cloud Professional Data Engineer (PDE) certification or a similar relevant certification.
Nice to Have
Familiarity with data governance and data security best practices.
Experience working in Agile teams and methodologies.
Knowledge of machine learning pipelines and integration with data platforms.
We offer
Fully remote work with flexible working hours.
Long-term collaboration on a B2B contract.
Opportunity to work on complex cloud projects for international clients.
Professional growth in a highly skilled and supportive team.
Collaborative and open working culture.
💡 Don’t miss out on tailored opportunities!
We have many ongoing recruitments, and new projects are constantly coming in. If you consent to the processing of your data for future recruitment processes, we'll be able to invite you to roles that match your experience and expectations!
PS: We’ll only reach out to you when we have projects that might genuinely interest you — without your consent, we won’t be able to do that.
Our recruitment process is transparent and focused on finding the right candidate for our clients. When you apply, you can count on our objectivity, respect, and full professionalism.
We look forward to receiving your CV.
We connect you with the right people.