#1 Job Board for the tech industry in Europe

Data Engineer

52 USD/h net per hour - B2B
Type of work: Full-time
Experience: Senior
Employment Type: B2B
Operating mode: Remote

Tech stack

• English: C1
• AWS: advanced
• Google Cloud Platform: advanced
• Hadoop: advanced
• Cloudera: advanced
• Azure: advanced
• PySpark: regular
• Java: regular
• Python: regular
• Scala: regular
• Big Data: regular

Job description

Information about the project:

Rate: depending on expectations

Location: Cracow - hybrid/remote

Industry: banking


We are seeking a Data Engineer to join our PSM Engineering team. The ideal candidate is innovative, has a strong drive for continuous improvement in engineering best practices, deep technical expertise across a range of technologies, and a passion for learning. Experience delivering software and technology projects using Agile methodologies is essential. Candidates should be able to demonstrate contributions to critical business applications, ideally customer-facing, and to communicate complex ideas effectively to non-expert audiences. Familiarity with emerging technologies in finance will be highly regarded.


Main Responsibilities:

You will be responsible for creating data pipelines and supporting the data engineering lifecycle effectively. Key responsibilities include:

• Develop and maintain robust data pipelines for data ingestion, transformation, and serving.
• Apply modern software engineering principles to deliver clean, tested applications.
• Collaborate with cross-functional teams to identify and solve engineering problems.
• Migrate on-premise solutions to cloud ecosystems as required.
• Utilize strong programming skills in Python and related technologies.
• Ensure effective data modeling and schema design practices.
• Manage CI/CD pipelines using tools like Jenkins and GitHub Actions.
• Experiment with emerging technologies and methodologies in a fast-paced environment.
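For candidates unsure what "supporting the data engineering lifecycle" means in practice, the ingest, transform, and serve steps named above can be sketched in plain Python. This is a hypothetical, dependency-free illustration, not code from the project; the field names (account, amount) and function names are invented for the example.

```python
import csv
import io

# Hypothetical ingest step: parse raw CSV records (an in-memory string
# stands in for a file or object-store download).
def ingest(raw_csv: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(raw_csv)))

# Transform step: coerce types and drop malformed records.
def transform(rows: list[dict]) -> list[dict]:
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip rows with a missing or non-numeric amount
        out.append({"account": row["account"].strip(), "amount": amount})
    return out

# Serve step: aggregate into a shape downstream consumers can query.
def serve(rows: list[dict]) -> dict[str, float]:
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["account"]] = totals.get(row["account"], 0.0) + row["amount"]
    return totals

raw = "account,amount\nA-1,10.5\nA-2,bad\nA-1,4.5\n"
print(serve(transform(ingest(raw))))  # {'A-1': 15.0} -- the malformed A-2 row is dropped
```

In a production pipeline each step would typically be a separate, tested unit (e.g. a PySpark job), but the shape of the lifecycle is the same.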


Key Requirements:

• Extensive experience in the Data Engineering Lifecycle.
• Strong proficiency in Hadoop and Cloudera.
• Solid experience with AWS, Azure, or GCP, with a preference for GCP.
• Proficient in Python and PySpark, along with other languages such as Scala or Java.
• Familiar with Big Data technologies (Hadoop, HDFS, Hive, Spark, etc.).
• Knowledge of data lake formation and data warehousing principles.
• Understanding of file formats such as Parquet, ORC, and Avro.
• Experience with SQL and building data analytics.
• Proven ability to use version control systems like Git.
• Understanding of CI/CD principles.
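The "SQL and building data analytics" requirement usually comes down to aggregation queries feeding a report or dashboard layer. A minimal sketch using Python's standard-library sqlite3 module (the table name and columns are invented for illustration; in the role this would run against a warehouse engine, not SQLite):

```python
import sqlite3

# Hypothetical transaction table with a handful of rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?)",
    [("A-1", 10.5), ("A-2", 3.0), ("A-1", 4.5)],
)

# Per-account totals, largest first -- a typical analytics aggregation.
rows = conn.execute(
    "SELECT account, SUM(amount) AS total FROM txns "
    "GROUP BY account ORDER BY total DESC"
).fetchall()
print(rows)  # [('A-1', 15.0), ('A-2', 3.0)]
```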


Nice to Have:

• Experience developing near real-time event streaming pipelines using Kafka or similar tools.
• Familiarity with MLOps and maintaining ML models.
• Understanding of NoSQL databases and their trade-offs compared to SQL.
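The core of most Kafka-style streaming pipelines is a windowed aggregation over timestamped events. The broker-free toy below counts events per key in 60-second tumbling windows; it is a hypothetical sketch of the concept (the event names are invented), not a Kafka consumer, which would additionally handle offsets, partitions, and late data.

```python
from collections import defaultdict

# Events are (timestamp_sec, key) pairs, as they might arrive from a
# Kafka topic. Count events per key in fixed (tumbling) windows.
def tumbling_counts(events, window_sec=60):
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_sec) * window_sec  # window the event falls in
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(5, "login"), (20, "click"), (65, "login"), (70, "login")]
print(tumbling_counts(events))
# {0: {'login': 1, 'click': 1}, 60: {'login': 2}}
```

In Spark Structured Streaming the equivalent is a groupBy over a window expression; the toy shows only the windowing logic itself.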


Apply for this job
