#1 Job Board for the tech industry in Europe

    Data Engineer (GCP)

    39 - 45 USD/h net per hour - B2B
    Type of work: Full-time
    Experience: Senior
    Employment Type: B2B
    Operating mode: Remote

    Tech stack

      • English: B2
      • Polish: C2
      • Python: advanced
      • Airflow: regular
      • Airbyte: regular
      • GCP: regular
      • Terraform: regular
      • Kubernetes: regular
      • SQL: regular

    Job description

    Online interview

    We are a leading consulting firm that excels in automating business processes through advanced artificial intelligence solutions. We empower IT specialists to elevate their careers within a thriving B2B environment. Our diverse portfolio includes impactful projects across fintech, medtech, edutech, and industrial automation sectors. Through our strong collaboration with clients, we drive their growth by delivering swift and measurable results, while equipping IT professionals with unparalleled opportunities for career advancement and valuable experience.


    Why work with us?


    ✨ Flexibility - You choose projects tailored to your skills and interests, with the possibility of changing projects within our company or the wider Euvic group, of which we are a part.


    💡 Transparency - You get clear cooperation rules and full transparency of remuneration and conditions.


    🚀 Speed of action - Thanks to our efficiency, you will quickly find or change a project that perfectly suits your competencies.


    🌟 Development opportunity - You work on innovative projects, developing key competencies and gaining valuable experience.


    With us, you can count on cooperation at the highest level and the chance to work on ambitious projects matched to your competencies.


    Job Responsibilities:

    • Creating and maintaining unit tests in Flutter applications:
      • Utilize the flutter_test package to write unit tests, ensuring high code quality and reliability.
    • Testing state management with BLoC (Cubit):
      • Use the bloc_test package to test state-management logic within the app effectively.
      • Implement tests for the BLoC or Cubit patterns used in the project.
    • Mocking and simulating services for testing:
      • Create service mocks using the mocktail or mockito packages to simulate specific behaviors in tests.
      • Test components that interact with external services, ensuring they behave as expected in different scenarios.
    • Writing widget tests:
      • Leverage the flutter_test package to create widget tests, verifying correct UI rendering and interactions with UI components.
    • Performing integration testing:
      • Write integration tests using the built-in integration_test package, extended with the patrol package for advanced testing features.
      • Test how different application components work together under realistic user conditions.
    • Collaborating with the development team:
      • Work closely with the development team to integrate tests into the CI/CD pipeline and ensure continuous code quality.
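
    The responsibilities above name Dart tooling (flutter_test, bloc_test, mocktail), but the service-mocking pattern they describe is language-agnostic. As a hedged illustration only, here is the same idea in Python with the standard-library unittest.mock; every class and function name below is hypothetical, not part of the project:

    ```python
    from unittest.mock import Mock

    # Hypothetical service interface the component under test depends on.
    class QuoteService:
        def fetch_quote(self, user_id: str) -> dict:
            raise NotImplementedError

    # Component under test: formats a quote fetched from the service.
    def format_quote(service: QuoteService, user_id: str) -> str:
        quote = service.fetch_quote(user_id)
        return f"{quote['provider']}: {quote['monthly']} USD/month"

    # In a test, the real service is replaced with a mock that simulates
    # a specific behavior, much as mocktail/mockito stubs do in Dart.
    mock_service = Mock(spec=QuoteService)
    mock_service.fetch_quote.return_value = {"provider": "Acme", "monthly": 42}

    result = format_quote(mock_service, "user-1")
    assert result == "Acme: 42 USD/month"
    mock_service.fetch_quote.assert_called_once_with("user-1")
    ```

    The point of `spec=QuoteService` is that calling a method the real interface lacks raises an error, so the mock cannot silently drift away from the service it stands in for.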



    The Client operates within the insurance technology space and offers a streamlined online platform that simplifies the process of purchasing life insurance and related financial products. It allows users to compare quotes from multiple insurance providers, making it easier for individuals to find policies that fit their needs and budgets. The client focuses on transparency and user-friendliness, often providing educational resources to help potential customers better understand their options.


    The customer wanted to integrate insurance and call-center data (terabytes in size) from multiple warehouses while addressing tech debt.


    Goal: Establish a single source of truth for high-quality, maintainable data.


    Implemented Solution

    - Merged data from various warehouses into a unified system.

    - Optimized and automated data ingestion processes with Airflow and Airbyte.

    - Added new data sources to enhance reporting and analytics capabilities.

    - Refactored existing code and processes to improve system performance and maintainability.

    - Developed a mechanism to analyze caller sentiment using the integrated data.
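
    The offer does not say how caller sentiment is computed, so as a minimal illustrative sketch only, a lexicon-based scorer over call-center transcripts might look like this; the word lists and scoring rule are assumptions, not the client's actual method:

    ```python
    # Minimal lexicon-based sentiment scorer for call transcripts.
    # The word lists and scoring rule are illustrative assumptions only.
    POSITIVE = {"great", "helpful", "thanks", "resolved", "happy"}
    NEGATIVE = {"angry", "cancel", "complaint", "waiting", "unacceptable"}

    def caller_sentiment(transcript: str) -> float:
        """Return a score in [-1, 1]; negative means a dissatisfied caller."""
        words = [w.strip(".,!?") for w in transcript.lower().split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        total = pos + neg
        return 0.0 if total == 0 else (pos - neg) / total

    print(caller_sentiment("Thanks, the agent was very helpful"))        # 1.0
    print(caller_sentiment("I want to cancel, this is unacceptable"))    # -1.0
    ```

    In a setup like the one described, such a scoring step would typically run as a task downstream of the Airflow/Airbyte ingestion jobs, over the already unified warehouse data.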



