

    GCP Data Engineer (business trips to Saudi Arabia)

    Riyadh
    42 - 48 USD/h (net per hour, B2B)
    Type of work: Full-time
    Experience: Senior
    Employment type: B2B
    Operating mode: Hybrid

    Tech stack

      • Polish: C2
      • English: C1
      • BigQuery: advanced
      • Google Cloud: advanced
      • SQL: advanced
      • Cloud Data Fusion: regular
      • Python: regular

    Job description

    Online interview

    Hi there! If you’re looking for a high-impact position in an ambitious software house, we’ve got a match for you!


    We are looking for a Data Engineer to join us on a groundbreaking project in Western Asia that aims to build cognitive, hyperconnected cities and next-generation infrastructure from scratch. You will work with our client on pioneering the development of a sustainable, cutting-edge smart mega-city.

    Our goal is to deliver a data management platform based on GCP-managed services to produce sociodemographic statistics.


    *Candidates should be open to travelling to the client's site in Saudi Arabia. The role requires regular on-site presence, aligned with the GMT+3 time zone. In practice, on-site rotations have typically followed a flexible pattern, such as 3 weeks on-site followed by 5 weeks remote.


    Your main responsibilities for this position will be:


    • Design, build, and maintain data pipelines across the ingestion, storage, processing, and analytics layers.
    • Leverage Cloud Data Fusion, BigQuery, Cloud Storage, and Cloud Composer for ETL/ELT workflows (see the sketch after this list).
    • Implement data validation and monitoring using tools such as Great Expectations and logging/monitoring tooling.
    • Analyze and organize raw data.
    • Build data systems and pipelines.
    • Conduct complex data analysis and report on the results.
    • Prepare data for prescriptive and predictive modeling.
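
    To make the ETL/ELT workflow concrete, below is a minimal, illustrative sketch of a Cloud Composer (Airflow) DAG that loads raw files from a Cloud Storage data lake into a BigQuery staging table and then materializes an analytics table. It is only a sketch of the pattern, not project code; every project, bucket, dataset, and table name in it is a hypothetical placeholder.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    with DAG(
        dag_id="sociodemographic_ingestion",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Ingestion layer: load raw CSV files from Cloud Storage into a staging table.
        load_raw = GCSToBigQueryOperator(
            task_id="load_raw_to_staging",
            bucket="example-data-lake",           # hypothetical bucket
            source_objects=["raw/census/*.csv"],  # hypothetical path
            destination_project_dataset_table="example-project.staging.census_raw",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_TRUNCATE",
        )

        # Processing layer: transform staged rows into an analytics table with a SQL job.
        transform = BigQueryInsertJobOperator(
            task_id="transform_to_analytics",
            configuration={
                "query": {
                    "query": """
                        SELECT region, age_band, COUNT(*) AS population
                        FROM `example-project.staging.census_raw`
                        GROUP BY region, age_band
                    """,
                    "useLegacySql": False,
                    "destinationTable": {
                        "projectId": "example-project",
                        "datasetId": "analytics",
                        "tableId": "population_by_region",
                    },
                    "writeDisposition": "WRITE_TRUNCATE",
                }
            },
        )

        load_raw >> transform

    A data validation task (for example, a Great Expectations checkpoint) would typically sit between the load and transform steps, so a run fails before bad data reaches the analytics layer.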



    This offer will be a perfect match for you if you have:


    • At least 5 years of experience in related roles using Google Cloud Platform.
    • Proven experience with Google Cloud BigQuery for data warehousing, Google Cloud Storage for data lakes, and Google Cloud Spanner for processing data sets.
    • Strong skills in Python/SQL and Apache Airflow (via Cloud Composer).
    • Experience with streaming and batch ingestion methods, using GCP-native tools or custom scripts.
    • Familiarity with BigQuery optimization, partitioning, and federated queries (see the sketch after this list).
    • Experience using Cloud Data Fusion and DataHub for metadata and pipeline management.
    • Practice with data lake and data warehouse modeling.
    • Technical expertise in data models, data mining, and segmentation techniques.
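
    As a rough illustration of the BigQuery optimization point above, here is a short sketch using the google-cloud-bigquery Python client to create a date-partitioned, clustered table; the project, dataset, table, and column names are hypothetical.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project

    schema = [
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("region", "STRING"),
        bigquery.SchemaField("population", "INTEGER"),
    ]

    table = bigquery.Table("example-project.analytics.population_daily", schema=schema)
    # Partition by day so queries filtering on event_date scan only matching partitions.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_date",
    )
    # Cluster within each partition to prune storage blocks when filtering on region.
    table.clustering_fields = ["region"]
    client.create_table(table, exists_ok=True)

    # A query constrained to the partitioning column reads far less data:
    query = """
        SELECT region, SUM(population) AS total
        FROM `example-project.analytics.population_daily`
        WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'
        GROUP BY region
    """
    for row in client.query(query).result():
        print(row.region, row.total)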




    Why it is worth joining us:

    • Flexibility - working hours are flexible, just like the work mode: you can work remotely or in a hybrid model from our modern office in Warsaw.
    • Great atmosphere - we value a friendly, informal atmosphere and direct contact with everyone in the company.
    • Outstanding people - we understand that great teams are about personalities, not just skills, so our team brings together a fantastic blend of individuals, backed by management that removes roadblocks.
    • Modern technologies - we use proven, up-to-date technologies. Even if you have not used all of them yet, you can catch up with us!
    • Unlimited possibilities - you'll get the opportunity to develop your qualifications through sponsored industry meetups and conferences, as well as challenging international projects built with the latest technologies.
    • Private medical care and Multisport - we care about your health and wellbeing, so you'll get access to private medical care for you and your family, and partial funding for a sports card.