    Data Engineer

    Location: Warszawa
    Type of work: Full-time
    Experience: Mid
    Employment type: B2B
    Operating mode: Remote

    Tech stack

    • SQL: advanced
    • Python: advanced
    • AWS: regular
    • PySpark: regular
    • Terraform: regular
    • Tableau: nice to have

    Job description

    Online interview

    Job Title: Data Engineer (mid)

    Location: Remote


    About Arx 

    Arx is on a mission to catalyze the development of an equitably built world by empowering real estate professionals to instantly understand the regulatory and market forces impacting a region, drastically improving their ability to deliver housing where needed most.  

     

    Arx is building an AI-driven real estate analytics platform that automatically underwrites the future potential of millions of properties in advance, enabling builders & developers to source and evaluate optimal investment & development opportunities in seconds.


    The Role of Data in Arx 

    • Our product is fully data-driven, with processing outputs significantly influencing our clients' business operations. 
    • Your work will be impactful, with results visible through a fast feedback loop. 
    • We make the most of our data to provide actionable information and, most importantly, to support the advanced analytics and machine learning models we develop in-house.
    • We handle real and constantly evolving data, requiring robust methods for monitoring and improving its quality. 
    • This is a challenging yet rewarding process, as the data and how we use it prove to have tremendous value for our clients. 
    • Our data comes from multiple sources, comprising millions of records. Efficient extraction, transformation, and loading requires distributed processing to ensure scalability (see the sketch after this list).
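
    Below is a minimal, hypothetical PySpark sketch of the kind of distributed extract-transform-load step described above. The source names, S3 paths, and columns (parcel_id, region) are illustrative assumptions, not Arx's actual data model.

    ```python
    # Hypothetical ETL sketch: paths, schemas, and column names are assumed
    # for illustration and are not part of the job posting.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("property-etl-sketch").getOrCreate()

    # Extract: read raw records from two illustrative sources.
    permits = spark.read.parquet("s3://example-bucket/raw/permits/")
    listings = spark.read.json("s3://example-bucket/raw/listings/")

    def normalise(df):
        # Standardise the assumed join key before combining sources.
        return df.withColumn("parcel_id", F.upper(F.trim("parcel_id")))

    # Transform: combine sources and add load metadata.
    properties = (
        normalise(permits)
        .join(normalise(listings), on="parcel_id", how="left")
        .withColumn("ingested_at", F.current_timestamp())
    )

    # Load: write partitioned Parquet so downstream analytics and ML jobs scale.
    (
        properties.repartition("region")
        .write.mode("overwrite")
        .partitionBy("region")
        .parquet("s3://example-bucket/curated/properties/")
    )
    ```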


    Qualifications 

    • Minimum of 1 year of experience developing scalable data processing pipelines using PySpark in production environments. 
    • Proficient in SQL and Python programming. 
    • Skilled in ETL/ELT implementation in cloud-based environments, preferably AWS. 
    • Strong knowledge of data structures, OOP, algorithms, and performance-oriented design. 
    • Exposure to containers, microservices, distributed systems architecture, and cloud computing. 
    • Understanding of Infrastructure as Code (IaC), preferably using Terraform. 
    • Experience in exploratory data analysis and Tableau is a plus. 
    • Proficiency in English.


    Key Responsibilities 

    • Develop and maintain scalable data processing pipelines following industry standards. 
    • Understand current and new data sources and propose optimal solutions. 
    • Monitor data changes and perform root-cause analysis when necessary (a minimal example follows this list). 
    • Collaborate closely with other technical teams and the Product Manager to understand requirements. 
    • Build embedded analytics using Tableau dashboards. 
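
    As a minimal illustration of the monitoring responsibility above, here is a hypothetical PySpark data-quality check. The table path, column, and thresholds are assumptions; a production setup would track such metrics over time rather than hard-code them.

    ```python
    # Hypothetical data-quality check for a daily batch; all names and
    # thresholds are illustrative assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-monitor-sketch").getOrCreate()

    df = spark.read.parquet("s3://example-bucket/curated/properties/")

    # Two simple signals: total volume and the null rate of the assumed key column.
    total = df.count()
    null_keys = df.filter(F.col("parcel_id").isNull()).count()

    if total < 1_000_000:
        raise ValueError(f"Row count dropped unexpectedly: {total}")
    if null_keys / total > 0.01:
        raise ValueError(f"Null parcel_id rate too high: {null_keys}/{total}")

    print(f"Data-quality check passed: {total} rows, {null_keys} null keys")
    ```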

     

    Benefits

    • Salary range: $35k - $45k/year (fixed monthly USD remuneration), in addition to equity compensation commensurate with experience. 
    • Flexible vacation policy; Arx observes Polish holidays.
    • Flexible working hours and remote work environment.
    • Professional development opportunities and continuous learning support.
    • Collaborative and inclusive work environment.

     
