Sunscrapers is a technology consultancy that empowers finance and healthcare leaders to succeed by leveraging cutting-edge software, data, and AI.
We combine world-class engineering, deep industry expertise, and proprietary know-how to deliver innovative, high-impact solutions. Specializing in software engineering, DevOps, data engineering, and data science, we design and build AI-powered data platforms and web applications tailored to each client’s unique needs.
Trusted by over 60 clients across the US, UK, and beyond, we consistently maintain a 4.9/5 client satisfaction rating, with partnerships averaging five years or more.
The Project:
On this project, you’ll contribute to building a new, holistic data platform for a US-based healthcare company. Leveraging AWS services, Airflow, and dbt, you’ll develop robust data pipelines and scalable architectures that enable comprehensive insights and operational efficiency.
We’re looking for someone who is well-organized, eager to learn and adapt, and driven to tackle complex challenges. Most importantly, you’ll thrive in a collaborative, team-oriented environment!
Your responsibilities will include:
Building and orchestrating batch data pipelines for data fetching, aggregation, and modeling (a minimal illustrative sketch follows this list),
Integrating third-party systems and external data sources into the data warehouse, as well as reverse-ETL from the warehouse back into third-party systems,
Modeling datasets and schemas for consistency and easy access,
Designing and implementing data transformations optimized for application use (and potential machine learning models),
Integrating data with business applications and, in later stages of the project, machine learning model training, retraining, and deployment.
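To give a flavor of the day-to-day work, here is a minimal sketch of the kind of batch pipeline described above: an Airflow DAG that loads data from an external source into the warehouse and then triggers dbt transformations. All names, schedules, and paths are hypothetical and not taken from the actual project.

```python
# Illustrative sketch only -- DAG name, schedule, and paths are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_ingest_and_transform():
    @task
    def extract_to_warehouse():
        # Fetch data from a third-party API or database and land it in the
        # warehouse (e.g. staged in S3, then loaded into Snowflake/Redshift);
        # details omitted in this sketch.
        ...

    # Run dbt models to turn the landed data into analytics-ready tables.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    extract_to_warehouse() >> run_dbt


daily_ingest_and_transform()
```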
What's important to us?
At least 5 years of professional experience as a data engineer,
Undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar,
Excellent command of spoken and written English (at least C1),
Expertise in the AWS stack,
Experience with infrastructure-as-code tools such as Terraform,
Hands-on experience with data integration, both via direct API/database sources and via dedicated tools (Fivetran, Stitch, or similar),
Strong professional experience with Python and SQL,
Hands-on experience using and managing a data warehouse such as Snowflake or Redshift,
DevOps skills to automate deployment and streamline development,
Creative problem-solving skills.
You will score extra points for:
Hands-on experience with dbt,
Experience with AWS DMS or other data migration and CDC services,
Strong understanding of data modeling techniques such as the Kimball star schema.
What do we offer?
Working alongside a talented team that’s changing the image of Poland abroad.
Flexible working hours and remote work possibility.
Comfortable office in central Warsaw equipped with all the necessary tools to conquer the universe (MacBook, external screen, ergonomic chairs).
Fully equipped kitchen with fruit, hot and cold drinks.
Multisport card and private medical care.
Culture of good feedback: evaluation meetings, mentoring.
We value and appreciate our engineers' eagerness to learn and improve, so we strongly encourage and support their growth!