Client: pharmaceutical industry
Hourly Rate: up to 175 PLN + VAT
Work arrangement: 100% remote, full-time
For a client from the Big Pharma sector, we are looking for a Senior Data Engineer.
Right now, we are looking for two Senior Data Engineers to join our successful Haystack platform, where we model financial data for valuable reporting.
We live in Azure, with Databricks and Fabric being our main tools of the trade, but experience and capability in working with Python, AAS and SQL are highly appreciated.
You could be an expert in backend or frontend design and engineering, or maybe a hybrid, even covering Purview/PowerApps or Log Analytics? If you have strong capabilities in these areas, let’s hear from you!
The position is based in Warsaw, Poland, and at our site in Ørestad in Greater Copenhagen, Denmark. You will be working remotely in a global environment. Nevertheless, you should expect that core meetings or workshops will occasionally take place in the Danish or Polish office.
You will be part of the Global Data & Artificial Intelligence area, in the Data Engineering department of Data Management. Data Management is globally distributed, and its mission is to harness the power of Data and Artificial Intelligence, integrating it seamlessly into the fabric of the organization's operations. We provide the foundation to integrate Data and Artificial Intelligence throughout the whole organization, empowering it to realize its strategic ambition of creating a connected data landscape. We drive innovation, process definition, and platform implementation in the areas of data governance, data engineering design, master data management, metadata management, data marketplace, and data quality, aligned with business priorities.
The organisation values flexibility in ways of working to support various life situations. Employees are recognised for their unique qualities and skills, and the environment fosters development and collaboration. The broader mission includes improving the lives of millions of patients globally through innovation and dedication to chronic disease care.
There is a commitment to becoming not just the best company in the world, but the best company for the world. This vision can only be achieved through the contributions of talented employees with diverse backgrounds and perspectives. An inclusive culture is fostered that celebrates diversity across employees, patients, and communities.
Responsibilities:
Deliver dedicated engineering work as prioritized by the team's Product Manager.
Support the Denmark-based core team in developing new products/models.
Take the lead on tasks assigned by the Product Manager, aligning with stakeholders.
Plan and execute your work in Azure DevOps.
Document all work so that it is transparent to the rest of the team.
Ensure the delivery of high-quality work. Trust in data is our most valuable currency.
Collaborate with cross-functional teams to understand and address technical requirements, and implement efficiency improvements.
Requirements:
A master’s degree in Data, Computer Science, Information Management, or a related field.
5+ years of experience in software and data engineering, with relevant experience in data pipeline engineering and integration.
Strong proficiency in Databricks and/or Fabric, with a deep understanding of their core functions and tools to optimize data workflows, data modelling, and analytics.
Extensive experience with programming languages such as Python, especially for data manipulation and automation.
Proficiency in SQL for complex query development and data extraction across varied databases.
Hands-on experience with DevOps, ensuring smooth and reliable software delivery and code deployments.
Expertise in data modelling, data integration, metadata management, and data governance to establish robust and scalable data architectures.
4+ years of strong experience with cloud technologies such as Azure and AWS. Certifications are a plus.