Senior Data Engineer
Al. Jerozolimskie 158, Warszawa
Bayer Sp. z o.o.
At Bayer we’re visionaries, driven to solve the world’s toughest challenges and striving for a world where ‘Health for All, Hunger for None’ is no longer a dream, but a real possibility. We’re doing it with energy, curiosity and sheer dedication, always learning from unique perspectives of those around us, expanding our thinking, growing our capabilities and redefining ‘impossible’. There are so many reasons to join us. If you’re hungry to build a varied and meaningful career in a community of brilliant and diverse minds to make a real difference, there’s only one choice.
We are looking for a Senior Data Engineer!
As a Senior Data Engineer, you will be part of Bayer's Customer Engagement Team, with assignments varying according to project needs. You will develop and operate robust, observable, and cost-efficient data pipelines and services powering analytics and data products on our Commercial Data Platform. The focus of the first project will be replicating or reconnecting the current infrastructure (ETLs, local data warehouses, and data visualization tools) used by the countries in the EMEA region. The target scenario is a global, cloud-based data platform (AWS, Snowflake, Tableau, etc.) combining information at two levels:
Global pipelines and global data models
Local pipelines that enrich the global data models to cover the countries' business needs.
YOUR TASKS AND RESPONSIBILITIES:
Collaborate closely with data product team members, data source owners, and data consumers (e.g. data scientists, data analysts) to understand data requirements and clarify acceptance criteria.
Contribute to the design of analytical and domain data models that support the organization's data & analytics requirements now and in the future.
Design, develop, and maintain scalable data pipelines for ingesting, transforming, and storing large volumes of data from diverse sources. Build and operate data solutions on AWS and in Snowflake or Databricks, applying modern lakehouse formats (Delta/Iceberg) where appropriate.
Design and re-engineer manual data flows to enable scaling and repeatable use.
Collaborate with cross-functional teams to gather requirements and translate them into effective visual solutions.
Follow documented standards and industry best practices for data engineering, ensuring compliance with regulations and data governance policies.
Design and implement data quality and integrity processes
Embed automated data quality checks and validation gates within pipelines and CI/CD.
Optimize data processing to improve performance and cost across storage and compute.
Participate in “data engineer on duty” rotations during regular office hours to support data services and incident response
Mentor junior engineers and uphold high standards in code review, testing, and documentation.
WHO YOU ARE:
Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field.
5+ years of experience in data engineering, data operations, or a similar role, delivering production-grade data pipelines and services.
Demonstrated expertise designing, implementing, and operating robust data ingestion, transformation, and serving pipelines across batch and streaming architectures, leveraging expert knowledge in SQL and Python, and experience in orchestration via modern platforms such as Dagster or Airflow with sound workflow/dependency design.
Good hands-on experience in leveraging solutions such as dbt Core to implement modular, tested, and well-documented data and analytics models; enforcing best practices for code quality, testing standards, and documentation throughout the data engineering lifecycle.
Good experience with data quality frameworks (e.g. dbt tests, Great Expectations, PyTest), integrating comprehensive data validation and quality gates within CI/CD pipelines, and supporting incident triage and root cause analysis.
Optional: experience working with APIs such as GraphQL, OData, and REST.
Competent in using data integration and ingestion solutions supported by Python libraries (e.g. dltHub, PyAirbyte).
Awareness of modern lakehouse table formats (Apache Iceberg, Delta Lake)
Optional: experience with streaming integrations using Kafka and/or Kinesis, implementing efficient replay strategies for reliable near-real-time data delivery.
Good experience with cost and performance optimization and FinOps practices for cloud data systems such as Snowflake or Databricks, ensuring reliable operations and scalable data delivery in production environments.
Strong ownership of deliverables, high standards for code review and peer mentorship, and a commitment to clear documentation, metadata/lineage publishing, and sustainable engineering practices.
Familiarity with agile development methodologies and tooling such as Azure DevOps.
Excellent analytical and problem-solving skills.
Good communication and collaboration abilities. Ability to work independently and as part of a team.
Willingness and ability to learn and integrate new tools and technologies to enhance work efficiency and effectiveness
Do you feel you don’t meet all the criteria we are looking for? That doesn’t mean you aren’t the right fit for the role. Apply with confidence; we value potential over perfection!
WHAT WE OFFER:
A flexible, hybrid work model
Great workplace in a new modern office in Warsaw
Career development, 360° Feedback & Mentoring programme
Wide access to professional development tools, trainings, & conferences
Company Bonus & Reward Structure
Increased tax-deductible costs for authors of copyrighted works
VIP Medical Care Package (including Dental & Mental health)
Life & Travel Insurance
Pension plan
Co-financed sport card - FitProfit
Meals Subsidy in Office
Budget for Home Office Setup & Maintenance
Access to Company Game Room equipped with table tennis, soccer table, Sony PlayStation 5 and Xbox Series X consoles setup with premium game passes, and massage chairs
Tailor-made support with relocation to Warsaw when needed
Please send your CV in English
WORK LOCATION: WARSAW AL. JEROZOLIMSKIE 158
YOUR APPLICATION:
Bayer welcomes applications from all individuals, regardless of race, national origin, gender, age, physical characteristics, social origin, disability, union membership, religion, family status, pregnancy, sexual orientation, gender identity, gender expression or any unlawful criterion under applicable law. We are committed to treating all applicants fairly and avoiding discrimination.
Bayer is committed to providing access and reasonable accommodations in its application process for individuals with disabilities and encourages applicants with disabilities to request any needed accommodation(s) using the contact information below.
Bayer offers the possibility of working in a hybrid model. We know how important work-life balance is, so our employees can work from home, from the office, or combine both work environments. The use of the hybrid model is discussed with the manager in each case. Bayer respects and applies the Whistleblower Act in Poland.
Digital Hub Warsaw - here the best and most creative minds work in a diverse and inclusive environment on groundbreaking solutions that support Bayer's vision of "health for all - hunger for none." We create digital solutions that change the future.