
    Data Engineer

    Procter & Gamble, Warszawa

    Type of work: Full-time
    Experience: Mid
    Employment Type: Permanent
    Operating mode: Hybrid

    Procter & Gamble is the place where business & technology come together, and where you know you are making an impact right from #PGDAY1. Our Central Europe Technology Hub, located in Poland, is the second largest IT hub in P&G globally, with 500+ IT specialists. Our IT teams play a fundamental role in building a better understanding of consumers and shoppers while working on global category-leading brands. We collect unique insights and support better decision-making to solve different business challenges.


    Tech stack

      • R: regular
      • ETL: regular
      • Data: regular
      • Spark: regular
      • Hadoop: regular
      • Java: regular
      • SQL: regular
      • Python: regular
      • Scala: regular
      • NoSQL: regular

    Job description

    Online interview
    Friendly offer

    Are you passionate about leveraging information technology solutions to drive better business value? Are you ready to take on the challenge of managing applications at a global scale? If so, we have an exciting opportunity for you.


    We are seeking a talented Data Engineer to lead our data engineering, data governance, and data quality activities. In this role, you will play a crucial part in designing, developing, and implementing cloud-based data and analytics platforms. You will be responsible for creating and maintaining data pipelines that acquire, cleanse, transform, and publish data, ensuring it meets both functional and non-functional business requirements. These pipelines support consumer-facing applications, including brand websites and Direct-to-Consumer selling platforms for our renowned Grooming brands (Braun, Gillette, Venus, The Art of Shaving) around the world.
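    The paragraph above describes pipelines that acquire, cleanse, transform, and publish data. As a rough, hedged illustration of that flow, here is a minimal PySpark sketch; the storage paths, column names, and aggregation are hypothetical placeholders, not P&G's actual setup.

```python
# Minimal sketch of an acquire -> cleanse -> transform -> publish pipeline.
# All paths, columns, and the output location are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dtc-orders-pipeline").getOrCreate()

# Acquire: read raw Direct-to-Consumer order events from a landing zone.
raw = spark.read.json("s3://landing/dtc/orders/*.json")

# Cleanse: drop malformed rows and deduplicate on the order identifier.
clean = (
    raw.dropna(subset=["order_id", "order_ts", "brand"])
       .dropDuplicates(["order_id"])
)

# Transform: derive the order date and aggregate daily revenue per brand.
daily_revenue = (
    clean.withColumn("order_date", F.to_date("order_ts"))
         .groupBy("brand", "order_date")
         .agg(F.sum("net_amount").alias("net_revenue"),
              F.countDistinct("order_id").alias("orders"))
)

# Publish: write a partitioned, analytics-ready table for downstream consumers.
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://curated/dtc/daily_brand_revenue/"))
```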


    Responsibilities:

    • Understand the needs of Business/Operations teams
    • Design, develop, and implement data and analytics platforms (DAP) and pipelines
    • Create and optimize data models to support business intelligence and analytics needs
    • Develop and manage Extract, Transform, Load (ETL) processes to ensure data integrity and quality (see the data-quality sketch after this list)
    • Administer and optimize databases, ensuring efficient storage and retrieval of data
    • Integrate data from various sources, including APIs, databases, and third-party services
    • Monitor and optimize data processing performance to ensure timely data availability
    • Work closely with data scientists, analysts, data asset managers, architects, and stakeholders to understand data requirements and provide fit-for-use solutions
    • Utilize coding standards and best practices for efficient and reusable services and components
    • Implement and optimize data engineering best practices, including query optimization, version control, and code reviews
    • Utilize end-user visualization tools like Power BI for data presentation
    • Implement data governance policies to ensure data security, compliance, and quality standards
    • Maintain comprehensive documentation for data architecture, processes, and workflows
    • Build strong relationships and work collaboratively within multidisciplinary teams
    • Stay updated on emerging technologies and tools to enhance data engineering practices and infrastructure
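    To make the data integrity and quality point in the ETL bullet more concrete, the sketch below shows simple row-level validation in plain Python. The rules, column names, and failure threshold are invented for illustration and would in practice follow the team's own data governance policies.

```python
# Hypothetical row-level data-quality checks run before publishing a batch.
# Rule names, columns, and the 1% failure threshold are illustrative only.
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class QualityRule:
    name: str
    predicate: Callable[[dict], bool]  # returns True when a row passes


def run_checks(rows: Iterable[dict], rules: list[QualityRule],
               max_failure_rate: float = 0.01) -> dict:
    """Count failures per rule and flag rules that exceed the allowed failure rate."""
    rows = list(rows)
    report = {}
    for rule in rules:
        failures = sum(1 for row in rows if not rule.predicate(row))
        rate = failures / len(rows) if rows else 0.0
        report[rule.name] = {"failures": failures, "rate": rate,
                             "passed": rate <= max_failure_rate}
    return report


rules = [
    QualityRule("order_id_present", lambda r: bool(r.get("order_id"))),
    QualityRule("non_negative_amount", lambda r: r.get("net_amount", 0) >= 0),
]

batch = [
    {"order_id": "A-1", "net_amount": 19.99},
    {"order_id": None, "net_amount": 5.00},  # fails the first rule
]
print(run_checks(batch, rules))
```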


    Qualifications/Experiences:

    • Bachelor's degree in Computer Science, Information Systems, or a related field; a Master's degree is a plus
    • Proficiency in programming languages such as Python, Java, R, or Scala
    • Strong understanding of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra); see the short example after this list
    • Experience with ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica)
    • Knowledge of big data technologies (e.g., Hadoop, Spark, Kafka)
    • Experience with cloud services (e.g., Google Cloud Platform, Azure, AWS) for data storage and processing
    • Hands-on knowledge of Power BI and GitHub Actions
    • Understanding of data modeling techniques and best practices
    • Familiarity with data engineering best practices, including query optimization and version control
    • Knowledge of modern application development frameworks and tools
    • Experience with end-user visualization tools like Power BI
    • Strong problem-solving abilities and a collaborative mindset
    • Excellent communication skills and the ability to work effectively in diverse, multidisciplinary teams
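    As a small illustration of the SQL/NoSQL requirement above, the sketch below performs the same lookup against PostgreSQL and MongoDB. The drivers shown (psycopg2, pymongo) are common choices rather than tools named in the offer, and the connection strings, table, and collection names are made up.

```python
# Illustrative only: one lookup against a SQL store and a NoSQL store.
# Connection details, table, and collection names are hypothetical.
import psycopg2                # PostgreSQL driver (assumed available)
from pymongo import MongoClient

# SQL: parameterized query against a hypothetical "orders" table.
pg_conn = psycopg2.connect("dbname=shop user=analyst password=secret host=localhost")
with pg_conn, pg_conn.cursor() as cur:
    cur.execute("SELECT order_id, net_amount FROM orders WHERE brand = %s", ("Braun",))
    sql_rows = cur.fetchall()

# NoSQL: equivalent filter against a hypothetical MongoDB collection.
mongo = MongoClient("mongodb://localhost:27017")
nosql_docs = list(
    mongo["shop"]["orders"].find({"brand": "Braun"},
                                 {"_id": 0, "order_id": 1, "net_amount": 1})
)

print(len(sql_rows), len(nosql_docs))
```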


    We offer

    • P&G-sized projects and access to world leading IT partners and technologies from Day 1
    • Wide range of self-development possibilities (training and certification paths)
    • Competitive starting salary and benefits program (private health care, P&G stock, saving plans, sport cards) 
    • Regular salary increases and possible promotions - in line with your results and performance
    • Opportunity to change role every few years to be in the best place for you and best for P&G 


    Watch this video to learn more about our full recruiting process: https://www.youtube.com/watch?v=0bicvbpy0gI

    Kindly be advised that at P&G, employment is exclusively extended on the basis of an "Umowa o Pracę" (Full-time Employment Contract). Apply only if you agree to these conditions.

    Salary: Undisclosed

    Check similar offers

    PL/SQL Developer (life insurance sector)
    1dea, 4.4K - 6.6K USD, Warszawa, fully remote
    PL/SQL, Oracle, Oracle APEX

    Mid Data Engineer
    CLOUDFIDE, 3.28K - 5.75K USD, Warszawa, fully remote
    Microsoft Azure Cloud, Databricks, Data Lake

    Data Scientist (Gen AI + Python)
    IN Team, 5.47K - 7.82K USD, Warszawa, fully remote
    AI, Python, SQL

    Data Engineer
    Experis Manpower Group, 4.69K - 6.6K USD, Warszawa, fully remote
    GCP, PySpark, Spark clusters

    Mid T-SQL Developer
    Acaisoft, 3.42K - 4.15K USD, Warszawa, fully remote
    T-SQL