Data Engineer
PwC Polska

Warszawa
Type of work: Undetermined
Experience: Mid
Employment type: Permanent, B2B
Operating mode: Office
At PwC, innovative and technology projects are part of our everyday work. As a consulting firm, we have the advantage of working with clients from a wide range of industries around the world. In Poland we employ over 6,000 people, including 2,000 in #Tech teams.


Tech stack

    Hadoop: regular
    MS Azure: regular
    GCP/AWS: regular
    Python: regular
    Scala: regular
    Git: regular

Job description

Online interview
PwC is a powerful network of over 250,000 people across 158 countries, all committed to delivering quality in Assurance, Tax, Advisory & Technology services. Match your curiosity with continuous opportunities to learn, grow and make an impact. Be who you are and be a game changer.

Data Engineer

We are looking for candidates who are eager to work in the field of data engineering in a Big Data environment. Your main role would be to join a long-term project for PwC US, developing and maintaining a data lake of consumer data that spans over 50 data sources and has a total volume of approximately 40 TB. Additionally, you will have the chance to join local initiatives and projects focused on data engineering or data analytics.

Our team:
In PwC Data Analytics we aim to deliver end-to-end solutions, collaborating closely with Business Experts from other PwC teams (Strategy & Operations, Financial Services, Digital Transformation, etc.), Developers and IT professionals. Our team employs specialists in Machine Learning, Big Data solutions and architecture, statistics and its applications, and Deep Learning, with a variety of backgrounds (computer science, mathematics, physics, engineering and economics) and levels of seniority (from juniors with 1-2 years of experience to country leaders with more than 10 years on the market).

Responsibilities:
  • Gathering requirements regarding data products and building a backlog of features;
  • Designing, implementing and monitoring cloud-based ETLs;
  • Regular communication with stakeholders from PwC US;
  • Evaluation of new data vendors;
  • Supporting local initiatives as an SME;
  • Writing documentation.

Candidate’s profile:
  • At least 3 years of professional experience;
  • Working experience with Hadoop ecosystem (Spark, Hive, YARN) or MS Azure (Blob storage, Databricks) or GCP/AWS alternatives;
  • Communication skills;
  • English at a minimum B2 level, written and spoken;
  • Good knowledge of Python or Scala;
  • Practical knowledge of version control with Git.

Additional assets will be:
  • Experience with processing large volumes of data;
  • Familiarity with Linux and bash scripting;
  • Fair knowledge of algorithms and data structures.

We offer:
  • Working in an international team;
  • Flexible working hours and the option to work from home;
  • A broad offer of technical trainings and conferences;
  • Subsidized language courses;
  • Regular team building initiatives, including hackathons, parties and away-days;
  • Dynamic, project-driven work environment;
  • Excellent working conditions and friendly working atmosphere;
  • Attractive compensation with additional benefits package.