    Senior Data Engineer (Python, Databricks)

    Type of work: Full-time
    Experience: Senior
    Employment Type: B2B
    Operating mode: Remote
    Capco Poland

    Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry by leading the implementation, automation, and innovation of IT solutions for international clients.


    Tech stack

    • Databricks: advanced
    • SQL: advanced
    • AWS: regular

    Job description


    CAPCO POLAND 

    *We are looking for Poland-based candidates. The job is remote.


    Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, are critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table, so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to give you the best possible platform to succeed, and we are happy to accommodate any reasonable adjustments you may require. You will find a section to let us know about these at the bottom of your application form, or you can mention it directly to your recruiter at any stage and they will be happy to help.


    Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.

    We are also experts focused on development, automation, innovation, and long-term projects in financial services. At Capco, you can code, write, create, and live at your maximum capabilities without getting dull, tired, or foggy.

     

    THINGS YOU WILL DO

    • Design, develop, and implement robust data architecture solutions using modern data platforms such as Databricks.
    • Ensure scalable, reliable, and secure data environments that meet business requirements and support advanced analytics.
    • Lead the migration of data from traditional RDBMS systems to Databricks environments.
    • Architect and design scalable data pipelines and infrastructure to support the organization's data needs.
    • Develop and manage ETL processes using Databricks to ensure efficient data extraction, transformation, and loading (see the brief sketch after this list).
    • Optimize ETL workflows to enhance performance and maintain data integrity.
    • Ensure seamless data transition with minimal disruption to ongoing operations.
    • Monitor and optimize performance of data systems to ensure reliability, scalability, and cost-effectiveness within Databricks environments.
    • Collaborate with cross-functional teams, including data engineers, data scientists, analysts, and product managers, to understand data requirements and deliver solutions.
    • Define best practices and standards for data engineering/data warehouse processes and ensure adherence to them.
    • Evaluate and implement new technologies and tools to improve the efficiency and effectiveness of data pipelines.
    • Provide technical leadership and mentorship to junior members of the data engineering team.
    • Work closely with DevOps and infrastructure teams to deploy and manage data systems in production environments.
    • Ensure that all data solutions comply with organizational policies, industry standards, and regulatory requirements.
    • Collaborate with enterprise architects and IT leadership to ensure alignment of data architecture with overall IT architecture and strategies.
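
    As an illustration of the ETL responsibilities above, below is a minimal PySpark sketch of the kind of Databricks job described: it reads from a legacy RDBMS over JDBC, applies a light transformation, and writes to a Delta table. The connection details, table names, and columns are hypothetical placeholders, not part of any actual Capco codebase.

    ```python
    # Minimal illustrative sketch only: connection details, table names, and
    # columns are hypothetical placeholders, not an actual Capco pipeline.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # on Databricks, a session already exists

    # Extract: read a batch of rows from a legacy RDBMS over JDBC
    transactions = (
        spark.read.format("jdbc")
        .option("url", "jdbc:sqlserver://legacy-db.example.com;databaseName=core")  # placeholder
        .option("dbtable", "dbo.transactions")                                       # placeholder
        .option("user", "etl_user")
        .option("password", "<secret>")
        .load()
    )

    # Transform: light cleansing and enrichment
    cleaned = (
        transactions.dropDuplicates(["transaction_id"])
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .withColumn("ingested_at", F.current_timestamp())
    )

    # Load: append to a Delta table for downstream analytics
    cleaned.write.format("delta").mode("append").saveAsTable("analytics.transactions_bronze")
    ```

    In practice, credentials would typically be pulled from a Databricks secret scope rather than hard-coded, and the job would be scheduled and monitored as part of a managed workflow.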

     

    TECH STACK: Python, Databricks, SQL, AWS/Azure/GCP, Docker, Kubernetes

     

    SKILLS & EXPERIENCES YOU NEED TO GET THE JOB DONE

    • Extensive experience with Databricks, including ETL processes and data migration.
    • Experience with additional cloud platforms like AWS, Azure, or GCP.
    • Knowledge of big data technologies and frameworks.
    • Strong knowledge of data warehousing concepts, data modeling, and SQL.
    • Experience with RDBMS systems and transitioning data to modern cloud platforms.
    • Proficiency in programming languages such as Python, SQL, and scripting languages.
    • Knowledge of data governance frameworks, data quality management practices, and data security principles.
    • Proficiency in database technologies, including relational databases (e.g., SQL Server), and in-depth knowledge of data modeling and database design.
    • Familiarity with containerization technologies such as Docker and orchestration tools like Kubernetes.
    • Excellent problem-solving and analytical skills, plus strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
    • Bachelor's or Master's degree in Computer Science or a related field.


    NICE TO HAVE

    • Certifications in Databricks


    WHY JOIN CAPCO?

    • Employment contract and/or B2B (business-to-business) contract, whichever you prefer
    • Possibility to work remotely
    • Speaking English on a daily basis, mainly with foreign stakeholders and peers
    • Multiple employee benefit packages (MyBenefit Cafeteria, private medical care, life insurance)
    • Access to a platform with 3,000+ business courses (Udemy)
    • Access to required IT equipment
    • Paid Referral Program
    • Participation in charity events e.g. Szlachetna Paczka
    • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
    • Being part of the core squad focused on the growth of the Polish business unit
    • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
    • A work culture focused on innovation and creating lasting value for our clients and employees

     

    ONLINE RECRUITMENT PROCESS STEPS*

    • Screening call with the Recruiter
    • Technical interview
    • Client Interview
    • Feedback/Offer
