We are looking for a Senior PySpark Developer. You will lead the design and implementation of large-scale data processing systems, providing strategic insights that drive business success.
Hybrid model, with 2-3 days per week working from the client's office in Cracow or Wrocław.
Duties
- Design scalable and resilient data architectures on Azure, leveraging services like Azure Data Lake Storage, Azure SQL Database, and Azure Data Factory
- Evaluate and implement new tools and technologies to enhance data processing capabilities
- Lead the development and optimization of complex data pipelines using PySpark and Azure Databricks (see the illustrative sketch after this list)
- Provide technical guidance and mentorship to junior developers, ensuring best practices and high-quality code
- Identify and drive opportunities for process improvement and automation
- Stay abreast of industry trends and emerging technologies to ensure the organization remains competitive in big data and cloud solutions
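To give a flavour of the day-to-day pipeline work, below is a minimal, illustrative PySpark sketch of the kind of transformation such a pipeline might contain. All storage paths, table names, and column names are hypothetical placeholders, not part of the actual project.

```python
# Minimal, illustrative PySpark pipeline sketch. Paths, columns, and names
# below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_orders_aggregation").getOrCreate()

# Read raw order events from a (hypothetical) landing zone in Azure Data Lake Storage
orders = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Basic cleansing and enrichment: drop malformed rows, derive an order date
cleaned = (
    orders
    .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
    .withColumn("order_date", F.to_date("order_timestamp"))
)

# Aggregate daily revenue and order counts per customer
daily_revenue = (
    cleaned
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("order_id").alias("order_count"),
    )
)

# Write the curated result back to the lake, partitioned by date
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/daily_revenue/")
)
```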
Requirements
- At least 3 years of experience in big data processing and cloud technologies
- Advanced proficiency in PySpark, Python, and SQL
- Extensive experience with Azure cloud services, including Azure Databricks, Data Lake, and Azure Data Factory
- Proven experience in designing and implementing large-scale data architectures
- Strong leadership and mentoring abilities with a track record of leading successful data projects
Offer
- We gather like-minded people:
  - Engineering community of industry professionals
  - Chance to work abroad for up to 60 days annually
  - Relocation within our 50+ offices
- We provide growth opportunities:
  - Outstanding career roadmap
  - Leadership development, career advising, soft skills, and well-being programs
  - Certification (GCP, Azure, AWS)
  - Unlimited access to LinkedIn Learning, Get Abstract, O’Reilly, Cloud Guru
  - Language classes in English and Polish for foreigners