Databricks Engineer
Konstruktorska 11, Warszawa
The Codest
🌍 Hello World!
We are The Codest, an international tech software company with tech hubs in Poland, delivering global IT solutions and projects. Our core values lie in a “Customers and People First” approach that prioritises the needs of our customers and a collaborative environment for our employees, enabling us to deliver exceptional products and services.
Our expertise centers on web development, cloud engineering, DevOps and quality assurance. After many years of developing our own product, Yieldbird, honored as a laureate of the prestigious Top25 Deloitte awards, we arrived at our mission: to help tech companies build impactful products and scale their IT teams by boosting IT delivery performance. Through our extensive experience with product development challenges, we have become experts in building digital products and scaling IT teams.
But our journey does not end here - we want to continue our growth. If you’re goal-driven and looking for new opportunities, join our team! What awaits you is an enriching and collaborative environment that fosters your growth at every step.
We are currently looking for:
DATABRICKS DATA ENGINEER
📈 Your Responsibilities:
Here, you will have the opportunity to contribute to a banking app for one of the leading financial groups in Japan. The platform includes banking modules and data management features and is also customer-facing. We are seeking an experienced Databricks Engineer to design, build, and manage scalable data solutions and pipelines using Databricks. You’ll work closely with cross-functional teams to ensure data is reliable, accessible, and efficient, powering analytics and business intelligence initiatives.
Design medallion-architecture (Bronze, Silver, Gold) lakehouses with optimized performance patterns
Build strong data quality frameworks with automated testing and monitoring
Implement advanced Delta Lake features such as time travel, vacuum operations, and Z-ordering (see the brief sketch after this list)
Develop and maintain complex ETL/ELT pipelines processing large-scale datasets daily
Design and implement CI/CD workflows for data pipelines using Databricks Asset Bundles or equivalent tools
Create real-time and batch data processing solutions with Structured Streaming and Delta Live Tables
Optimize Spark jobs for cost efficiency and performance, leveraging cluster auto-scaling and resource management
Develop custom integrations with Databricks APIs and external systems
Design scalable data architectures using Unity Catalog, Delta Lake, and Apache Spark
Establish data mesh architectures with governance and lineage tracking
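Purely as an illustration of the medallion and Delta Lake work described above, here is a minimal PySpark sketch. The table names (bronze_events, silver_events), columns, and retention window are hypothetical assumptions for the example, not details of the actual project.

```python
# Minimal sketch: Bronze -> Silver promotion plus Delta Lake maintenance.
# All table names, columns, and settings below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

# Bronze -> Silver: clean raw events and write them to a curated Delta table
bronze = spark.read.table("bronze_events")
silver = (
    bronze
    .filter(F.col("event_id").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .dropDuplicates(["event_id"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver_events")

# Delta Lake maintenance: Z-order for faster point lookups, then vacuum old files
spark.sql("OPTIMIZE silver_events ZORDER BY (event_id)")
spark.sql("VACUUM silver_events RETAIN 168 HOURS")

# Time travel: query an earlier version of the table for audits or debugging
previous = spark.sql("SELECT * FROM silver_events VERSION AS OF 0")
```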
🔑 Key Requirements:
10+ years of experience in data engineering, with a strong track record of designing and deploying production-grade data pipelines and ETL/ELT workflows
Databricks Certified Professional Data Engineer (or equivalent certification)
Proficiency in Python, PySpark, and SQL
Extensive hands-on experience with AWS services such as Glue, Lambda, Redshift, and S3
Solid background in data governance and management platforms such as Unity Catalog and AWS SageMaker Unified Studio
Proven experience in data migration initiatives and modernizing legacy systems
Communicative English language skills (you will be working in an international team)
➕ Nice to have:
Excellent communication and teamwork skills
Strong analytical thinking and problem-solving ability
Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
📜 Our Promise (what you can expect from us):
24,000-30,000 PLN on a B2B contract
100% remote work (but we have offices in Krakow and Warsaw and we’re happy to meet there from time to time 😉)
300 PLN to use on our benefits platform, Worksmile - gift cards, medical services, sports, etc.
Our B2B contract contains provisions that allow you to benefit from IP Box tax relief
Integration events, education opportunities and much more…
A unique opportunity to take your career to the next level - we’re looking for people who want to create an impact. You have ideas, we want to hear them!
📌 Recruitment process:
30-minute online screening call with our recruiter
45-minute to 1-hour technical call with one of our engineers
1-hour call with the team leader
Offer
Questions or insights? Feel free to reach out to our recruiting team.
In the meantime, feel free to visit our website where you can find key facts about us.