Senior AI/Python Developer
Sienna 39, Warszawa
Directio
Directio is a global IT services company. We consult, code, test, deploy, and manage mainly cloud-based and mobile applications, providing around-the-clock support from our offices in Poland, the Philippines, Mexico, and the USA. We prepare our FMCG, retail, automotive, and SaaS clients for the future by accelerating their digital transformation. Operating under the “We Code Success” principle, we prioritize the success of our clients, consultants, and partners.
About the project:
We are looking for an AI/Python Developer for our Swedish client, who will participate in developing a platform for collecting employee and customer feedback. Our client helps other companies make informed, data-driven decisions.
Responsibilities:
As an AI/Python Developer, you will be designing, developing, and deploying smart, scalable, and production-ready AI-powered services using Python, ensuring optimal performance, maintainability, and adaptability to evolving business needs;
You will be leveraging cutting-edge AI tools such as Claude, Cursor, GitHub Copilot, and other emerging technologies to accelerate development workflows, enhance code quality, and drive innovation;
You will be architecting and optimizing hybrid retrieval systems to support LLM-based agents, large-scale knowledge bases, and enterprise data lake exploration, focusing on speed, relevance, and scalability;
You will be collaborating closely with the core product development team—working in a tech stack that includes React and Node.js to seamlessly integrate AI capabilities into end-user experiences;
You will be leading new AI initiatives from concept to deployment—rapidly prototyping solutions, conducting feasibility testing, refining designs, and driving continuous improvements in alignment with product and business objectives;
You will be utilizing Azure cloud infrastructure to deploy, monitor, and manage AI workloads, applying best practices for scalability, cost-efficiency, and security;
You will be staying ahead of the rapidly evolving LLM/AI ecosystem, continuously evaluating new frameworks, models, and approaches to keep the organization at the forefront of AI adoption;
You will be developing advanced AI agents using LangGraph, MCP, gRPC, and a combination of proprietary and open-source models, ensuring robust performance in complex environments;
You will be building scalable and resilient data services using Kafka, RabbitMQ, and a variety of ELT platforms to support high-throughput, real-time data processing;
You will be implementing CI/CD workflows with GitHub Actions, Azure services, ArgoCD, and Argo Workflows, working closely with the SRE team to ensure smooth deployments and operational stability.
Requirements:
5+ years of professional experience with Python, demonstrating deep expertise in designing, building, and optimizing backend services, automation scripts, and AI-driven applications;
Proven experience working with Generative AI tools such as Claude, Cursor, GitHub Copilot, or similar, with the ability to integrate them effectively into real-world solutions;
Strong background in modern data retrieval architectures, including hands-on experience with vector databases, embedding stores, and hybrid search systems;
Expertise in indexing, querying, and retrieving data from large-scale data lakes, with an emphasis on optimizing relevance and efficiency for LLMs and AI agents;
Solid understanding of APIs, microservices, and distributed backend architecture for building scalable, fault-tolerant systems;
Hands-on experience deploying and managing solutions in Azure or equivalent cloud platforms;
Excellent English communication skills, both written and verbal, with the ability to clearly convey complex concepts to technical and non-technical audiences;
Collaborative and team-oriented mindset, with a focus on collective success and a willingness to share knowledge.
Nice to have:
Knowledge of NLP technologies such as Word2Vec, CBOW, LSTMs, RNNs, NLTK, and related AI/ML frameworks;
Experience with data processing frameworks like Apache Flink, Apache Spark, and Superset, as well as ELT/ETL tools such as Airflow or Meltano;
Strong working knowledge of PostgreSQL, including query optimization and advanced data modeling techniques.
We offer:
Salary of 25 000 - 28 000 PLN + VAT on a B2B contract;
Professional training;
Flexible working conditions;
Private healthcare and Multisport card.
Please be advised that we will contact only selected candidates.