At AICONIC, we believe that radical transparency and algorithmic decision making are the future. We’re passionate about building software that solves problems. We partner with some of the most important companies and institutions in the world to transform how they use data and technology. If you're seeking a career where you can truly make a difference in the lives of others, working at the forefront of technology with top minds in the field, you'll find it here. We work with great clients such as Pfizer, Johnson & Johnson, UEFA, and many others.
This job requires a high level of task ownership: there is no ticket queue to work through. The role is broad and includes communicating with the Product Owner on requirements, mapping requirements to tasks (including prioritization and feasibility assessment), developing the solution, and liaising with all members of the development team (who may or may not have AI experience).
Forward Deployed AI Engineers work directly with customers, owning Gen AI strategy and implementation. On a daily basis, you will build end-to-end workflows, take them to production, and solve real-world problems at the largest scale. You will have ample opportunity to contribute learnings from the field back to the AICONIC A1 Platform.
You will be at the forefront of extending AICONIC's existing footprint and strategy into new markets and problem spaces opened up by Gen AI.
- The ideal candidate is an experienced individual who enjoys optimizing data systems and building them from the ground up.
- You will support our software developers, database architects, data analysts, and data scientists on data initiatives, and ensure that an optimal data delivery architecture is applied consistently across ongoing projects.
- The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
- Design and build data pipelines to support data science projects following software engineering best practices
- Use state-of-the-art technologies to acquire, ingest, and transform big datasets
- Create and manage data environments in the cloud or on-premises
- Ensure information security standards are maintained at all times
- Create AI-driven automations for internal purposes
- Engineering mindset, focused on delivering production solutions with Gen AI, data processing pipelines, and advanced analytics tools.
- A focus on solving real business problems, not academic benchmarks.
- Ability to collaborate efficiently in teams of technical and non-technical individuals.
- Comfort working in a dynamic environment with evolving objectives and direct iteration with users.
- Work from anywhere (home, office, etc.)
- 15,000 - 30,000 PLN/month + VAT on a B2B contract
- 26 days of paid holidays + an equivalent of public holidays in Poland
- Additional benefits tailored to your needs (we do not follow a one-size-fits-all approach)
- Startup atmosphere: fewer rules, less documentation, and openness to experimenting and learning
- Never alone on a project - always at least two people working together
- Very good place for people who are passionate about data
- Python - at least 4 years of production experience
- Ability to develop APIs in FastAPI.
- Use of best practices: testing, code formatters, typing, Pydantic
- Experience using LLMs in production
- Experience with LangChain, LangGraph, and RAG
- Experience with LangSmith or another tool for monitoring the performance of LLM-based systems, including independently creating evaluation pipelines; ability to train other team members
- Experience in building RAG-based systems, understanding of vector database applications
- Experience in all phases of software development: from prototyping solutions in the R&D phase to creating a production solution
- Basic knowledge of Streamlit (or other frontend library that allows rapid prototyping)
- Knowledge of CI/CD processes
- Practical knowledge of Docker/ Kubernetes
- Experience with cloud technologies, especially services that let you host containers (in particular Amazon ECS, Azure Container Instances, Google Cloud Run)
- Experience with Infrastructure as Code (Terraform)
- Basic understanding of the typical ML pipeline
- Fluent written and verbal communication in English
- At least a bachelor's degree in STEM, preferably computer science
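To give candidates a feel for the RAG requirement above, here is a minimal, dependency-free sketch of the retrieval step. The hand-written vectors and document texts are invented for illustration; a real system would use an embedding model and a vector database instead:

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def retrieve(query_vec: list[float], store: list[dict], top_k: int = 2) -> list[str]:
    """Return the texts of the top_k documents most similar to the query vector."""
    ranked = sorted(
        store,
        key=lambda doc: cosine_similarity(query_vec, doc["vec"]),
        reverse=True,
    )
    return [doc["text"] for doc in ranked[:top_k]]


# Toy in-memory "vector store" -- real systems store embeddings in a vector DB.
store = [
    {"text": "invoice processing", "vec": [1.0, 0.0, 0.0]},
    {"text": "holiday policy", "vec": [0.0, 1.0, 0.0]},
    {"text": "billing and invoices", "vec": [0.9, 0.1, 0.0]},
]

print(retrieve([1.0, 0.0, 0.0], store))
# -> ['invoice processing', 'billing and invoices']
```

In a production RAG pipeline the retrieved texts would then be passed to an LLM as context; the evaluation pipelines mentioned above would score both this retrieval step and the final generated answers.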
- Home assignment
- Screening call
- Tech Interview