DevOps Engineer / DataOps Engineer
City center, Vienna
Otterly AI
Otterly.ai is a fast-moving startup founded with a mission to shape the future of digital marketing. We're pioneers in Generative Engine Optimization (GEO), helping brands boost visibility across AI search engines like ChatGPT, Google AI Overviews, Perplexity.AI and others. As the first AI search monitoring platform, we support marketing teams in navigating this new AI-driven search era.
About the Role:
We are looking for a Data Acquisition + DevOps Engineer to manage and optimize our critical data crawling infrastructure that powers our AI search monitoring platform. This role is essential to our operations, combining reactive incident response with proactive system improvements. You will be responsible for the reliability and quality of data acquisition from various AI search engines, in a dynamic environment where quick problem-solving is key.
Extra: We value engineers who use AI tools like GitHub Copilot or Cursor to boost their productivity :)
Your responsibilities in detail:
1. Data Acquisition & Crawling
Monitor and react to issues in crawling systems (incident response, error queues); see the sketch after this list
Maintain high data quality: check links, content completeness, and formats
Integrate new AI search engines (design and build crawlers)
Build scalable crawler architecture that adapts to source changes
Investigate failures and prevent them from happening again
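As a rough illustration of the error-queue side of this work, here is a minimal TypeScript sketch of a consumer that drains a crawl-error queue and retries transient failures. The queue URL, message shape, and the retryCrawl() helper are assumptions made for this example, not Otterly.ai's actual setup.

```ts
// Hypothetical error-queue consumer for failed crawl jobs.
import {
  SQSClient,
  ReceiveMessageCommand,
  DeleteMessageCommand,
} from "@aws-sdk/client-sqs";

const sqs = new SQSClient({});
const ERROR_QUEUE_URL = process.env.CRAWL_ERROR_QUEUE_URL!; // assumed env var

interface FailedCrawlJob {
  url: string;
  engine: string;   // e.g. "perplexity", "google-ai-overviews"
  attempts: number;
}

async function retryCrawl(job: FailedCrawlJob): Promise<void> {
  // Placeholder: re-enqueue or re-run the crawl for this source.
}

export async function drainErrorQueue(): Promise<void> {
  // Long-poll the error queue for a batch of failed jobs.
  const { Messages = [] } = await sqs.send(
    new ReceiveMessageCommand({
      QueueUrl: ERROR_QUEUE_URL,
      MaxNumberOfMessages: 10,
      WaitTimeSeconds: 20,
    })
  );

  for (const message of Messages) {
    const job: FailedCrawlJob = JSON.parse(message.Body ?? "{}");
    if (job.attempts < 3) {
      await retryCrawl(job); // likely transient failure: retry
    } else {
      console.error("Giving up on crawl job", job); // surface for investigation
    }
    await sqs.send(
      new DeleteMessageCommand({
        QueueUrl: ERROR_QUEUE_URL,
        ReceiptHandle: message.ReceiptHandle,
      })
    );
  }
}
```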
2. DevOps & AWS Infrastructure
Design and deploy AWS infrastructure (CloudFormation, CDK); a CDK sketch follows this list
Build serverless applications with AWS Lambda
Configure SQS, DynamoDB, API Gateway, CloudFront, Route53
Optimize AWS costs and system performance
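For the infrastructure side, a minimal AWS CDK sketch (TypeScript) of the kind of serverless, event-driven setup described above might look like the following. All resource names, runtimes, and settings are illustrative assumptions rather than the actual stack.

```ts
// Hypothetical CDK stack wiring an SQS queue, a Lambda worker, and a DynamoDB table.
import { Stack, StackProps, Duration } from "aws-cdk-lib";
import { Construct } from "constructs";
import * as lambda from "aws-cdk-lib/aws-lambda";
import * as sqs from "aws-cdk-lib/aws-sqs";
import * as dynamodb from "aws-cdk-lib/aws-dynamodb";
import { SqsEventSource } from "aws-cdk-lib/aws-lambda-event-sources";

export class CrawlerStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Queue of crawl jobs to process.
    const crawlQueue = new sqs.Queue(this, "CrawlQueue", {
      visibilityTimeout: Duration.minutes(5),
    });

    // Table holding crawl results, keyed by source URL and crawl timestamp.
    const results = new dynamodb.Table(this, "CrawlResults", {
      partitionKey: { name: "sourceUrl", type: dynamodb.AttributeType.STRING },
      sortKey: { name: "crawledAt", type: dynamodb.AttributeType.STRING },
      billingMode: dynamodb.BillingMode.PAY_PER_REQUEST,
    });

    // Lambda worker triggered by messages on the queue.
    const worker = new lambda.Function(this, "CrawlWorker", {
      runtime: lambda.Runtime.NODEJS_20_X,
      handler: "index.handler",
      code: lambda.Code.fromAsset("dist/crawl-worker"), // assumed build output path
      timeout: Duration.minutes(2),
      environment: { RESULTS_TABLE: results.tableName },
    });

    worker.addEventSource(new SqsEventSource(crawlQueue, { batchSize: 5 }));
    results.grantWriteData(worker);
  }
}
```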
3. CI/CD & Automation
Create and maintain CI/CD pipelines (GitHub Actions)
Use Infrastructure as Code (IaC)
Build automated tests for crawlers and data validation (see the test sketch after this list)
Set up monitoring and alerts (CloudWatch and other AWS tools)
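As an example of the testing side, a small data-validation test that could run inside a CI pipeline (e.g. under `node --test`) might look like this. The CrawlResult shape and the validation thresholds are assumptions for illustration only.

```ts
// Hypothetical data-quality check for crawl output, runnable with `node --test`.
import { test } from "node:test";
import assert from "node:assert/strict";

interface CrawlResult {
  sourceUrl: string;
  links: string[];
  content: string;
  format: "html" | "markdown" | "json";
}

// Returns a list of problems found; an empty list means the result looks complete.
function validateResult(result: CrawlResult): string[] {
  const problems: string[] = [];
  if (!/^https?:\/\//.test(result.sourceUrl)) problems.push("bad sourceUrl");
  if (result.links.some((l) => !/^https?:\/\//.test(l))) problems.push("broken link format");
  if (result.content.trim().length < 100) problems.push("content too short");
  return problems;
}

test("crawl result passes completeness and format checks", () => {
  const sample: CrawlResult = {
    sourceUrl: "https://example.com/answer",
    links: ["https://example.com/source-1"],
    content: "x".repeat(200),
    format: "markdown",
  };
  assert.deepEqual(validateResult(sample), []);
});
```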
Main requirements:
Min. 4 years of experience in data engineering or DevOps roles
2+ years working with large-scale crawling systems
Strong skills in Node.js and TypeScript (backend)
Experience with AWS (Lambda, DynamoDB, SQS, CloudFormation, CDK)
Knowledge of serverless and event-driven architecture
Experience with CI/CD (GitHub Actions or similar)
Nice to have:
Knowledge of anti-bot detection and crawling JavaScript-rendered content
Experience with API Gateway, CloudFront, Route53
Database optimization for large datasets
Understanding of compliance and ethical crawling practices
What Otterly.ai offers:
Remote Work: Enjoy a fully remote position within the EU timezone, giving you flexibility and work-life balance.
Salary Range: We offer a salary of 70,000-95,000 EUR/year.
Professional Growth: You'll have plenty of opportunities for continuous learning and career advancement.
Innovative Environment: Work on impactful projects with a team that truly values creativity and innovation.
Supportive Culture: Join a collaborative and inclusive workplace where your contributions are valued.
Recruitment Process:
Video Call: If your CV aligns with our requirements, HR invites you for an initial video call.
Task: The next step involves a task for you to complete.
Final Interview & Task Discussion: In the final stage, you'll meet key founders. This will be an opportunity to discuss your task and have a broader conversation.
Offer: If all goes well, we'll extend an offer, and you can start your journey with Otterly.ai.