SadServers

AI Troubleshooting Scenarios


AI

Scenarios related to AI
#   Name                            Level   Time   Type
1   "London": Ollama LLM troubles   Hard    20 m   Fix

Scenario: "London": Ollama LLM troubles

Level: Hard

Type: Fix

Access: Email

Description: An AI agent has been deployed to production as a container called ai-agent managed by the Docker Compose configuration /home/admin/app/docker-compose.yaml. This ai-agent container relies on an Ollama LLM backend to generate a report but hasn't generated any yet. Your mission is to restore the broken agent-to-LLM (Ollama) connectivity, and tune the agent configuration so it can produce a report in /home/admin/app/agent/report.json. Example of the expected output:

{
  "summary": "Nginx is failing to reach its upstream service",
  "root_causes": [
    {
      "service": "nginx",
      "error": "connection refused to upstream 127.0.0.1:9999",
      "severity": "high"
    }
  ],
  "recommended_actions": "Fix upstream port configuration"
}
Note: The system consists of a group of dummy nginx containers that generate logs and send them to a central rsyslog container. The logs are then shared on a volume with the ai-agent container, where the agent picks them up and passes them, together with a prompt, to the LLM server so it can produce the desired answer in the expected JSON format. You don't need to troubleshoot any container other than ai-agent (the agent service within Docker Compose).
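The actual fix depends on what is broken on the server, but agent-to-backend connectivity in Compose is typically declared in the compose file itself. Below is a minimal sketch of such wiring; the service names, build path, and use of the OLLAMA_HOST variable are illustrative assumptions, not the scenario's real configuration. Only port 11434 (Ollama's default API port) is standard.

```yaml
# Hypothetical docker-compose.yaml excerpt wiring an agent to an Ollama backend.
# Names and variables here are assumptions for illustration only.
services:
  ollama:
    image: ollama/ollama
  agent:
    build: ./agent
    environment:
      # Ollama clients commonly read OLLAMA_HOST to locate the API endpoint.
      OLLAMA_HOST: http://ollama:11434
    depends_on:
      - ollama
```

Compose places services on a shared default network where they resolve each other by service name, so a first diagnostic step is checking whether the agent can reach that hostname and port at all (for example, `docker compose exec agent curl http://ollama:11434/api/tags`, if the backend service is indeed named ollama).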

Test: The command docker compose up -d agent, run from the directory /home/admin/app, must create the report file /home/admin/app/agent/report.json. The format of the answer must be as specified in the description.

The "Check My Solution" button runs the script /home/admin/agent/check.sh, which you can see and execute.
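Before pressing "Check My Solution", the report's shape can be sanity-checked by hand. The sketch below illustrates the kind of structural checks involved; it is not the contents of the actual check.sh (which you can read on the server), and it writes a sample report so it runs anywhere.

```shell
#!/bin/sh
# Illustrative validation of the expected report format.
REPORT=report.json   # on the server: /home/admin/app/agent/report.json

# Sample report standing in for the agent's output, so this script is
# self-contained; on the server you would skip this step.
cat > "$REPORT" <<'EOF'
{
  "summary": "Nginx is failing to reach its upstream service",
  "root_causes": [
    {"service": "nginx", "error": "connection refused to upstream 127.0.0.1:9999", "severity": "high"}
  ],
  "recommended_actions": "Fix upstream port configuration"
}
EOF

# Verify the file is valid JSON and has the three required top-level keys.
python3 - "$REPORT" <<'EOF'
import json, sys

with open(sys.argv[1]) as f:
    report = json.load(f)

for key in ("summary", "root_causes", "recommended_actions"):
    assert key in report, f"missing key: {key}"
assert isinstance(report["root_causes"], list), "root_causes must be a list"
print("report format OK")
EOF
```

A report that fails these checks (invalid JSON, or missing one of the three keys from the example above) would not match the specified format regardless of its content.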

Time to Solve: 20 minutes.

Updated: 2026-04-14 14:55 UTC – 1010d4d