"London": Ollama LLM troubles
Scenario: "London": Ollama LLM troubles
Level: Hard
Type: Fix
Tags: ai
Access: Email
Description: An AI agent has been deployed to production as a container named ai-agent, managed by the Docker Compose configuration /home/admin/app/docker-compose.yaml. The ai-agent container relies on an Ollama LLM backend to generate a report, but it has not produced one yet. Your mission is to restore the broken agent-to-LLM (Ollama) connectivity and tune the agent configuration so that it produces a report at /home/admin/app/agent/report.json. Example of the expected output:
{
  "summary": "Nginx is failing to reach its upstream service",
  "root_causes": [
    {
      "service": "nginx",
      "error": "connection refused to upstream 127.0.0.1:9999",
      "severity": "high"
    }
  ],
  "recommended_actions": "Fix upstream port configuration"
}
Note: The system consists of a group of dummy nginx containers that generate logs and send them to a central rsyslog container. The logs are then shared via a volume with the ai-agent container; the agent picks up the logs and passes them, together with a prompt, to the LLM server so that it can produce the desired answer in the expected JSON format. You do not need to troubleshoot any container other than ai-agent (the agent service within Docker Compose).
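Not part of the original challenge text — a minimal triage sketch for the connectivity side, assuming the Ollama default port 11434 and a hypothetical service name "ollama" (check docker-compose.yaml for the real service name, endpoint, and port):

```shell
cd /home/admin/app

# List the services and their state; a missing or unhealthy LLM service is a first clue
docker compose ps

# Render the effective configuration to see which endpoint the agent is pointed at
docker compose config

# Read the agent's logs for connection errors to the LLM backend
docker compose logs agent

# Probe the Ollama API from inside the agent's network namespace, if curl is
# available in the image ("ollama" and 11434 are assumptions, not given values)
docker compose exec agent curl -s http://ollama:11434/api/tags
```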
Root (sudo) Access: True
Test: Running docker compose up -d agent from the directory /home/admin/app must create the report file /home/admin/app/agent/report.json. The report must match the format specified in the description.
The "Check My Solution" button runs the script /home/admin/agent/check.sh, which you can see and execute.
Time to Solve: 20 minutes.
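As a rough self-check before pressing "Check My Solution" (this is a sketch, not the official check.sh), the JSON shape from the example output can be validated with a small hypothetical helper; the key names come directly from the example above:

```shell
# Hypothetical helper: validates that a report file has the keys shown in the
# example output (summary, root_causes with service/error/severity, recommended_actions)
check_report() {
  python3 - "$1" <<'EOF'
import json, sys

with open(sys.argv[1]) as f:
    report = json.load(f)

# Top-level keys required by the example output
assert set(report) >= {"summary", "root_causes", "recommended_actions"}
assert isinstance(report["root_causes"], list)

# Each root cause entry must name a service, an error, and a severity
for cause in report["root_causes"]:
    assert set(cause) >= {"service", "error", "severity"}

print("report format OK")
EOF
}

# Usage: check_report /home/admin/app/agent/report.json
```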