What if your observability stack could actually investigate your production issues? In this demo, I built a fully local AI agent that:

• Runs inside Docker
• Uses a local LLM
• Calls tools dynamically
• Queries Prometheus metrics
• Detects P95 latency spikes
• Suggests possible root causes

No cloud APIs. No OpenAI. No external calls. Just local AI + real telemetry. This is what AI-powered observability looks like when it’s practical.

If you’re into DevOps, SRE, AI agents, or local LLM workflows — this is for you.

#ai #devops #observability #llm #docker #prometheus #techshorts #shorts
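For the curious: the P95 spike check can be sketched in a few lines. The PromQL query, metric name, and spike threshold below are illustrative assumptions, not the exact ones from the demo:

```python
# Sketch of a P95 latency spike check (assumed metric name and threshold).

# PromQL the agent might send to Prometheus: P95 of request latency
# over a 5-minute window, computed from a histogram metric.
P95_QUERY = (
    "histogram_quantile(0.95, "
    "sum(rate(http_request_duration_seconds_bucket[5m])) by (le))"
)

def is_spike(p95_seconds: float, baseline_seconds: float, factor: float = 2.0) -> bool:
    """Flag a spike when the current P95 exceeds the baseline by `factor`."""
    return p95_seconds > baseline_seconds * factor

# Example: baseline P95 of 0.5s, current P95 of 1.2s -> spike
print(is_spike(1.2, 0.5))  # True
print(is_spike(0.6, 0.5))  # False
```

In the demo, a result like `True` is what triggers the agent to call its root-cause tools.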