# llm-runner

A simple recipe for running LocalAI and AnythingLLM together with Docker Compose.

## How to run

1. Clone this repository.
2. Configure your setup in `docker-compose.yml`.
3. Bring up the containers, e.g. with `docker compose up -d`.

## Base UI

- http://localhost:8080 - LocalAI UI
- http://localhost:3001 - AnythingLLM UI
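The exact contents of `docker-compose.yml` depend on your setup. A minimal sketch is shown below, assuming the publicly available `localai/localai` and `mintplexlabs/anythingllm` images and the default ports 8080 and 3001; the image tags, volume paths, and provider wiring are assumptions, so adjust them to match your environment.

```yaml
services:
  localai:
    image: localai/localai:latest            # assumed tag; pick one matching your hardware (CPU/GPU)
    ports:
      - "8080:8080"                          # LocalAI UI/API at http://localhost:8080
    volumes:
      - ./models:/build/models               # assumed container model path; check the LocalAI image docs

  anythingllm:
    image: mintplexlabs/anythingllm:latest   # assumed tag
    ports:
      - "3001:3001"                          # AnythingLLM UI at http://localhost:3001
    volumes:
      - ./anythingllm-storage:/app/server/storage  # assumed persistent storage path
    depends_on:
      - localai
    # Point AnythingLLM at LocalAI (reachable inside the compose network as
    # http://localai:8080/v1) via the AnythingLLM UI or its environment settings.
```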