-
@jasonpnnl I saw your pull request #444, where you make use of usage/stats. I'm interested in how you extract this info from your messages. The only way I found, using Ollama models locally, is to extract them using an … Are there better ways to obtain them?
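For context, here is a minimal sketch of pulling token counters out of an Ollama response. The field names (`prompt_eval_count`, `eval_count`, `eval_duration` in nanoseconds) follow Ollama's documented `/api/chat` response, which reports them in the final (`done: true`) chunk; the sample payload and the `extract_usage` helper are illustrative, not taken from the PR.

```python
def extract_usage(final_chunk: dict) -> dict:
    """Pull token counts and generation speed from Ollama's final chunk.

    Ollama reports durations in nanoseconds; token counts are integers.
    Missing fields default to 0 so partial responses don't raise.
    """
    eval_count = final_chunk.get("eval_count", 0)
    eval_duration_ns = final_chunk.get("eval_duration", 0)
    usage = {
        "prompt_tokens": final_chunk.get("prompt_eval_count", 0),
        "completion_tokens": eval_count,
    }
    usage["total_tokens"] = usage["prompt_tokens"] + usage["completion_tokens"]
    if eval_duration_ns:
        # Convert nanoseconds to seconds before dividing.
        usage["tokens_per_second"] = eval_count / (eval_duration_ns / 1e9)
    return usage


# Illustrative final chunk, shaped like Ollama's documented response fields.
sample = {
    "done": True,
    "prompt_eval_count": 26,
    "eval_count": 298,
    "eval_duration": 4_709_213_000,  # nanoseconds
}
print(extract_usage(sample))
```

When streaming, only the last chunk carries these fields, so a pipe would watch for `done: true` before calling a helper like this.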
-
Can Pipelines, using Ollama locally, return the messages' stats in the metadata or another field of the body, like the models handled directly by Open WebUI do (see image below)? Or is there a way to tell Open WebUI to compute them? Thanks ☺️