Experimental Agents broken with custom endpoints #4303
Replies: 5 comments
-
I also didn't see this when I first made the issue, but someone on the Discord is experiencing it too, so at least it's not an isolated incident caused by my messed-up config.
-
I can confirm that I had the issue. I use a custom endpoint with LiteLLM as the backend.
-
This is expected, since this functionality is not currently supported with custom endpoints. I have not advertised this feature for use with them. That said, you're welcome to try it out and report issues, and I may start a dedicated discussion for this once I make a proper announcement for users to test it.
-
I also had this issue. Here is a more relevant screenshot, from a run against the dev endpoint. Note that it called my endpoint successfully; it just throws this error at the end.
-
I made PR #4324 as a fix.
-
What happened?
Note: this has only been tested with OpenRouter and may not apply to other custom endpoints; unfortunately I do not have the resources to try anything else.
When choosing a custom endpoint in the agent creation menu and then choosing a model (any model, from what I found), exiting the menu leads to this error.
I have no concrete theory as to what is causing this, but I assume something on the frontend is passing a broken input to the `chatCompletion` function in `api/server/controllers/agents/client.js`, since that is where the error appears in the logs. Again, this theory is mostly baseless; I unfortunately do not have the time to dig deeper, as I would rather rewatch The Orville right now than file another issue.
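For illustration only, here is a minimal sketch (not LibreChat's actual code) of the kind of guard that would make this failure mode visible: validating the options object coming from the frontend before handing it to the completion call, so the request fails fast with a readable message instead of a stack trace deep inside the client. The function name and the option fields (`endpoint`, `endpointType`, `model`, `baseURL`) are assumptions made up for the example.

```js
// Hypothetical sketch: check the payload built by the frontend before
// calling anything like chatCompletion. Field names are assumptions.
function validateAgentOptions(opts) {
  const errors = [];
  if (!opts || typeof opts !== 'object') {
    errors.push('options object is missing');
    return errors;
  }
  if (!opts.endpoint) {
    errors.push('endpoint is undefined (expected e.g. a custom endpoint name)');
  }
  if (!opts.model) {
    errors.push('model is undefined');
  }
  if (opts.endpointType === 'custom' && !opts.baseURL) {
    errors.push('baseURL is missing for a custom endpoint');
  }
  return errors;
}

// Example usage: a payload resembling what the agent creation menu might send.
const payloadFromFrontend = { endpoint: 'OpenRouter', endpointType: 'custom', model: undefined };
const problems = validateAgentOptions(payloadFromFrontend);
if (problems.length > 0) {
  console.error('Refusing to call chatCompletion:', problems.join('; '));
}
```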
Steps to Reproduce
1. Open the agent creation menu and choose a custom endpoint (tested with OpenRouter).
2. Choose any model.
3. Exit the menu; the error appears.
What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
No response
Code of Conduct