- Which LLM do you want to add support for? To add a new LLM, you'll need to add a new provider package here: https://github.com/run-llama/LlamaIndexTS/tree/main/packages/providers
-
Describe the bug
Trying to implement a CustomLLM with a different apiKey and baseUrl.
Below is an example from the Python documentation.
https://docs.llamaindex.ai/en/stable/api_reference/llms/custom_llm/
To Reproduce
Not applicable; this is a feature request.
Expected behavior
An option to use a custom LLM, as in the Python version.
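For illustration, here is a minimal TypeScript sketch of what such a custom LLM might look like, mirroring the Python `CustomLLM` pattern. The class name, `metadata`/`complete` members, and the `/v1/completions` endpoint are assumptions for this sketch, not the actual LlamaIndexTS interface; a real provider would need to implement whatever base class LlamaIndexTS defines in its providers packages.

```typescript
// Hypothetical shapes, loosely mirroring the Python CustomLLM example;
// not the actual LlamaIndexTS API.
interface LLMMetadata {
  model: string;
  contextWindow: number;
}

interface CompletionResponse {
  text: string;
}

class CustomLLM {
  constructor(
    private apiKey: string,
    private baseUrl: string,
    private model: string = "my-model",
  ) {}

  get metadata(): LLMMetadata {
    return { model: this.model, contextWindow: 4096 };
  }

  // Build the HTTP request so apiKey and baseUrl are configurable per instance.
  buildRequest(prompt: string): {
    url: string;
    headers: Record<string, string>;
    body: string;
  } {
    return {
      url: `${this.baseUrl}/v1/completions`, // assumed endpoint shape
      headers: {
        Authorization: `Bearer ${this.apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model: this.model, prompt }),
    };
  }

  async complete(prompt: string): Promise<CompletionResponse> {
    const req = this.buildRequest(prompt);
    const res = await fetch(req.url, {
      method: "POST",
      headers: req.headers,
      body: req.body,
    });
    const data: any = await res.json();
    // Response parsing assumes an OpenAI-style completions payload.
    return { text: data.choices?.[0]?.text ?? "" };
  }
}
```

An instance could then be constructed with any endpoint, e.g. `new CustomLLM("sk-...", "https://my-llm.example.com")`, which is the per-instance `apiKey`/`baseUrl` configurability the issue asks for.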