
Commit

sestinj committed Mar 17, 2024
2 parents d85d68a + 590648f commit 8d0e734
Showing 2 changed files with 2 additions and 1 deletion.
1 change: 1 addition & 0 deletions docs/docs/model-setup/select-provider.md
@@ -17,6 +17,7 @@ You can run a model on your local computer using:
- [Ollama](../reference/Model%20Providers/ollama.md)
- [LM Studio](../reference/Model%20Providers/lmstudio.md)
- [Llama.cpp](../reference/Model%20Providers/llamacpp.md)
- [KoboldCpp](../reference/Model%20Providers/openai.md) (OpenAI compatible server)
- [llamafile](../reference/Model%20Providers/llamafile) (OpenAI compatible server)
- [LocalAI](../reference/Model%20Providers/openai.md) (OpenAI compatible server)
- [Text generation web UI](../reference/Model%20Providers/openai.md) (OpenAI compatible server)
2 changes: 1 addition & 1 deletion docs/docs/reference/Model Providers/openai.md
@@ -24,7 +24,7 @@ The easiest way to find this information is from the chat playground in the Azur
### OpenAI compatible servers / APIs

OpenAI compatible servers

- [KoboldCpp](https://github.com/lostruins/koboldcpp)
- [text-gen-webui](https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai#setup--installation)
- [FastChat](https://github.com/lm-sys/FastChat/blob/main/docs/openai_api.md)
- [LocalAI](https://localai.io/basics/getting_started/)
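For context, any of the OpenAI-compatible servers listed in this diff (KoboldCpp, text-gen-webui, FastChat, LocalAI) can be queried with a standard OpenAI client by pointing it at the server's local base URL. Below is a minimal sketch, not part of the commit, using the `openai` Python package; the port, base URL, and model name are placeholders that depend on how the local server was launched (KoboldCpp commonly serves on port 5001).

```python
# Minimal sketch (assumed setup, not from the commit): querying a local
# OpenAI-compatible server with the `openai` Python package. The base URL,
# port, and model name below are placeholders -- adjust them to match how
# your server (e.g. KoboldCpp, text-gen-webui, FastChat, LocalAI) was started.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:5001/v1",  # local OpenAI-compatible endpoint
    api_key="not-needed-for-local",       # most local servers ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; many local servers ignore or remap this
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```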
