Releases: ParisNeo/lollms-webui
v9.0
Codebase fully migrated to FastAPI
New bindings. Faster than ever.
Better binding installation parameters with pinned versions.
Created new install methods for Hugging Face, ExLlamaV2, and llama-cpp-python.
Added conda support so that more complex dependencies can be installed directly from lollms.
New multi-tool paradigm to solve library version conflicts and incompatibilities between tools.
Added ollama client and server
Added vllm client and server
Added petals client and server
Full compatibility with the OpenAI API, allowing you to use any OpenAI-compatible client application with lollms. For example, you can use the Google Gemini binding and have lollms route it to AutoGen or any other OpenAI-compatible tool: just configure that tool to use the lollms server as its endpoint.
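As a hedged sketch of what "OpenAI-compatible" means in practice: a client builds a standard chat-completion payload and posts it to the lollms server instead of api.openai.com. The host, port, and route below are assumptions (check your lollms server settings); only the payload construction is shown here, no request is sent.

```python
import json

# Assumed local lollms endpoint exposing an OpenAI-style /v1 route — adjust
# host/port to match your own lollms server configuration.
LOLLMS_BASE_URL = "http://localhost:9600/v1"

def build_chat_request(prompt: str, model: str = "lollms") -> dict:
    """Build an OpenAI-style chat-completion payload (no network call here)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = build_chat_request("Hello, lollms!")
print(json.dumps(payload))
```

Any client that emits this payload shape (the official OpenAI SDKs, AutoGen, etc.) should work once its base URL points at the lollms server.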
New lollms generation interface that lets you build your own apps on top of raw generation or persona-augmented generation through lollms.
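Building on the generation interface, a minimal stdlib-only sketch of how an app might call it over HTTP. The route name, payload fields, and port are hypothetical placeholders, not the documented lollms API; the request is prepared but deliberately not sent.

```python
import json
import urllib.request

def make_generate_request(prompt: str, host: str = "http://localhost:9600"):
    """Prepare (but do not send) a raw-generation POST request.

    The "/lollms_generate" path and the payload keys are assumptions for
    illustration — consult the lollms-webui documentation for the real API.
    """
    body = json.dumps({"prompt": prompt, "n_predict": 128}).encode("utf-8")
    return urllib.request.Request(
        f"{host}/lollms_generate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = make_generate_request("Tell me a story")
print(req.full_url)
```

Sending the prepared request with `urllib.request.urlopen(req)` would return the generated text, assuming a lollms server is listening on that address.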
An Unreal Engine plugin will be released to give life to your lollms characters
v8.5
LoLLMS v8.0
- Changed the name
- Added more interactive options
- Added loads of personas
- Added many new Bindings
- Added many new models
V7.0
v6.7
v6.5.0
v6.5 RC2
v6.5 RC1
A release candidate for lollms v6.5.
On Windows, use the installer.
On other platforms, use the scripts from the scripts folder in the repository.
https://github.com/ParisNeo/lollms-webui/tree/main/scripts
v6.3
V6.0(beta)
- New personalities structure
- New localization engine
- Incorporated the Playground into the main app
- Multiple enhancements in personalities