
Releases: ParisNeo/lollms-webui

v9.0

27 Jan 21:10

Code fully moved to FastAPI

New bindings. Faster than ever.
Better binding installation parameters with pinned versions.

Created a new install method for the Hugging Face, ExLlamaV2, and python llama cpp bindings.
Added a conda layer so that more complex dependencies can be installed directly from lollms.

New multi-tool paradigm to solve library version conflicts and incompatibilities between bindings.
Added Ollama client and server
Added vLLM client and server
Added Petals client and server

Full compatibility with the OpenAI API, allowing you to use any OpenAI-compatible client application with lollms. This means, for example, that you can use the Google Gemini binding and have lollms route it to AutoGen or any other OpenAI-compatible tool; just configure that tool to use the lollms server instead.
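For instance, an OpenAI-compatible client can simply be pointed at the local lollms server. A minimal sketch using the official `openai` Python package, where the base URL, port, and model name are assumptions to adjust to your own setup:

```python
# Minimal sketch: point an OpenAI-compatible client at a local lollms server.
# The base_url, port, api_key handling, and model name are assumptions; adapt them to your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:9600/v1",   # hypothetical lollms OpenAI-compatible endpoint
    api_key="not-needed-locally",          # placeholder; a local server typically ignores the key
)

response = client.chat.completions.create(
    model="my-local-model",                # whatever model/binding lollms is configured to serve
    messages=[{"role": "user", "content": "Hello from an OpenAI-compatible client!"}],
)
print(response.choices[0].message.content)
```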

New lollms generation interface that allows you to build your own apps using raw generation or persona-augmented generation through lollms.
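A raw-generation call could look like the sketch below; the endpoint path and payload fields are assumptions and should be checked against the lollms server documentation:

```python
# Hypothetical sketch of a raw-generation request to a local lollms server.
# The endpoint path and JSON field names are assumptions, not the documented API.
import requests

payload = {
    "prompt": "Write a haiku about local LLMs.",
    "n_predict": 128,   # assumed name for the max-tokens parameter
}
resp = requests.post("http://localhost:9600/lollms_generate", json=payload, timeout=120)
resp.raise_for_status()
print(resp.text)
```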

An Unreal Engine plugin will be released to bring your lollms characters to life.

v8.5

04 Jan 03:04

Moved to Python 3.11
New hardware management system
Upgraded bindings
New services: Stable Diffusion (SD) and XTTS
LoRA support in the Hugging Face and python_llama_cpp bindings

LoLLMS v8.0

16 Dec 15:09
  • Changed the name
  • Added more interactive options
  • Added loads of personas
  • Added many new bindings
  • Added many new models

V7.0

15 Nov 20:24
  • Enhanced personality functionalities
  • Reorganized structure
  • AWQ support
  • Multimodality support
  • Many new personas
  • Interactive view

v6.7

15 Oct 23:46
  • Added long term memory
  • Fixed some bugs
  • ExLlamaV2 support
  • OpenAI support
  • First Extensions implementation (still in beta)

Note:

Homebrew is required on macOS

v6.5.0

21 Sep 21:40

The first stable release of 6.5.
Petals integration for distributed text generation.
A special Windows WSL version of lollms for Windows users.
For Linux/macOS, just follow the instructions in the project's README.md.

v6.5 RC2

17 Sep 00:37

Release candidate 2
Added leaderboard link to the zoo
Added Aristotle to philosophers
Multiple bugfixes
GPT4All supports Vulkan for AMD users

Added lollms with Petals to enable decentralized text generation on Windows over WSL.

v6.5 RC1

15 Sep 00:31

A release candidate for lollms v6.5.
For Windows, use the installer.
For other platforms, use the scripts from the scripts folder in the repository:
https://github.com/ParisNeo/lollms-webui/tree/main/scripts

v6.3

27 Aug 20:08
  • New UI engine
  • Added the lollms playground presets system
  • Enhanced prompting
  • Better integration with Stable Diffusion
  • Enhanced speed for bindings

V6.0 (beta)

19 Aug 00:34
  • New personalities structure
  • New localization engine
  • Incorporated the Playground into the main app
  • Multiple enhancements in personalities