A modern desktop chat application built with Qt 6 for interacting with Large Language Models through Ollama.
- 🎨 Modern Material Design UI
- 💬 Chat interface with message history
- 🔄 Real-time streaming responses (illustrated below)
- 🌙 Dark mode
- ⚡ High-performance C++ backend
- ⚙️ Customizable settings
- 🔍 Full-text search capabilities
- 🖥️ Cross-platform support (Windows, Linux)
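To illustrate the streaming feature: the app talks to Ollama's HTTP API, whose `/api/chat` endpoint emits newline-delimited JSON chunks as tokens are generated. The sketch below shows that raw stream with `curl`; the model name is only an example, and this illustrates the underlying API rather than the app's actual code path.

```bash
# Watch Ollama stream a response token by token (model name is an example)
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{ "role": "user", "content": "Why is the sky blue?" }],
  "stream": true
}'
# Each output line is a JSON chunk, e.g.:
# {"message":{"role":"assistant","content":"The"},"done":false}
```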
To build and run the app you will need:
- Qt 6.8.0 or higher
- CMake 3.27 or higher
- C++17 compatible compiler
- Ollama server running locally (quick-start commands below)
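If Ollama isn't running yet, it can be started and seeded with a model from the command line (the model below is only an example; any model Ollama serves will do):

```bash
ollama serve          # starts the server on http://localhost:11434 by default
ollama pull llama3.2  # download a model to chat with (example model)
ollama list           # confirm the model is available
```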
- Clone the repository:

  ```bash
  git clone https://github.com/FaZeRs/llm-chat.git
  cd llm-chat
  ```

- Build the project:

  ```bash
  cmake -E make_directory build && cd build
  cmake ..
  cmake --build .
  ```

- Run the tests:

  ```bash
  ctest --output-on-failure
  ```

- Install the application:

  ```bash
  cmake --install .
  ```
The project includes a complete development environment using Dev Containers. To use it:
- Install Docker and VS Code with the Dev Containers extension
- Open the project in VS Code
- Click "Reopen in Container" when prompted
The container includes all necessary development tools (example invocations follow the list):
- GCC 14 / Clang 18
- CMake
- Qt 6.8.0
- Code analysis tools (clang-tidy, cppcheck)
- Formatting tools (clang-format, cmake-format)
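As a sketch of how those tools might be run by hand inside the container (the exact targets and configuration live in the repository, so treat these invocations as illustrative):

```bash
# Check formatting without modifying files (picks up the repo's .clang-format)
find src \( -name '*.cpp' -o -name '*.h' \) -print0 \
  | xargs -0 clang-format --dry-run --Werror

# Run static analysis over the source tree
cppcheck --enable=warning,style src/
```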
The repository is laid out as follows:

- `src/` - Source code
  - `core/` - Core application functionality
  - `chat/` - Chat backend and models
  - `quick/` - Qt Quick UI components
  - `qml/` - QML UI files
- `tests/` - Unit tests
- `cmake/` - CMake modules and configuration
- `.github/` - CI/CD workflows
Contributions are welcome:
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request (see the example flow below)
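For reference, that flow typically looks like the following (the fork URL and branch name are placeholders):

```bash
git clone https://github.com/<your-username>/llm-chat.git
cd llm-chat
git checkout -b feature/my-change
# ...make and test your changes, then:
git commit -am "Describe your change"
git push origin feature/my-change
# finally, open a Pull Request on GitHub against FaZeRs/llm-chat
```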
This project is licensed under the MIT License - see the LICENSE file for details.
Built with and inspired by:
- Qt Framework
- Ollama
- Catch2 Testing Framework
- CPM.cmake
- Material Design
- Font Awesome Icons
Nauris Linde - @FaZeRs
Project Link: https://github.com/FaZeRs/llm-chat