CI fix: Add streaming response support to Conversation class
This change adds support for streaming responses to the `Conversation` class. Previously, the `add_model_message` method only handled non-streaming responses; now it can handle streaming responses as well.

Key changes:

1. Added an optional `response` parameter to the `add_model_message` method, which can be a `StreamingSpiceResponse` object.
2. When a `StreamingSpiceResponse` is provided, the method reads the characters-per-second and cost information from the current streaming response instead of from `parsed_llm_response.llm_response` (see the sketch below).
3. Added `Other/Proprietary License` to the accepted licenses in `license_check.py`, since this is a valid license type used by the project's dependencies.

These changes let the `Conversation` class properly handle and display streaming responses from the language model, providing a better user experience.
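The streaming change (items 1 and 2) might look roughly like the following minimal sketch. The `current_response()` accessor and the `characters_per_second` and `cost` fields are assumptions about the spice API, and the message bookkeeping is simplified; the real method in this repository may differ.

```python
from typing import Optional

# Assumed import: the spice library exposes this name, but the exact
# module path should be verified against the version in use.
from spice import StreamingSpiceResponse


class Conversation:
    """Sketch of the relevant part of the Conversation class."""

    def __init__(self) -> None:
        self.messages: list[str] = []

    def add_model_message(
        self,
        message: str,
        parsed_llm_response,  # assumed to expose an llm_response attribute
        response: Optional[StreamingSpiceResponse] = None,  # new parameter
    ) -> None:
        if response is not None:
            # Streaming case: read stats from the in-progress response.
            # current_response() is an assumed accessor on the stream object.
            llm_response = response.current_response()
        else:
            # Non-streaming case: fall back to the finished response.
            llm_response = parsed_llm_response.llm_response

        # characters_per_second and cost are assumed response fields.
        self.messages.append(
            f"{message} "
            f"(speed: {llm_response.characters_per_second}, "
            f"cost: {llm_response.cost})"
        )
```

The `license_check.py` change (item 3) amounts to extending an allowlist of acceptable license strings. A minimal illustration, where `ACCEPTED_LICENSES` and `check_license` are hypothetical names:

```python
# Hypothetical excerpt of license_check.py; the real allowlist and
# surrounding logic in this repository may be structured differently.
ACCEPTED_LICENSES = {
    "MIT License",
    "Apache Software License",
    "BSD License",
    "Other/Proprietary License",  # added: valid license used by a dependency
}


def check_license(package_name: str, license_name: str) -> None:
    # Fail the check if a dependency reports a license outside the allowlist.
    if license_name not in ACCEPTED_LICENSES:
        raise ValueError(
            f"{package_name} has unapproved license: {license_name}"
        )
```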