Llama Stack Version: 0.1.3
OS: macOS
Python: 3.10
Deployment: Podman
Reproducible: Yes, occurs every time a search query is made.
Information
The official example scripts
My own modified scripts
🐛 Describe the bug
Issue Summary
Even when Tavily Search (tavily-search) is explicitly configured as the primary web search provider in AgentConfig, the inference response still displays "brave_search.call()" in the output.
Key Observation (so far):
If the Tavily API key is removed, the system throws an error, confirming that Tavily Search is actually being used in the backend.
Despite this, the inference output still returns "brave_search.call(...)", suggesting that "brave_search" is hardcoded somewhere in the response formatting.
Error logs
For the official example: brave_search appears in the 2.2 websearch agent example of the getting_started doc, even though the Tavily Search API key is supplied at the very beginning.
The same happens with my custom [websearch code](https://github.com/anyuzoey/llama_stack_starter/blob/main/tool_websearchtest.py).
If I comment out line 27 of my script, which sets the provider API key (provider_data = {"tavily_search_api_key": tavily_search_api_key}), the server reports a missing API key, confirming Tavily is the active provider. (Please ignore the intermediate debugging output starting with 🔍🛠️ Warning.)
If I provide the API key, the inference output reports that the brave_search tool was called.
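For reference, here is a minimal sketch of the configuration involved, written as plain dicts for illustration (in the real script these values are passed to LlamaStackClient and AgentConfig; the model name and instructions below are placeholders, not copied from my script):

```python
# Sketch of the relevant configuration, as plain dicts for illustration.
# The Tavily key is supplied via provider_data -- removing it triggers the
# "missing API key" error, which shows Tavily is the active search backend.
import json

provider_data = {"tavily_search_api_key": "tvly-..."}  # placeholder key

agent_config = {
    "model": "meta-llama/Llama-3.2-3B-Instruct",       # placeholder model
    "instructions": "You are a helpful assistant.",
    # The agent only asks for the builtin websearch tool group; which
    # provider (Tavily vs. Brave) serves it is decided server-side.
    "toolgroups": ["builtin::websearch"],
}

print(json.dumps(agent_config["toolgroups"]))
```

The point of the sketch: nothing client-side names brave_search, so the "brave_search.call(...)" string must originate in the model output or server-side formatting.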
Ways to reproduce:
Prerequisites
Ollama
Podman
Steps
1. Start Ollama
ollama run llama3.2:3b-instruct-fp16 --keepalive 60m
Note that the "type" of the tool is still "brave_search", since Llama models have been trained with brave_search as a builtin tool; Tavily is just being used in lieu of Brave Search.
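The note above can be sketched as a name-mapping step (this is illustrative pseudologic, not llama-stack source code; the mapping dict and function are hypothetical):

```python
# Illustrative sketch: the model emits the builtin tool name "brave_search"
# because that is the name it was trained on, while the server maps that
# name to whichever search provider is actually configured.
SEARCH_TOOL_PROVIDER = {"brave_search": "tavily-search"}  # hypothetical mapping

def dispatch(tool_call_name: str) -> str:
    """Return the provider that actually serves a builtin tool call."""
    return SEARCH_TOOL_PROVIDER.get(tool_call_name, tool_call_name)

# The inference output still *displays* "brave_search", but the request
# is handled by the mapped provider.
print(dispatch("brave_search"))
```

This would explain the observed behavior: the displayed name reflects the model's training-time tool vocabulary, not the configured provider.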
System Info
Llama Stack Version: 0.1.3
OS: macOS
Python: 3.10
Deployment: Podman
Reproducible: Yes, occurs every time a search query is made.
Information
🐛 Describe the bug
Issue Summary
Even when Tavily Search (tavily-search) is explicitly configured as the primary web search provider in AgentConfig, the inference response still displays "brave_search.call()" in the output.
Key Observation (so far):
If the Tavily API key is removed, the system throws an error, confirming that Tavily Search is actually being used in the backend.
Despite this, the inference output still returns "brave_search.call(...)", suggesting that "brave_search" is hardcoded somewhere in the response formatting.
Error logs
if i comment of line 27 about provider api
provider_data = {"tavily_search_api_key": tavily_search_api_key}
it will show that there is a missing api key. (please ignore some intermediate debugging output in the middle start with 🔍🛠️ Warning)if i provide this api key, brave search tool is called according to inference output.
Ways to reproduce:
Prerequisite
Ollama
Podman
Steps
1. Start Ollama
ollama run llama3.2:3b-instruct-fp16 --keepalive 60m
2. Start the Llama Stack server
Open a terminal and start the server (the exact commands follow the getting_started doc).
3. Install the Llama Stack client CLI and SDK
Run the install commands, then test that the client is configured properly and produces the expected output.
Next, run
llama-stack-client models list
and expect to see the list of models you have.
4. Run the web search agent:
python tool_websearch.py
Expected behavior
I expect that if we specify the Tavily search tool, the inference output should clearly display the correct tool being called.
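As a sketch of the desired behavior (the function and its output format are hypothetical, chosen only to mirror the "brave_search.call(...)" string seen today):

```python
def format_tool_call(configured_tool: str, query: str) -> str:
    # Desired behavior: echo the tool that was actually configured and used,
    # rather than the hardcoded training-time name "brave_search".
    return f'{configured_tool}.call(query="{query}")'

# With Tavily configured, the displayed call should name tavily-search.
print(format_tool_call("tavily-search", "latest news"))
```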