
Bug Report: Llama Stack Defaults to Brave Search Even When Tavily Search Is Explicitly Configured #1229

anyuzoey opened this issue Feb 24, 2025 · 3 comments
Labels: bug

@anyuzoey commented Feb 24, 2025

System Info

Llama Stack Version: 0.1.3
OS: macOS
Python: 3.10
Deployment: Podman
Reproducible: Yes, occurs every time a search query is made.

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

Issue Summary

Even when Tavily Search (tavily-search) is explicitly configured as the primary web search provider in AgentConfig, the inference response still displays "brave_search.call()" in the output.

Key observations (so far):

  • If the Tavily API key is removed, the system throws an error, confirming that Tavily Search is actually being used in the backend.
  • Despite this, the inference output still returns "brave_search.call(...)", suggesting that "brave_search" is hardcoded somewhere in the response formatting.
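For context, a minimal sketch of the setup being described, written against the llama-stack-client 0.1.x API as I understand it (the environment variable name and instructions string are placeholders; constructor details may differ slightly from the script linked below):

import os

from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent
from llama_stack_client.types.agent_create_params import AgentConfig

# The Tavily key is passed as provider data; dropping this line reproduces the
# "missing API key" error mentioned above, which is what confirms that the
# tavily-search provider is the one actually executing the search.
client = LlamaStackClient(
    base_url="http://localhost:8321",
    provider_data={"tavily_search_api_key": os.environ["TAVILY_SEARCH_API_KEY"]},
)

agent_config = AgentConfig(
    model="meta-llama/Llama-3.2-3B-Instruct",
    instructions="You are a helpful assistant.",  # placeholder instructions
    toolgroups=["builtin::websearch"],  # backed by tavily-search in the ollama distribution
    enable_session_persistence=False,
)
agent = Agent(client, agent_config)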

Error logs

  1. Official example: Brave search appears in the 2.2 web search agent example of the getting_started doc, even though the Tavily search API key is provided at the very beginning.
  2. The same happens with my custom [websearch code](https://github.com/anyuzoey/llama_stack_starter/blob/main/tool_websearchtest.py):
    If I comment out line 27, which passes the provider API key via provider_data = {"tavily_search_api_key": tavily_search_api_key}, it reports a missing API key. (Please ignore the intermediate debugging output starting with 🔍🛠️ Warning.)
    [screenshot: missing API key error]
    If I provide the API key, the brave search tool is called according to the inference output.
    [screenshot: brave_search call in the inference output]

Ways to reproduce:

Prerequisites

Ollama
Podman

Steps

1. Start Ollama

ollama run llama3.2:3b-instruct-fp16 --keepalive 60m

2. Start up the Llama Stack server

Open a terminal and run:

export INFERENCE_MODEL="meta-llama/Llama-3.2-3B-Instruct"
export LLAMA_STACK_PORT=8321
mkdir -p ~/.llama

Then start the server by running:

podman run --privileged -it \
  -p $LLAMA_STACK_PORT:$LLAMA_STACK_PORT \
  -v ~/.llama:/root/.llama \
  llamastack/distribution-ollama \
  --port $LLAMA_STACK_PORT \
  --env INFERENCE_MODEL=$INFERENCE_MODEL \
  --env OLLAMA_URL=http://host.docker.internal:11434
3. Install the Llama Stack client CLI and SDK

Run the following:

yes | conda create -n stack-client python=3.10
conda activate stack-client

pip install llama-stack-client
pip install python-dotenv

Next, test that the client is configured properly:

llama-stack-client configure --endpoint http://localhost:$LLAMA_STACK_PORT

Expected output:

> Enter the API key (leave empty if no key is needed):
Done! You can now use the Llama Stack Client CLI with endpoint http://localhost:8321

Next, run llama-stack-client models list; you should see a list of the models you have.

4. Run the web search Agent

Run python tool_websearch.py
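For anyone reproducing without the linked repo, the script boils down to roughly the following (continuing from the client/agent sketch above; the session name and prompt are placeholders, and the logging details may vary by client version):

from llama_stack_client.lib.agents.event_logger import EventLogger

session_id = agent.create_session("websearch-test")  # placeholder session name

response = agent.create_turn(
    messages=[{"role": "user", "content": "Search the web: what is Llama Stack?"}],
    session_id=session_id,
)

# The streamed inference output printed here is where "brave_search.call(...)"
# shows up, even though the search itself is executed by the tavily-search provider.
for log in EventLogger().log(response):
    log.print()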

Expected behavior

I expect that if we specify the Tavily search tool, the inference output should clearly display the correct tool being called.

@anyuzoey added the bug label on Feb 24, 2025
@anyuzoey (Author) commented:

Can this issue be assigned to me? I'm figuring this out now.

@terrytangyuan (Collaborator) commented:

Sure!

@Shreyanand commented:

This excerpt may be useful for this issue, from getting_started.ipynb section 2.2:

Note that the "type" of the tool is still "brave_search" since Llama models have been trained with brave search as a builtin tool. Tavily is just being used in lieu of Brave search.
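If it helps while investigating, one way to check which provider actually backs the websearch toolgroup (assuming your client version exposes the toolgroups listing API; the attribute names here are my best guess and may differ):

# On the ollama distribution, builtin::websearch is expected to map to the
# tavily-search provider, while the model still emits brave_search.call(...)
# syntax because that is the builtin tool name it was trained on.
for toolgroup in client.toolgroups.list():
    print(toolgroup.identifier, "->", toolgroup.provider_id)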
