Support multiple tool call functions in remote vLLM inference provider #1120
Labels: enhancement (New feature or request)

Comments
terrytangyuan added a commit that referenced this issue on Feb 15, 2025:
# What does this PR do?

This fixes an issue when running the e2e agent example: https://github.com/meta-llama/llama-stack-apps/blob/main/examples/agents/e2e_loop_with_client_tools.py

```
| File "/home/yutang/repos/llama-stack/llama_stack/providers/remote/inference/vllm/vllm.py", line 175, in _process_vllm_chat_completion_stream_response
|     tool_call = convert_tool_call(choice.delta.tool_calls[0])
| File "/home/yutang/repos/llama-stack/llama_stack/providers/utils/inference/openai_compat.py", line 441, in convert_tool_call
|     return ToolCall(
| File "/home/yutang/.conda/envs/distribution-myenv/lib/python3.10/site-packages/pydantic/main.py", line 214, in __init__
|     validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
| pydantic_core._pydantic_core.ValidationError: 4 validation errors for ToolCall
| call_id
|   Input should be a valid string [type=string_type, input_value=None, input_type=NoneType]
|     For further information visit https://errors.pydantic.dev/2.10/v/string_type
| tool_name.enum[BuiltinTool]
|   Input should be 'brave_search', 'wolfram_alpha', 'photogen' or 'code_interpreter' [type=enum, input_value=None, input_type=NoneType]
|     For further information visit https://errors.pydantic.dev/2.10/v/enum
| tool_name.str
|   Input should be a valid string [type=string_type, input_value=None, input_type=NoneType]
|     For further information visit https://errors.pydantic.dev/2.10/v/string_type
| arguments
|   Input should be a valid dictionary [type=dict_type, input_value=202, input_type=int]
|     For further information visit https://errors.pydantic.dev/2.10/v/dict_type
```

This issue happened because not all arguments had been appended to the tool call buffer yet. The current code assumes that we are ready to convert the tool call whenever the accumulated arguments can be parsed as JSON. In this case, `json.loads("202")` succeeds even though the rest of the arguments have not been streamed yet.

## Test Plan

The e2e example worked successfully (although note that I ran the script twice with each function call separately due to #1120):

```
tool_execution> Tool:get_ticker_data Args:{'ticker_symbol': 'GOOG', 'start': '2023-01-01', 'end': '2023-12-31'}
tool_execution> Tool:get_ticker_data Response:"[{\"('Year', '')\":2023,\"('Close', 'GOOG')\":140.4254455566}]"
tool_execution> Tool:web_search Args:{'query': '42nd president of the United States'}
tool_execution> Tool:web_search Response:"{\"query\": \"42nd president of the United States\", \"top_k\": [{\"title\": \"William J. Clinton | whitehouse.gov\", \"url\": \"https://obamawhitehouse.archives.gov/1600/presidents/williamjclinton\", \"description\": \"<strong>Bill Clinton</strong> is an American politician from Arkansas who served as the 42nd President of the United States (1993-2001). He took office at the end of the Cold War, and was the first baby-boomer generation President.\", \"type\": \"search_result\"}, {\"title\": \"Bill Clinton - Wikipedia\", \"url\": \"https://en.wikipedia.org/wiki/Bill_Clinton\", \"description\": \"<strong>William Jefferson Clinton</strong> (n\\u00e9 Blythe; born August 19, 1946) is an American politician and lawyer who served as the 42nd president of the United States from 1993 to 2001. A member of the Democratic Party, he previously served as the attorney general of Arkansas from 1977 to 1979 and as the ...\", \"type\": \"search_result\"}, [{\"type\": \"video_result\", \"url\": \"https://www.youtube.com/watch?v=eR2z_1-v87Y\", \"title\": \"A Conversation with Bill Clinton, 42nd President of the United ...\", \"description\": \"William Jefferson Clinton, the first Democratic president in six decades to be elected twice, led the United States to the longest economic expansion in Amer...\"}, {\"type\": \"video_result\", \"url\": \"https://www.facebook.com/clintoncenter/videos/january-20-1993-president-clinton-was-sworn-in-as-the-42nd-president-of-the-unit/448417409677375/\", \"title\": \"January 20, 1993, President Clinton was sworn in as the 42nd ...\", \"description\": \"WATCH: On January 20, 1993, President Bill Clinton was sworn in as the 42nd President of the United States. #InaugurationDay Video courtesy of the...\"}, {\"type\": \"video_result\", \"url\": \"https://www.youtube.com/watch?v=vI0HGQqEJh0\", \"title\": \"42nd President of the United States, Bill Clinton, shared thoughts ...\", \"description\": \"AboutPressCopyrightContact usCreatorsAdvertiseDevelopersTermsPrivacyPolicy & SafetyHow YouTube worksTest new features \\u00b7 \\u00a9 2024 Google LLC\"}, {\"type\": \"video_result\", \"url\": \"https://www.youtube.com/shorts/vI0HGQqEJh0\", \"title\": \"42nd President of the United States, Bill Clinton, shared ...\", \"description\": \"Enjoy the videos and music you love, upload original content, and share it all with friends, family, and the world on YouTube.\"}, {\"type\": \"video_result\", \"url\": \"https://www.youtube.com/watch?v=PHihhihVth0\", \"title\": \"Bill & Hillary Clinton returning to Little Rock for 20th ...\", \"description\": \"Enjoy the videos and music you love, upload original content, and share it all with friends, family, and the world on YouTube.\"}]]}"
```

All text inference tests passed.

Signed-off-by: Yuan Tang <[email protected]>
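To make the failure mode described in that PR concrete, here is a minimal, self-contained sketch; the fragment values are hypothetical and this is not the provider's actual code. It shows why "the buffer parses as valid JSON" is an unreliable signal that a streamed tool call's arguments are complete: a bare number such as `202` is itself valid JSON.

```python
import json

# Hypothetical illustration: tool-call argument fragments arrive as streamed
# deltas and are appended to a buffer. Treating "json.loads succeeds" as the
# completion signal converts the tool call too early.
fragments = ['202', '3 more argument data still in flight...']  # made-up deltas

buffer = ""
for fragment in fragments:
    buffer += fragment
    try:
        parsed = json.loads(buffer)
    except json.JSONDecodeError:
        continue  # clearly incomplete, keep accumulating
    # On the first fragment, buffer == "202" and json.loads() returns the
    # integer 202 instead of the final argument dictionary -- matching the
    # ValidationError above, where `arguments` received an int.
    print(repr(parsed))  # -> 202
    break
```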
@terrytangyuan I tried running https://github.com/meta-llama/llama-stack-apps/blob/main/examples/agents/e2e_loop_with_client_tools.py with the Ollama backend and it works with multiple tools. I'm setting up a local vLLM server and connecting llama-stack to it so I can test this with the vLLM backend. I'll update the thread when I can reproduce the bug.
🚀 Describe the new functionality needed
Currently, the remote vLLM inference provider only supports a single tool call function. For example, if you use this example: https://github.com/meta-llama/llama-stack-apps/blob/main/examples/agents/e2e_loop_with_client_tools.py, only the first function passed to the `client_tools` argument in `AgentConfig` will be used.

💡 Why is this needed? What if we don't build it?
Users won't be able to use multiple tool call functions with an agent.
Other thoughts
No response
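To make the requested behaviour concrete, the sketch below shows one possible way to reconstruct several tool calls from a streamed response instead of only the first one. This is a rough illustration with made-up delta dictionaries, not llama-stack or vLLM code; it only assumes the OpenAI-compatible convention that each streamed tool-call delta carries an `index` field.

```python
from collections import defaultdict

# Hypothetical streamed tool-call deltas: each delta names the tool call it
# belongs to via "index", so buffering by index reconstructs every call
# rather than only delta.tool_calls[0].
deltas = [
    {"index": 0, "name": "get_ticker_data", "arguments": '{"ticker_symbol": "GOOG"'},
    {"index": 1, "name": "web_search", "arguments": '{"query": "42nd president'},
    {"index": 0, "name": None, "arguments": ', "start": "2023-01-01", "end": "2023-12-31"}'},
    {"index": 1, "name": None, "arguments": ' of the United States"}'},
]

calls = defaultdict(lambda: {"name": None, "arguments": ""})
for delta in deltas:
    entry = calls[delta["index"]]
    if delta["name"]:
        entry["name"] = delta["name"]
    entry["arguments"] += delta["arguments"]

for index, call in sorted(calls.items()):
    print(index, call["name"], call["arguments"])
# 0 get_ticker_data {"ticker_symbol": "GOOG", "start": "2023-01-01", "end": "2023-12-31"}
# 1 web_search {"query": "42nd president of the United States"}
```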