
Model launched with vLLM does not respond after being connected #45

Open
masktone opened this issue Jan 21, 2025 · 6 comments
Labels
bug Something isn't working

Comments

@masktone

Is there a problem with this configuration?

Image

@codexu
Owner

codexu commented Jan 21, 2025

I checked the vLLM docs; the example there is: curl http://localhost:8000/v1/chat/completions
Following that example, this field should be set to http://localhost:8000/v1
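For reference, a minimal chat-completions request against vLLM's OpenAI-compatible server looks like this (the model name is a placeholder; use whatever was passed to vllm serve):

```sh
# POST to the full /chat/completions path; the app's config field takes only the /v1 base URL
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "Qwen/Qwen2.5-7B-Instruct",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

The OpenAI-compatible client appends /chat/completions itself, which is why the field should contain only the base URL ending in /v1.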

@masktone
Author

> I checked the vLLM docs; the example there is: curl http://localhost:8000/v1/chat/completions
> Following that example, this field should be set to http://localhost:8000/v1

I tried that, and it still doesn't work. Could it be that the app cannot reach a service on another machine? The model is running on a remote server.

@codexu
Owner

codexu commented Jan 21, 2025

Call it locally with Postman and post the result here.

@masktone
Author

> Call it locally with Postman and post the result here.

Image
You can ignore the model name; I launched a different model today.

@codexu
Owner

codexu commented Jan 23, 2025

That looks fine, and there shouldn't be a cross-origin issue either. Do none of the AI features respond at all?

@codexu
Owner

codexu commented Jan 24, 2025

I have confirmed this problem: it does not occur in dev mode, but it does appear after a build, and none of the fixes I tried work.
I found the same problem reported in the Tauri issues: https://github.com/tauri-apps/plugins-workspace/issues/1968. Let's wait for a reply from the Tauri team.
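For anyone hitting the same dev-vs-build discrepancy: one thing worth ruling out (an assumption about a common cause, not a confirmed diagnosis of this issue) is the Content Security Policy, since Tauri injects the configured CSP into bundled assets, so a request that succeeds in dev can be blocked in the built app unless the API origin is allowed under connect-src. A minimal sketch of tauri.conf.json, assuming the Tauri v2 config layout and a vLLM server at localhost:8000:

```json
{
  "app": {
    "security": {
      "csp": "default-src 'self'; connect-src 'self' http://localhost:8000"
    }
  }
}
```

The exact key path and origin are assumptions; adjust them to the app's actual Tauri version and server address.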

@codexu codexu added the bug Something isn't working label Jan 24, 2025