
server defaults to port 8321 ignoring configured server port from yaml #1076

Open
thoraxe opened this issue Feb 13, 2025 · 4 comments · May be fixed by #1105
Labels
bug Something isn't working

Comments


thoraxe commented Feb 13, 2025

System Info

N/A

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

I did a build with venv and modified config.yaml to specify the server port:

```yaml
...
server:
  port: 5001
```

When running, if a port argument isn't specified, llama stack run forcibly sets the default port, and the configured port is ignored:

```
INFERENCE_MODEL=meta-llama/Llama-3.2-1B-Instruct VLLM_URL=http://192.168.1.252:4000/v1 VLLM_API_TOKEN=sk-1234 llama stack run --image-type venv ~/.llama/distributions/remote-vllm/remote-vllm-run.yaml
Using run configuration: /home/thoraxe/.llama/distributions/remote-vllm/remote-vllm-run.yaml
+ python -m llama_stack.distribution.server.server --yaml-config /home/thoraxe/.llama/distributions/remote-vllm/remote-vllm-run.yaml --port 8321
Using config file: /home/thoraxe/.llama/distributions/remote-vllm/remote-vllm-run.yaml
...
...
INFO:     Uvicorn running on http://['::', '0.0.0.0']:8321 (Press CTRL+C to quit)
```

Error logs

n/a

Expected behavior

The server should use the configured port from the yaml rather than forcibly overriding it with the default.

@thoraxe thoraxe added the bug Something isn't working label Feb 13, 2025

leseb commented Feb 13, 2025

At first glance, it seems to me that the CLI flag takes precedence over the config file; that is common behavior. At least, that's what your example shows.

Did I miss something?


thoraxe commented Feb 13, 2025

@leseb in the absence of specifying a CLI flag, llama stack run sets its own CLI flag, so the config yaml will ALWAYS be ignored.

As you can see from my copy/paste, I did not pass a --port flag, but run set one for me anyway.

So perhaps the subject of the issue is not quite accurate -- perhaps it should be:

llama stack run sets a default CLI flag for port which will always override the config
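A minimal sketch of the mechanism being described (hypothetical, not the actual llama stack run code): when a CLI parser assigns a hard-coded default to `--port`, downstream code cannot tell "the user asked for 8321" apart from "no flag was given", so the config file value can never win.

```python
import argparse

def resolve_port(argv, config_port):
    # Hypothetical illustration: argparse fills in the default even when
    # the user passes no flag, so config_port is never consulted.
    parser = argparse.ArgumentParser()
    parser.add_argument("--port", type=int, default=8321)
    args = parser.parse_args(argv)
    return args.port

print(resolve_port([], config_port=5001))  # prints 8321, not 5001
```

With an empty argv the result is still 8321, which is exactly the behavior reported above.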

leseb added a commit to leseb/llama-stack that referenced this issue Feb 14, 2025
Ensure that the options follow the correct priority order:

1. Environment variable (e.g. `LLAMA_STACK_PORT`) takes the highest precedence.
2. CLI flag (`--port`) is considered only if the env variable is unset.
3. Config file value (`config.server.port`) is used as a fallback.
4. Default port (`8321`) is the last resort.

Closes: meta-llama#1076
Signed-off-by: Sébastien Han <[email protected]>
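The four-step priority order from the commit message could be sketched like this (a hypothetical helper for illustration, not the actual patch): using `default=None` on the flag makes "flag not passed" detectable, so each lower-priority source gets a chance.

```python
import argparse
import os

DEFAULT_PORT = 8321

def resolve_port(argv, config_port=None):
    # default=None lets us detect whether --port was actually passed,
    # instead of baking the fallback into the parser itself.
    parser = argparse.ArgumentParser()
    parser.add_argument("--port", type=int, default=None)
    args = parser.parse_args(argv)

    env_port = os.environ.get("LLAMA_STACK_PORT")
    if env_port is not None:        # 1. environment variable
        return int(env_port)
    if args.port is not None:       # 2. CLI flag
        return args.port
    if config_port is not None:     # 3. config file value
        return config_port
    return DEFAULT_PORT             # 4. last-resort default
```

Under this scheme the reporter's `server: port: 5001` would take effect whenever neither the env variable nor an explicit `--port` is set.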
@leseb leseb linked a pull request Feb 14, 2025 that will close this issue

leseb commented Feb 14, 2025

> @leseb in the absence of specifying a CLI flag, llama stack run sets its own CLI flag, so the config yaml will ALWAYS be ignored.
>
> As you can see from my copy/paste, I did not set a CLI flag, but run set one for me, even though I didn't set one.
>
> So perhaps the subject of the issue is not quite accurate -- perhaps it should be:
>
> llama stack run sets a default CLI flag for port which will always override the config

Oh right, thanks for clarifying! I could repro and sent a patch.


thoraxe commented Feb 14, 2025

@leseb nice! glad I was able to clear it up.
