What happened?

When `output_cost_per_token` is set to 0 for a model, the WebUI does not display the output cost as 0, although `input_cost_per_token: 0` is correctly shown as 0.

Config:

```yaml
model_list:
  - model_name: deepseek-r1
    litellm_params:
      model: deepseek/deepseek-reasoner
      api_key: os.environ/DEEPSEEK_API_KEY
      api_base: os.environ/DEEPSEEK_API_BASE
      max_tokens: 8192
  - model_name: deepseek-r1
    litellm_params:
      model: openai/<REDACTED>
      api_base: <REDACTED>
      api_key: <REDACTED>
      input_cost_per_token: 0
      output_cost_per_token: 0
  - model_name: deepseek-r1
    litellm_params:
      model: github/DeepSeek-R1
      api_key: os.environ/GITHUB_API_KEY
      input_cost_per_token: 0
      output_cost_per_token: 0
  - model_name: deepseek-r1
    litellm_params:
      model: openrouter/deepseek/deepseek-r1
      api_key: os.environ/OPENROUTER_API_KEY
      input_cost_per_token: 0.000008
      output_cost_per_token: 0.000008
  - model_name: deepseek-r1
    litellm_params:
      model: perplexity/sonar-reasoning
      api_key: os.environ/PERPLEXITY_API_KEY
      input_cost_per_token: 0.000001
      output_cost_per_token: 0.000005
```
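To narrow this down, here is a minimal sketch that checks what litellm itself computes for one of the zero-cost deployments above, independent of the WebUI. It uses `litellm.register_model` and `litellm.cost_per_token` from litellm's Python API; the token counts are arbitrary, and the exact pricing-dict layout should be treated as an assumption rather than a verified repro.

```python
# Minimal sketch (not a verified repro): check the cost litellm itself
# computes for a model registered with zero output cost, bypassing the WebUI.
import litellm
from litellm import cost_per_token

# Register custom per-token prices, mirroring the github/DeepSeek-R1 entry in
# the config above (dict layout assumed from litellm's custom-pricing docs).
litellm.register_model({
    "github/DeepSeek-R1": {
        "input_cost_per_token": 0,
        "output_cost_per_token": 0,
    }
})

prompt_cost, completion_cost = cost_per_token(
    model="github/DeepSeek-R1",
    prompt_tokens=1000,       # arbitrary token counts for illustration
    completion_tokens=1000,
)
print(prompt_cost, completion_cost)
# If this prints 0.0 0.0, the zero output cost is tracked correctly and the
# problem is presumably in how the WebUI renders it.
```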
Relevant log output
Are you an ML Ops Team?
No
What LiteLLM version are you on?
1.59.9
Twitter / LinkedIn details
No response