The Feature

Bedrock cross-region inference (CRIS) model IDs are not in the pricing map, so cost tracking fails:

```
Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4277, in _get_model_info_helper
    raise ValueError(
        "This model isn't mapped yet. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json"
    )
ValueError: This model isn't mapped yet. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 832, in _response_cost_calculator
    response_cost = litellm.response_cost_calculator(
        **response_cost_calculator_kwargs
    )
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 830, in response_cost_calculator
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 818, in response_cost_calculator
    response_cost = completion_cost(
        completion_response=response_object,
        ...<6 lines>...
        prompt=prompt,
    )
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 768, in completion_cost
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 747, in completion_cost
    ) = cost_per_token(
        ~~~~~~~~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
        ...<13 lines>...
        audio_transcription_file_duration=audio_transcription_file_duration,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 287, in cost_per_token
    model_info = _cached_get_model_info_helper(
        model=model, custom_llm_provider=custom_llm_provider
    )
  File "/usr/lib/python3.13/site-packages/litellm/caching/_internal_lru_cache.py", line 25, in wrapped
    raise result[1]
  File "/usr/lib/python3.13/site-packages/litellm/caching/_internal_lru_cache.py", line 18, in wrapper
    return ("success", f(*args, **kwargs))
                       ~^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4155, in _cached_get_model_info_helper
    return _get_model_info_helper(model=model, custom_llm_provider=custom_llm_provider)
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4384, in _get_model_info_helper
    raise Exception(
        ...<3 lines>...
    )
Exception: This model isn't mapped yet. model=bedrock/us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0, custom_llm_provider=bedrock. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json.
```

Please add a pricing entry for the CRIS model ID:

```
us.anthropic.claude-3-haiku-20240307-v1:0
```

It should be the same price as the on-demand (OD) model ID.
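For context, deriving the on-demand ID from a CRIS inference profile ID is mechanical: the profile ID is the base model ID with a geographic prefix. A minimal sketch of that mapping (hypothetical helper, not litellm's actual implementation; the prefix list is an assumption):

```python
# Hypothetical helper: map a Bedrock cross-region inference (CRIS) profile ID
# back to the on-demand (OD) base model ID by stripping the geo prefix, so an
# existing pricing entry could be reused for the CRIS variant.
CRIS_GEO_PREFIXES = ("us.", "eu.", "apac.")  # assumed set of geo prefixes


def base_model_id(model_id: str) -> str:
    """Return the OD model ID for a CRIS profile ID, or the ID unchanged."""
    for prefix in CRIS_GEO_PREFIXES:
        if model_id.startswith(prefix):
            return model_id[len(prefix):]
    return model_id


print(base_model_id("us.anthropic.claude-3-haiku-20240307-v1:0"))
# -> anthropic.claude-3-haiku-20240307-v1:0
```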
Motivation, pitch
AWS is asking its customers to move toward CRIS (cross-region inference).
Are you a ML Ops Team?
No
Twitter / LinkedIn details
No response
Repro:

```python
resp = litellm.completion(
    model="bedrock/us.anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    aws_region_name="us-east-1",
    mock_response="Hello, how are you?",
)
assert resp._hidden_params["response_cost"] > 0
```