[BUG] OpenAILike additional params are not supported #1077

Open
cyx441984694 opened this issue Jan 29, 2025 · 1 comment

cyx441984694 commented Jan 29, 2025

Describe the bug
OpenAILike additional params, such as the is_chat_model param, are not supported.

To Reproduce
Configure the following llm module and pass the is_chat_model param as shown:

    - node_type: generator
      modules:
        - batch: 2
          module_type: llama_index_llm
          llm: openailike
          model: qwen-plus
          is_chat_model: True
          api_base: https://dashscope.aliyuncs.com/compatible-mode/v1
          api_key: x

Running this produces a 404 error on the '/v1/completions' endpoint. The error occurs because the is_chat_model param is not passed to the model, so the request goes to /v1/completions instead of /v1/chat/completions; if the model does not support the /v1/completions endpoint, the call fails.
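For reference, the YAML above is meant to end up as roughly the following LlamaIndex client (a minimal sketch, assuming the llama-index-llms-openai-like package; model, endpoint, and key are the placeholders from the config above):

    from llama_index.llms.openai_like import OpenAILike

    # Intended result of the config: with is_chat_model=True the client should
    # call /v1/chat/completions instead of /v1/completions.
    llm = OpenAILike(
        model="qwen-plus",
        api_base="https://dashscope.aliyuncs.com/compatible-mode/v1",
        api_key="x",
        is_chat_model=True,
    )
    print(llm.complete("Hello"))  # routed through the chat endpoint when is_chat_model=True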

Additional context
The root cause may be that the llm_instance params are collected from llm_class.__init__ instead of from the llm_class itself.

autorag/nodes/generator/llama_index_llm.py line 52 __init__ method

self.llm_instance: BaseLLM = llm_class(**pop_params(llm_class.__init__, kwargs))

When using OpenAILike, this picks up only the params declared on OpenAI's __init__ and ignores the OpenAILike-specific params such as is_chat_model.
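This can be checked by comparing the two signatures (a rough sketch; the exact result depends on the installed llama-index and pydantic versions):

    import inspect
    from llama_index.llms.openai_like import OpenAILike

    # is_chat_model is declared as a Pydantic field on OpenAILike, not as an
    # explicit argument of the inherited __init__, so the two signatures differ.
    init_params = set(inspect.signature(OpenAILike.__init__).parameters)
    class_params = set(inspect.signature(OpenAILike).parameters)

    # If the suspicion above is right, this prints False then True.
    print("is_chat_model" in init_params)
    print("is_chat_model" in class_params)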

Possible fix
Stupid but quick fix...

    # OpenAILike declares its extra params (e.g. is_chat_model) as Pydantic
    # fields, so also pull params from the class signature, not only __init__.
    llm_openailike_params = None
    if llm_class.class_name() == "OpenAILike":
        llm_openailike_params = pop_params(llm_class, kwargs)
    original_llm_params = pop_params(llm_class.__init__, kwargs)
    if llm_openailike_params is not None:
        llm_params = {**llm_openailike_params, **original_llm_params}
    else:
        llm_params = original_llm_params
    self.llm_instance: BaseLLM = llm_class(**llm_params)
@cyx441984694 cyx441984694 added the bug Something isn't working label Jan 29, 2025
@vkehfdl1
Copy link
Contributor

Thank you for reporting the bug!
It looks like pop_params does not work correctly with Pydantic classes. I might change it to support them.
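For example, a rough sketch of what that support could look like (the real pop_params signature and behavior may differ; model_fields assumes Pydantic v2):

    import inspect
    from pydantic import BaseModel

    def pop_params(func_or_cls, kwargs: dict) -> dict:
        # Parameter names visible on the callable's signature...
        names = set(inspect.signature(func_or_cls).parameters)
        # ...plus, for Pydantic model classes, their declared fields, so that
        # fields like is_chat_model are not silently dropped.
        if inspect.isclass(func_or_cls) and issubclass(func_or_cls, BaseModel):
            names |= set(func_or_cls.model_fields)
        return {k: v for k, v in kwargs.items() if k in names}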

@vkehfdl1 vkehfdl1 self-assigned this Jan 30, 2025
@vkehfdl1 vkehfdl1 added the AutoRAG Core From the core framework of AutoRAG label Jan 30, 2025