
AI pages does not work with OLLAMA #509

Open
fabricemrchl opened this issue Feb 25, 2025 · 2 comments
Labels
bug Something isn't working

Comments

@fabricemrchl

Describe the bug
AI chat does not work with OLLAMA.
When using OLLAMA as the AI provider, the chat does not display the answer from OLLAMA.

It works with OpenAI API.

To Reproduce
Steps to reproduce the behavior:

  1. Go to the 'AI page'
  2. Type something in the chat and send the message
  3. See the error: the AI answer is not displayed

Expected behavior
AI answer displayed in the chat. Works with OpenAI API.

Screenshots
Screenshot attached


Additional context
The OLLAMA instance runs locally. The connection test succeeds in the admin panel. URL used: https://ollama_server_hostname/api/


dosubot bot commented Feb 25, 2025

@fabricemrchl Hi! I'm here to help you with your issue.

The issue you're experiencing, where OLLAMA responses are not displayed in the AI chat interface, is likely related to how responses from OLLAMA are processed or rendered in the codebase. Here are a few areas to investigate:

  1. Response Processing: Ensure that the response from OLLAMA is being correctly processed in the onInputSubmit method of the AiStore class. This method is responsible for handling the AI response and updating the chat interface [1].

  2. Error Handling: Check if there are any errors being thrown or logged when using OLLAMA. The codebase uses TRPCError for handling errors, which might provide insights if something is going wrong during the request or response handling [2].

  3. Configuration and Headers: Verify that the API request to OLLAMA is correctly constructed, including any necessary headers or parameters. Although specific details about the OLLAMA request construction weren't found, ensuring that the request matches OLLAMA's expected format is crucial.

  4. Compatibility and Logging: Look for any logging or debugging information that might indicate issues specific to OLLAMA. This could help identify if there are compatibility issues or if additional configurations are needed.

By focusing on these areas, you might be able to identify the root cause of the issue with OLLAMA.
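Since the chat works with OpenAI but fails with OLLAMA, one plausible culprit (an assumption, not something confirmed in this codebase) is the streaming format: Ollama's `/api/chat` endpoint streams newline-delimited JSON objects, not OpenAI-style `data: ...` server-sent events, so a client expecting SSE may silently render nothing. A minimal sketch of assembling an Ollama-style stream, using the documented `message.content` / `response` / `done` fields:

```python
import json

def parse_ollama_stream(raw: str) -> str:
    """Assemble the full answer from Ollama's newline-delimited
    JSON stream (one JSON object per line, unlike OpenAI's SSE)."""
    answer = []
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue
        chunk = json.loads(line)
        # /api/chat puts text under message.content;
        # /api/generate uses a top-level "response" field instead.
        part = chunk.get("message", {}).get("content") or chunk.get("response", "")
        answer.append(part)
        if chunk.get("done"):
            break
    return "".join(answer)

# Sample of what a streamed /api/chat response looks like on the wire:
sample = (
    '{"message": {"content": "Hel"}, "done": false}\n'
    '{"message": {"content": "lo!"}, "done": true}\n'
)
print(parse_ollama_stream(sample))  # Hello!
```

If the app's response handler only matches the OpenAI shape, a quick test like this against the raw OLLAMA payload would confirm whether the answer text is actually arriving but never being parsed.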


@moirk

moirk commented Feb 25, 2025

Same issue for me

@dosubot dosubot bot added the bug Something isn't working label Feb 25, 2025