
Add nest_asyncio to fix ollama embed 'event loop closed' error #7625

Open · wants to merge 1 commit into base: main

Conversation


@tsmdt tsmdt commented Jan 8, 2025

Title

Use nest_asyncio to fix litellm.APIConnectionError: Event loop is closed when using Ollama embeddings.

Relevant issues

Type

🐛 Bug Fix

Changes

Added the nest_asyncio library so that asynchronous code can run inside environments that already have an active event loop, fixing the litellm.APIConnectionError: Event loop is closed.
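For context, the failure mode can be reproduced with the standard library alone: once an event loop has been closed after the first request, any later call that tries to drive a coroutine on it raises exactly this error. A minimal sketch, where fetch_embedding is a hypothetical stand-in for the real async Ollama request, not actual litellm code:

```python
import asyncio

async def fetch_embedding(text: str) -> list[float]:
    # Hypothetical stand-in for the real async Ollama HTTP call.
    await asyncio.sleep(0)
    return [0.0] * 3

# Buggy pattern: cache one loop, close it, then reuse it later.
loop = asyncio.new_event_loop()
first = loop.run_until_complete(fetch_embedding("Research"))
loop.close()

try:
    loop.run_until_complete(fetch_embedding("Research Data"))
except RuntimeError as e:
    print(e)  # -> Event loop is closed

# Safe per-call pattern: asyncio.run creates and tears down a fresh loop.
second = asyncio.run(fetch_embedding("Research Data"))
```

nest_asyncio addresses a variant of this where a loop is already running and cannot simply be replaced; the sketch above only illustrates why the "closed loop" state is unrecoverable without creating or patching a loop.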

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

After the fix, this code works:

import litellm

test_strings = ['Research Data Charta', 'Research', 'Research Data']

for t in test_strings:
    response = litellm.embedding(input=t, model='ollama/nomic-embed-text:latest')
    print(response)
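The nest_asyncio dependency matters for the opposite direction as well: when litellm's synchronous embedding() is called from code that is itself running inside an event loop (a Jupyter notebook, for instance), plain asyncio.run() refuses to start, and nest_asyncio.apply() patches the loop so it can be re-entered. A stdlib-only sketch of that refusal, where embed_sync is a hypothetical wrapper and not the actual litellm internals:

```python
import asyncio

async def fetch_embedding(text: str) -> list[float]:
    # Hypothetical stand-in for the real async Ollama request.
    await asyncio.sleep(0)
    return [0.0] * 3

def embed_sync(text: str) -> list[float]:
    # Sync facade over the async client, like litellm.embedding().
    return asyncio.run(fetch_embedding(text))

async def caller():
    # Calling the sync facade while a loop is already running fails;
    # nest_asyncio.apply() is one way to make this nesting legal.
    try:
        return embed_sync("Research Data Charta")
    except RuntimeError as e:
        return str(e)

result = asyncio.run(caller())
print(result)  # -> asyncio.run() cannot be called from a running event loop
```

This is why the error tends to surface only in notebook or server environments that keep a loop alive between calls.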


vercel bot commented Jan 8, 2025

Vercel preview deployment for litellm: ✅ Ready (updated Jan 8, 2025 7:53am UTC)

@shanbady

👍 this resolved the issue I was seeing when using ollama

Development

Successfully merging this pull request may close these issues.

[Bug]: Ollama as custom provider does not default to sync by default