
Can agent use local model? #56

Open
dongdongzhaoUP opened this issue Dec 25, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

@dongdongzhaoUP

In addition to API usage, can agents be built on local models?

@dongdongzhaoUP dongdongzhaoUP added the enhancement New feature or request label Dec 25, 2024
@josh-ashkinaze
Owner

Hi, thanks for the question!

This is something we may do more on in future versions. Officially, no (since we didn't write tests or docs for this), but technically yes: you can do this very easily using Ollama.

Here's a quick example for you.

  1. Download Ollama if you haven't already installed it
  2. ollama serve # starts the Ollama server
  3. ollama pull gemma:2b # downloads gemma:2b
  4. ollama run gemma:2b # runs the model locally

Now, in Python:

from plurals.agent import Agent

a = Agent(task='Say hello', model="ollama/gemma:2b",
          kwargs={'api_base': "http://localhost:11434"})
a.process()

# 'I am a large language model, trained by Google. I am a conversational AI that can engage in natural language conversations and provide information and insights.'
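Before calling `a.process()`, it can help to confirm that the local Ollama server is actually reachable, since a missing server is the most common failure mode. Here is a minimal standard-library sketch; `ollama_available` is a hypothetical helper, not part of plurals or Ollama:

```python
import urllib.request
import urllib.error


def ollama_available(base_url="http://localhost:11434", timeout=2.0):
    """Return True if something answers HTTP requests at base_url.

    Ollama's root endpoint replies with a plain 200 response when the
    server is up, so a successful request is a good-enough liveness check.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, timeout, DNS failure, etc. -> server not up
        return False


if __name__ == "__main__":
    if not ollama_available():
        print("Ollama is not reachable; start it with `ollama serve` first.")
```

If this prints the warning, start the server and pull the model before constructing the Agent.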

@dongdongzhaoUP
Author

Thanks for your reply. It is helpful, and I hope the project keeps getting better!
