Replies: 2 comments 13 replies
-
I also tried the home-3b model now; same thing, pretty much.
-
Yes, you get that with other models too, or with a modified prompt.
What model did you try before?
-
Title says it all: I'm using a local Ollama API to run my LLM, which works fine if I'm just asking a question. But if I ask it to turn on a light, for example, it "says" it turned on the light, yet nothing actually happens. I'm using just the phi:latest model for now, just to get things working. Any help would be appreciated!
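For context, a likely explanation (sketched here under assumptions, not the integration's actual code): the assistant only acts when the model's reply contains a machine-readable service call that the integration can parse out and execute; plain prose like "I turned on the light" is never executed. A generic model such as phi tends to answer only in prose, while a model fine-tuned for this task (e.g. home-3b) is trained to emit a structured block. The function name `extract_service_call` and the `homeassistant`-fenced JSON convention below are illustrative assumptions:

```python
import json
import re

def extract_service_call(reply: str):
    """Pull a JSON service call out of the model's reply, if present.

    Hypothetical parser: looks for a ```homeassistant fenced block
    containing a single JSON object. Returns None if the reply is
    prose only, in which case nothing would be executed.
    """
    match = re.search(r"```homeassistant\s*(\{.*?\})\s*```", reply, re.DOTALL)
    if match is None:
        return None  # prose only: no action fires
    return json.loads(match.group(1))

# A generic model (e.g. phi) usually answers in prose:
prose_reply = "Sure! I have turned on the living room light for you."
print(extract_service_call(prose_reply))  # prints: None

# A fine-tuned model is trained to emit the structured block instead:
tool_reply = (
    "Turning on the light.\n"
    "```homeassistant\n"
    '{"service": "light.turn_on", "target_device": "light.living_room"}\n'
    "```"
)
print(extract_service_call(tool_reply))
```

If the extracted call is `None`, the integration has nothing to send to Home Assistant, which would match the symptom of the model "saying" it acted while no light changes state.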