
lm studio #91

Open
Shehab-Ecstasyandfire opened this issue Jun 5, 2024 · 1 comment

Comments

@Shehab-Ecstasyandfire

The instructions aren't clear on how to run it against an LM Studio server.


greggft commented Aug 12, 2024

According to Mervin Praison in his YouTube video "RawDog: How I Cut My Work Hours in Half with Context-Aware AI?" (https://www.youtube.com/watch?v=5r86vtZcvJ4), the command takes the form:

rawdog --llm-base-url http://:1234/v1 --llm-custom-provider openai --llm-model

Because the provider is openai, this forces rawdog to require an OpenAI API key.

Due to a bug in rawdog, you can NOT specify the key on the command line; it will still complain even if you do.

To make this work, you must set the OpenAI key environment variable (adjust the syntax for your operating system):

export OPENAI_API_KEY=XXXXXXXXXXXXXXXXXXXXXXXXX

The number of characters doesn't seem to matter.

The rawdog command should work now.

Example:
export OPENAI_API_KEY=XXXXXXXXXXXXXXXXXXXXXXXXX
rawdog --retries 1 --llm-base-url http://192.168.1.38:1234/v1 --llm-custom-provider openai --llm-model Orenguteng/Llama-3.1-8B-Lexi-Uncensored-GGUF count the number of files in the directory
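The key-setting step above can be sanity-checked in isolation. This is a minimal sketch; the value "lm-studio" is an arbitrary placeholder I chose, since per the comment above the contents of the key don't seem to matter, only that the variable is set.

```shell
# rawdog's openai provider only checks that OPENAI_API_KEY is set;
# LM Studio does not validate the value, so any placeholder works.
export OPENAI_API_KEY=lm-studio   # arbitrary placeholder value

# Confirm the variable is set in the current shell environment
# (rawdog reads it from the environment at startup).
echo "$OPENAI_API_KEY"
```

Before running rawdog itself, you can also confirm the LM Studio server is reachable by sending a request to its OpenAI-compatible endpoint, e.g. `curl http://192.168.1.38:1234/v1/models` (using the host and port from the example above).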
