According to Mervin Praison in his YouTube video "RawDog: How I Cut My Work Hours in Half with Context-Aware AI?" (https://www.youtube.com/watch?v=5r86vtZcvJ4), the command to run rawdog against a local OpenAI-compatible server looks like:
rawdog --llm-base-url http://:1234/v1 --llm-custom-provider openai --llm-model
Because the custom provider is set to openai, rawdog requires an OpenAI API key. Due to a bug in rawdog, passing the key on the command line does not help; it still complains. To make this work you must set the OpenAI key as an environment variable (adjust the syntax for your operating system):
export OPENAI_API_KEY=XXXXXXXXXXXXXXXXXXXXXXXXX
I believe the number of characters in the key doesn't matter; a placeholder value is accepted.
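For reference, the Windows equivalents would be (a sketch; the key value is just a placeholder):

set OPENAI_API_KEY=XXXXXXXXXXXXXXXXXXXXXXXXX          (cmd.exe)
$env:OPENAI_API_KEY="XXXXXXXXXXXXXXXXXXXXXXXXX"       (PowerShell)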
The rawdog command should work now
example:
export OPENAI_API_KEY=XXXXXXXXXXXXXXXXXXXXXXXXX
rawdog --retries 1 --llm-base-url http://192.168.1.38:1234/v1 --llm-custom-provider openai --llm-model Orenguteng/Llama-3.1-8B-Lexi-Uncensored-GGUF count the number of files in the directory
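As a quick sanity check before running rawdog (a sketch; the IP and port are taken from the example above, and it assumes LM Studio's local server is exposing its OpenAI-compatible endpoint), you can confirm the server is reachable:

curl http://192.168.1.38:1234/v1/models

If that returns the list of loaded models, --llm-base-url is pointing at the right place. The LM Studio local server does not appear to validate the API key, which seems to be why a placeholder value works.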
The instructions aren't clear on how to run rawdog against an LM Studio server.