AM i dumb? #48
Comments
I suggest you create a specific user for ollama, run ollama, and pull some models. Get this project from git into that user's home folder, then cd into the directory where the run.py file is located. First `sudo nano .env` and paste in the necessary environment variables, like the Telegram bot key and the admin user ID for Telegram. Then run the script as: `nohup python3 run.py > output.txt 2>&1 &` and just exit. I created a cron job to handle fresh reboots and potential updates around 3am, which then just runs the script automatically as the right user with the nohup command from above. (I also keep the output of the previous day and save it with a date appended to it in an archive dir I created myself.) |
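Not the poster's actual cron, but the nightly rotate-and-restart described above could be sketched like this; the directory layout, user, and script name are assumptions, not from the project:

```shell
#!/bin/sh
# Sketch of the ~3am cron job described above: archive yesterday's log
# with the date appended, then relaunch the bot detached with nohup.
# All paths and names here are assumptions, not from the project.

rotate_and_restart() {
    bot_dir="$1"                       # directory containing run.py
    archive_dir="$bot_dir/archive"     # archive dir created manually
    mkdir -p "$archive_dir"

    # Keep the previous day's output, date appended.
    if [ -f "$bot_dir/output.txt" ]; then
        mv "$bot_dir/output.txt" "$archive_dir/output_$(date +%F).txt"
    fi

    # Relaunch exactly as in the comment above; the shell can then exit.
    ( cd "$bot_dir" && nohup python3 run.py > output.txt 2>&1 & )
}

# Hypothetical crontab entry (edit with `crontab -e` as the bot user):
# 0 3 * * * /home/ollamabot/restart_bot.sh
```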
Hello, can you share the cron? |
I wanted to just make a systemd service, same thing as the nohup command, just a different methodology: `Restart=always`, and I guess the dedicated-user thing might help. I don't get why I seem to be the only one getting this damn error. Maybe my pyenv is messed up. Will report back. |
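A unit along those lines could look like this; the unit name, user, and paths are hypothetical, and `ExecStart` would need to point at your pyenv interpreter if you use one:

```
# /etc/systemd/system/ollama-bot.service  (hypothetical name and paths)
[Unit]
Description=Ollama Telegram bot
After=network-online.target

[Service]
Type=simple
User=ollamabot
WorkingDirectory=/home/ollamabot/project
ExecStart=/usr/bin/python3 run.py
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now ollama-bot` instead of running it via nohup or as sudo.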
Maybe I should run the bot as sudo? Does the bot have issues running on Arch? |
In case this helps, this is in my .env:
|
Oh wow, I'm stupid, I put localhost in without seeing it was already provided for me. MY GOD I'M DUMB. WAIT, no I'm not, that line was there and I uncommented it. Oops. The localhost one. Yeah, I want to just `python run.py`. I don't have to `python bot/run.py` for working-directory reasons, do I? |
ClientConnectorError: Cannot connect to host localhost:11435 ssl:default [Connection refused]

The above exception was the direct cause of the following exception:

Traceback (most recent call last): |
I can't seem to get this working. I did `ollama pull dolphin-mixtral:latest` and it pulled it. Then, in a tmux instance, I do `ollama serve`, and then I try to use my bot with dolphin-mixtral:latest in the INIT environment setting in the .env file, yet it says 0 models when I hit settings????
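A note for anyone hitting the same pair of symptoms: the traceback above shows the bot trying port 11435, while Ollama's API listens on 11434 by default, so a wrong host/port in `.env` can produce both the connection refusal and the "0 models" display. A minimal reachability check, assuming a default local install:

```shell
#!/bin/sh
# Check whether the Ollama API answers at a given base URL.
# Port 11434 is Ollama's default; the bot above was pointed at 11435.
ollama_up() {
    curl -s --max-time 2 "$1/api/tags" > /dev/null
}

if ollama_up "http://localhost:11434"; then
    # /api/tags is also the endpoint that lists pulled models, so
    # "0 models" in the bot usually means it is querying the wrong
    # host:port rather than a failed `ollama pull`.
    echo "ollama reachable on 11434"
else
    echo "connection refused - check the host/port in your .env"
fi
```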