AM i dumb? #48

Open
harryeffinpotter opened this issue May 7, 2024 · 7 comments

Comments

@harryeffinpotter

I can't seem to get this working. I did `ollama pull dolphin-mixtral:latest` and it pulled the model, then I started `ollama serve` in a tmux instance, then I try to use my bot with dolphin-mixtral:latest as the INIT model setting in the .env file, yet it says 0 models when I hit settings?
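
A quick sanity check at this point, assuming a stock `ollama serve` on the same machine (Ollama's API listens on port 11434 by default), is to query the model-list endpoint directly and confirm the pulled model shows up. If this works but the bot still reports 0 models, the bot is most likely pointing at the wrong host or port rather than missing the model:

```bash
# Ask the local Ollama API which models it has (default port 11434; adjust if you changed it)
curl http://localhost:11434/api/tags
```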

@nierto
Contributor

nierto commented May 18, 2024

I suggest you create a specific user for ollama, run ollama, and pull some models. Clone this project from git into your home folder and then cd into the directory where the run.py file is located. First `sudo nano .env` and copy-paste the necessary environment variables (the Telegram bot key, the admin user ID for Telegram, etc.). Then run the script as:

nohup python3 run.py > output.txt 2>&1 &

then just exit. I created a cron to handle fresh reboots and potential updates around 3am, and to then run the script automatically as the right user with the nohup command from above. (I also keep the previous day's output, saved with the date appended, in an archive dir I created myself.)
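
A minimal sketch of that kind of crontab, assuming a dedicated bot user and a checkout under `~/ollama-telegram/bot` (the user, paths, and `archive/` directory are placeholders, not the actual setup):

```bash
# Hypothetical crontab for the dedicated bot user (edit with `crontab -e`).

# Start the bot after a reboot:
@reboot cd $HOME/ollama-telegram/bot && nohup python3 run.py > output.txt 2>&1 &

# Around 3am: stop the bot (-xf matches the exact command line only, so the cron job's
# own shell is not killed), archive yesterday's log (cron needs % escaped), pull updates,
# and restart:
0 3 * * * cd $HOME/ollama-telegram/bot && pkill -xf "python3 run.py"; mv output.txt archive/output-$(date -d yesterday +\%F).txt; git pull -q; nohup python3 run.py > output.txt 2>&1 &
```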

@gabyavra

Hello, can you share the cron?
Also, does it use the GPU by default? It seems quite slow, and when I run nvidia-smi the GPU does not seem fully used. Thank you.

@harryeffinpotter
Author

> I suggest you create a specific user for ollama, run ollama, and pull some models. Clone this project from git into your home folder and then cd into the directory where the run.py file is located. First `sudo nano .env` and copy-paste the necessary environment variables (the Telegram bot key, the admin user ID for Telegram, etc.). Then run the script as:
>
> nohup python3 run.py > output.txt 2>&1 &
>
> then just exit. I created a cron to handle fresh reboots and potential updates around 3am, and to then run the script automatically as the right user with the nohup command from above. (I also keep the previous day's output, saved with the date appended, in an archive dir I created myself.)

I wanted to just make a systemd service, same thing as the nohup command, just a different methodology.

Restart=always, and I guess the dedicated user might help. I don't get why I seem to be the only one getting this damn error. Maybe my pyenv is messed up. Will report back.
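
For reference, a rough sketch of that systemd approach, assuming a dedicated `ollamabot` user, a checkout under that user's home, and Ollama itself running as the `ollama.service` unit (the unit name, user, and paths are placeholders):

```ini
# /etc/systemd/system/ollama-bot.service  (hypothetical name and paths)
[Unit]
Description=Ollama Telegram bot
Wants=network-online.target
After=network-online.target ollama.service

[Service]
User=ollamabot
WorkingDirectory=/home/ollamabot/ollama-telegram/bot
ExecStart=/usr/bin/python3 run.py
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Then `sudo systemctl daemon-reload && sudo systemctl enable --now ollama-bot`, and watch it with `journalctl -u ollama-bot -f`.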

@harryeffinpotter
Author

Maybe I should run the bot as sudo? Does the bot have issues running on Arch?

@harryeffinpotter
Author

In case this helps, this is in my .env:


```
# UNCOMMENT ONE OF THE FOLLOWING LINES:
OLLAMA_BASE_URL=localhost # to run ollama without docker, using run.py
# OLLAMA_BASE_URL=ollama-server # to run ollama in a docker container
# OLLAMA_BASE_URL=host.docker.internal # to run ollama locally

# Log level
# https://docs.python.org/3/library/logging.html#logging-levels
LOG_LEVEL=DEBUG
```

@harryeffinpotter
Author

harryeffinpotter commented Jul 17, 2024

Oh wow, I'm stupid: I put in localhost without seeing it was already provided for me. MY GOD I'M DUMB.

Wait, no I'm not. That line was already there and I uncommented it (the localhost one). Oops. Yeah, I want to just `python run.py`.

I don't have to run `python bot/run.py` for working-directory reasons, do I?

@harryeffinpotter
Author

ClientConnectorError: Cannot connect to host localhost:11435 ssl:default [Connection refused]
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/connector.py", line 1025, in _wrap_create_connection
    return await self._loop.create_connection(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "uvloop/loop.pyx", line 2038, in create_connection
  File "uvloop/loop.pyx", line 2015, in uvloop.loop.Loop.create_connection
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/dispatcher.py", line 309, in _process_update
    response = await self.feed_update(bot, update, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/dispatcher.py", line 158, in feed_update
    response = await self.update.wrap_outer_middleware(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/middlewares/error.py", line 25, in __call__
    return await handler(event, data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/middlewares/user_context.py", line 27, in __call__
    return await handler(event, data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/fsm/middleware.py", line 41, in __call__
    return await handler(event, data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/event/telegram.py", line 121, in trigger
    return await wrapped_inner(event, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/event/handler.py", line 43, in call
    return await wrapped()
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/dispatcher.py", line 276, in _listen_update
    return await self.propagate_event(update_type=update_type, event=event, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/router.py", line 116, in propagate_event
    return await observer.wrap_outer_middleware(_wrapped, event=event, data=kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/router.py", line 111, in _wrapped
    return await self._propagate_event(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/router.py", line 136, in _propagate_event
    response = await observer.trigger(event, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/event/telegram.py", line 121, in trigger
    return await wrapped_inner(event, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiogram/dispatcher/event/handler.py", line 43, in call
    return await wrapped()
           ^^^^^^^^^^^^^^^
  File "/home/ysg/ollama/bot/run.py", line 103, in modelmanager_callback_handler
    models = await model_list()
             ^^^^^^^^^^^^^^^^^^
  File "/home/ysg/ollama/bot/func/functions.py", line 37, in model_list
    async with session.get(url) as response:
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/client.py", line 1197, in __aenter__
    self._resp = await self._coro
                 ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/client.py", line 581, in _request
    conn = await self._connector.connect(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/connector.py", line 544, in connect
    proto = await self._create_connection(req, traces, timeout)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/connector.py", line 944, in _create_connection
    _, proto = await self._create_direct_connection(req, traces, timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/connector.py", line 1257, in _create_direct_connection
    raise last_exc
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/connector.py", line 1226, in _create_direct_connection
    transp, proto = await self._wrap_create_connection(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/aiohttp/connector.py", line 1033, in _wrap_create_connection
    raise client_error(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorError: Cannot connect to host localhost:11435 ssl:default [Connection refused]
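
Note that the traceback shows the bot trying `localhost:11435`, while a stock `ollama serve` listens on 11434 by default. So either Ollama isn't running when the bot makes the request, or the port the bot is configured with doesn't match the one Ollama is actually bound to. A quick way to tell which (plain Linux tooling, nothing bot-specific):

```bash
# What is actually listening on the two candidate ports?
ss -ltnp | grep -E ':1143[45]'
# Does the Ollama API answer on its default port and list your models?
curl http://localhost:11434/api/tags
```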
