
Refactor/config #46

Merged
merged 7 commits into from
Feb 5, 2025
43 changes: 33 additions & 10 deletions protollm_tools/llm-api/README.md
@@ -105,15 +105,21 @@ These variables must be configured and synchronized with the LLM-core system:

### Example `.env` File
```env
# API
CELERY_BROKER_URL=amqp://admin:[email protected]:5672/
CELERY_RESULT_BACKEND=redis://127.0.0.1:6379/0
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_PREFIX=llm-api
RABBIT_MQ_HOST=rabbitmq
RABBIT_MQ_PORT=5672
RABBIT_MQ_LOGIN=admin
RABBIT_MQ_PASSWORD=admin
QUEUE_NAME=llm-api-queue
WEB_RABBIT_MQ=15672
API_PORT=6672

# RabbitMQ
RABBITMQ_DEFAULT_USER=admin
RABBITMQ_DEFAULT_PASS=admin
```
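As an illustration of how an application might consume these variables, the stdlib-only sketch below reads them with defaults matching the example file. The `AppConfig` dataclass is a hypothetical stand-in for illustration only, not the package's actual `Config.read_from_env()` implementation:

```python
import os
from dataclasses import dataclass


@dataclass
class AppConfig:
    """Hypothetical stand-in for the package's Config; fields mirror the .env example."""
    redis_host: str
    redis_port: int
    rabbit_mq_host: str
    rabbit_mq_port: int
    queue_name: str

    @classmethod
    def read_from_env(cls) -> "AppConfig":
        # Fall back to the documented example values when a variable is unset
        return cls(
            redis_host=os.environ.get("REDIS_HOST", "redis"),
            redis_port=int(os.environ.get("REDIS_PORT", "6379")),
            rabbit_mq_host=os.environ.get("RABBIT_MQ_HOST", "rabbitmq"),
            rabbit_mq_port=int(os.environ.get("RABBIT_MQ_PORT", "5672")),
            queue_name=os.environ.get("QUEUE_NAME", "llm-api-queue"),
        )


config = AppConfig.read_from_env()
print(config.redis_port)  # 6379 unless REDIS_PORT is overridden in the environment
```

Note that the ports here must stay in sync with the mappings in `docker-compose.yml`, which is the point of centralizing them in one `.env` file.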

---
@@ -161,14 +167,31 @@ Below is the architecture diagram for the interaction between API, RabbitMQ, LLM
### Running the API
1. Configure environment variables in the `.env` file.
2. Start the API using:
```python
app = FastAPI()

config = Config.read_from_env()

app.include_router(get_router(config))
```

### Running the API Locally (without Docker)
To run the API locally using Uvicorn, use the following command:

```sh
uvicorn protollm_api.backend.main:app --host 127.0.0.1 --port 8000 --reload
```

Or use this main file:
```python
import uvicorn
from fastapi import FastAPI

# Config and get_router are provided by the protollm_api package
app = FastAPI()

config = Config.read_from_env()

app.include_router(get_router(config))

if __name__ == "__main__":
    uvicorn.run("protollm_api.backend.main:app", host="127.0.0.1", port=8000, reload=True)
```
### Example Request
#### Generate
26 changes: 9 additions & 17 deletions protollm_tools/llm-api/docker-compose.yml
@@ -8,16 +8,9 @@ services:
      context: .
      dockerfile: Dockerfile
    ports:
      - ${API_PORT}:6672
    env_file:
      - .env
    volumes:
      - ./unit_config.json:/docker-entrypoint.d/unit_config.json
    networks:
@@ -26,11 +19,10 @@
  rabbitmq:
    image: "rabbitmq:3-management"
    ports:
      - ${RABBIT_MQ_PORT}:5672
      - ${WEB_RABBIT_MQ}:15672
    env_file:
      - .env
    volumes:
      - rabbitmq_data:/var/lib/rabbitmq
    networks:
@@ -39,7 +31,7 @@
  redis:
    image: "redis:alpine"
    ports:
      - ${REDIS_PORT}:6379
    volumes:
      - redis_data:/var/lib/data
    networks:
@@ -52,4 +44,4 @@

volumes:
  rabbitmq_data:
  redis_data:
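Docker Compose resolves the `${VAR}` references in the port mappings above from the project's `.env` file, which is how the README's variables and the compose file stay in sync. As a rough stdlib-only model of that substitution (real Compose also supports defaults such as `${VAR:-fallback}`, which this sketch omits):

```python
import string

# Values as they would come from the example .env file
env = {"API_PORT": "6672", "RABBIT_MQ_PORT": "5672", "WEB_RABBIT_MQ": "15672"}

snippet = "ports:\n  - ${API_PORT}:6672\n  - ${WEB_RABBIT_MQ}:15672"

# string.Template uses the same ${NAME} syntax as Compose's basic interpolation
resolved = string.Template(snippet).substitute(env)
print(resolved)
```

Running `docker compose config` against the real files performs the same resolution and is a quick way to verify the final port mappings before starting the stack.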