feat: Created Playground Containerfile and Image Workflow #1256

Open · wants to merge 3 commits into main
91 changes: 91 additions & 0 deletions .github/workflows/build_playground.yml
@@ -0,0 +1,91 @@
name: "Build and Push Playground Image"

env:
IMAGE_NAME: "jland/llama-stack-playground"
IMAGE_TAG_LATEST: "latest"
REGISTRY_URL: "quay.io" # Replace with your registry URL
REGISTRY_USERNAME: ${{ secrets.REGISTRY_USERNAME }}
REGISTRY_PASSWORD: ${{ secrets.REGISTRY_PASSWORD }}

on:
pull_request:
paths:
- 'llama_stack/distribution/ui/**'
push:
paths:
- 'llama_stack/distribution/ui/**'
tags:
- '*'

jobs:
########################################
#### Validate Code and Image Builds ####
########################################
validate-code:
name: Validate Playground Code
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: '3.9'

- name: "Validate Code"
id: validate_code
run: |
cd llama_stack/distribution/ui
pip install -r requirements.txt
nohup streamlit run app.py &
sleep 10
curl -f http://localhost:8501 || exit 1

########################################
#### Validate Code and Image Builds ####
########################################
validate-image:
name: Validate Playground Image Build
runs-on: ubuntu-latest
if: github.event_name == 'pull_request'
steps:
- name: Checkout code
uses: actions/checkout@v4

# Build and Push image to registry with tag matching the Git Tag and latest
- name: Build and push
uses: docker/build-push-action@v6
with:
context: ./llama_stack/distribution/ui/
file: ./llama_stack/distribution/ui/Containerfile
push: false
tags: ${{ env.REGISTRY_URL }}/${{ env.IMAGE_NAME }}:local
ulimit: nofile=4096


validate-and-deploy-image:
name: Validate and Deploy Playground Image
runs-on: ubuntu-latest
needs: validate-code
if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags/')
steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Login to Image Registry
uses: docker/login-action@v3
with:
username: ${{ secrets.REGISTRY_USERNAME }}
password: ${{ secrets.REGISTRY_PASSWORD }}
registry: ${{ env.REGISTRY_URL }}

# Build and Push image to registry with tag matching the Git Tag and latest
- name: Build and push
uses: docker/build-push-action@v6
with:
context: ./llama_stack/distribution/ui/
file: ./llama_stack/distribution/ui/Containerfile
push: true
tags: ${{ env.REGISTRY_URL }}/${{ env.IMAGE_NAME }}:${{ github.ref_name }}, ${{ env.REGISTRY_URL }}/${{ env.IMAGE_NAME }}:${{ env.IMAGE_TAG_LATEST }}
ulimit: nofile=4096
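The validate-code job is a plain smoke test: it installs the UI requirements, starts Streamlit in the background, waits, and curls the default port. A minimal local equivalent, assuming Python 3.9 and curl are on PATH (a reviewer convenience, not part of this PR), might look like:

```bash
# Rough local equivalent of the "Validate Code" step (assumes a clean virtualenv)
cd llama_stack/distribution/ui
pip install -r requirements.txt
nohup streamlit run app.py &      # start the playground in the background
sleep 10                          # give Streamlit time to bind to port 8501
curl -f http://localhost:8501     # non-zero exit if the UI is not serving
```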
8 changes: 8 additions & 0 deletions llama_stack/distribution/ui/Containerfile
@@ -0,0 +1,8 @@
FROM python:3.9-slim
WORKDIR /app
COPY . /app/
RUN /usr/local/bin/python -m pip install --upgrade pip && \
    /usr/local/bin/pip3 install -r requirements.txt
EXPOSE 8501

ENTRYPOINT ["streamlit", "run", "app.py", "--server.port=8501", "--server.address=0.0.0.0"]
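To exercise the new Containerfile locally, a build-and-run sketch along these lines should work (the image tag, the endpoint value, and the use of docker rather than podman are illustrative assumptions, not part of this PR):

```bash
# Build the playground image from the repo root and run it on port 8501
docker build -t llama-stack-playground:local \
  -f llama_stack/distribution/ui/Containerfile llama_stack/distribution/ui/
docker run --rm -p 8501:8501 \
  -e LLAMA_STACK_ENDPOINT=http://host.docker.internal:8321 \
  llama-stack-playground:local
```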
10 changes: 10 additions & 0 deletions llama_stack/distribution/ui/README.md
@@ -40,3 +40,13 @@ cd llama_stack/distribution/ui
pip install -r requirements.txt
streamlit run app.py
```

## Environment Variables

| Environment Variable | Description | Default Value |
|----------------------------|------------------------------------|---------------------------|
| LLAMA_STACK_ENDPOINT | The endpoint for the Llama Stack | http://localhost:8321 |
| FIREWORKS_API_KEY | API key for Fireworks provider | (empty string) |
| TOGETHER_API_KEY | API key for Together provider | (empty string) |
| SAMBANOVA_API_KEY | API key for SambaNova provider | (empty string) |
| OPENAI_API_KEY | API key for OpenAI provider | (empty string) |
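
For a local (non-container) run, these variables can simply be exported before launching Streamlit; a sketch with placeholder values (only the endpoint default comes from the table above):

```bash
# Illustrative only: point the playground at a local Llama Stack server
export LLAMA_STACK_ENDPOINT=http://localhost:8321
export TOGETHER_API_KEY="<your-together-key>"   # set only the provider keys you use
streamlit run app.py
```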