Commit

initial github ci jobs
rymarczy committed Jan 14, 2025
1 parent 5ba05f2 commit e981bca
Showing 7 changed files with 147 additions and 19 deletions.
15 changes: 10 additions & 5 deletions .envrc
@@ -8,9 +8,11 @@ layout python3
# this function allows project dependencies to be managed by pip-tools
# a 'pyproject.toml' file is required in the root of the project
function use_pip-tools() {
local python_version
python_version=$(python3 -c "import platform; print(platform.python_version())")
# if VIRTUAL_ENV is not set to direnv default, delete direnv default dir as clean-up
if [[ "${VIRTUAL_ENV}" != *"$(direnv_layout_dir)/python-$python_version"* ]]; then
echo "custom VIRTUAL_ENV set, deleting $(direnv_layout_dir)"
if [[ "${VIRTUAL_ENV}" != *"$(direnv_layout_dir)/python-$python_version"* && -d $(direnv_layout_dir) ]]; then
echo "[use pip-tools] custom VIRTUAL_ENV set ($VIRTUAL_ENV), deleting ($(direnv_layout_dir))"
rm -rf $(direnv_layout_dir)
fi

@@ -31,20 +33,23 @@ function use_pip-tools()
fi

if [ ! -f "pyproject.toml" ]; then
echo "[use pip-tools] ERROR: No 'pyproject.toml' file exists, please create..."
echo "[use pip-tools] ERROR: No 'pyproject.toml' file exists, please create one!"
return 1
fi

echo "[use pip-tools] resyncing (dev)requirements"
echo "[use pip-tools] syncing Python (dev)requirements..."
# create requirements files
pip-compile --quiet --strip-extras -o requirements.txt
pip-compile --quiet --strip-extras --extra=dev -o requirements-dev.txt
# install all pinned requirements to virtual environment
pip-sync requirements.txt requirements-dev.txt
# install local project from /src as editable, so code changes are reflected without re-install
echo "[use pip-tools] installing project to Python virtual environment ($VIRTUAL_ENV)..."
pip install --editable .

watch_file pyproject.toml
echo "[use pip-tools] Python virtual environment sucessfully created, updated and activated!"
echo "[use pip-tools] Running 'python --version' should show 'Python $python_version'"
}

use pip-tools
use pip-tools
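
For readers without `direnv`, the function above is roughly equivalent to running the following commands by hand (a sketch; it assumes an activated virtual environment that already has `pip-tools` installed):

```bash
# Roughly what `use pip-tools` runs on each `cd` into the project:
pip-compile --quiet --strip-extras -o requirements.txt                  # pin runtime deps
pip-compile --quiet --strip-extras --extra=dev -o requirements-dev.txt  # pin dev deps
pip-sync requirements.txt requirements-dev.txt                          # install exactly the pinned set
pip install --editable .                                                # install the project from /src as editable
```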
30 changes: 30 additions & 0 deletions .github/actions/python_deps/action.yaml
@@ -0,0 +1,30 @@
name: Setup Python Dependencies
description: Loads Python dependencies for a CI/CD job, installing them if not cached

runs:
  using: composite
  steps:
    - name: ASDF Tools Install
      uses: ./.github/actions/tools

    - name: Python Deps Cache
      uses: actions/cache@v3
      id: python-cache
      with:
        path: ./.venv
        key: ${{ runner.os }}-python-deps-${{ hashFiles('**/requirements*.txt') }}

    - run: python3 -m venv .venv
      if: "!steps.python-cache.outputs.cache-hit"
      shell: bash

    - run: |
        source .venv/bin/activate
        echo "VIRTUAL_ENV=${VIRTUAL_ENV}" >> $GITHUB_ENV
        echo "${VIRTUAL_ENV}/bin" >> $GITHUB_PATH
      shell: bash
    - run: pip install -r requirements.txt -r requirements-dev.txt
      if: "!steps.python-cache.outputs.cache-hit"
      shell: bash

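For context, a workflow job consumes this composite action with a plain `uses:` step after checkout, exactly as the CI workflow later in this commit does. A minimal sketch (the final `pytest` step is hypothetical, standing in for any tool installed from the requirements files):

```yaml
jobs:
  example:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v3
      - uses: ./.github/actions/python_deps  # restores or builds ./.venv and puts it on the PATH
      - run: pytest                          # hypothetical step; any requirements-dev tool works here
```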
24 changes: 24 additions & 0 deletions .github/actions/tools/action.yaml
@@ -0,0 +1,24 @@
name: Setup ASDF Tools
description: Loads ASDF tools for a CI/CD job, installing them if not cached
outputs:
  cache-hit:
    description: "Whether the ASDF cache was hit"
    value: ${{ steps.asdf-cache.outputs.cache-hit }}
runs:
  using: composite
  steps:
    # cache the ASDF directory, using values from .tool-versions
    - name: ASDF Tools Cache
      uses: actions/cache@v3
      id: asdf-cache
      with:
        path: ~/.asdf
        # runner.os vs CACHE_UUID secret
        key: ${{ runner.os }}-asdf-${{ hashFiles('**/.tool-versions') }}

    - name: Install ASDF Tools
      uses: asdf-vm/actions/install@v2
      if: steps.asdf-cache.outputs.cache-hit != 'true'

    - name: Re-shim ASDF Install
      uses: mbta/actions/reshim-asdf@v1
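
The cache key above hashes `.tool-versions`, the file asdf reads to pin tool versions; bumping any pinned version therefore invalidates the cached `~/.asdf` directory. A hypothetical example of that file (its real contents are not part of this commit):

```text
# .tool-versions (hypothetical contents)
python 3.12.1
```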
67 changes: 67 additions & 0 deletions .github/workflows/ci_python.yaml
@@ -0,0 +1,67 @@
name: Continuous Integration (Python)

on:
  push:
    branches:
      - main
    paths:
      - 'src/**'
      - 'pyproject.toml'
      - 'requirements-dev.txt'
      - '.github/workflows/ci_python.yaml'
      - '.github/actions/python_deps/action.yaml'
  pull_request:
    paths:
      - 'src/**'
      - 'pyproject.toml'
      - 'requirements-dev.txt'
      - '.github/workflows/ci_python.yaml'
      - '.github/actions/python_deps/action.yaml'

defaults:
  run:
    shell: bash
    working-directory: .

concurrency:
  group: python-ci-${{ github.ref }}
  cancel-in-progress: true

jobs:
  setup:
    name: Python Setup
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v3
      - uses: ./.github/actions/python_deps

  format:
    name: Format
    runs-on: ubuntu-22.04
    needs: setup
    steps:
      - uses: actions/checkout@v3
      - uses: ./.github/actions/python_deps

      - run: ruff format --diff .

  typing:
    name: Type Check
    runs-on: ubuntu-22.04
    needs: setup
    steps:
      - uses: actions/checkout@v3
      - uses: ./.github/actions/python_deps

      - run: mypy .

  lint:
    name: Lint
    runs-on: ubuntu-22.04
    needs: setup
    steps:
      - uses: actions/checkout@v3
      - uses: ./.github/actions/python_deps

      - run: ruff check --diff .

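The format, type-check, and lint jobs run the same commands a developer can run locally from the project root, inside the virtual environment that `direnv`/`pip-tools` manages:

```bash
ruff format --diff .   # formatting check (reports a diff instead of rewriting files)
mypy .                 # static type check
ruff check --diff .    # lint check
```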
2 changes: 2 additions & 0 deletions README.md
@@ -20,6 +20,8 @@ asdf install
`pip-tools` is used to manage Python dependencies.

> `direnv` will automatically create (and activate) a Python virtual environment in the project folder for local development. `pip-tools` is also automatically run (by `direnv`) to install project dependencies when moving into the project directory.
>
> `pip-tools` creates [requirements.txt](requirements.txt) and [requirements-dev.txt](requirements-dev.txt) files containing versions and hashes of all python library dependencies for the project.
`docker` is required to run a containerized version of the application for local development.

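The `--extra=dev` flag used in `.envrc` implies that `pyproject.toml` declares its dependencies in PEP 621 form with an optional `dev` group. A hypothetical sketch (package names are illustrative, not taken from this commit):

```toml
[project]
name = "odin"
dependencies = [
    "boto3",  # hypothetical runtime dependency
]

[project.optional-dependencies]
dev = [
    "ruff",   # formatter/linter run by CI
    "mypy",   # type checker run by CI
]
```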
4 changes: 3 additions & 1 deletion src/odin/run.py
@@ -3,7 +3,9 @@
from odin.utils.logger import ProcessLog
from odin.utils.runtime import validate_env_vars


def start():
"""Application Entry"""
os.environ["SERVICE_NAME"] = "odin"
validate_env_vars(
required=[],
@@ -13,4 +15,4 @@ def start():


if __name__ == "__main__":
start()
start()
24 changes: 11 additions & 13 deletions src/odin/utils/aws/s3.py
@@ -27,7 +27,7 @@ def get_client() -> BaseClient:

def split_object(object: str) -> Tuple[str, str]:
"""
Split s3 object as "s3://bucket/object_key" into Tuple[bucket, key].
Split S3 object as "s3://bucket/object_key" into Tuple[bucket, key].
:param object: s3 object as "s3://bucket/object_key" or "bucket/object_key"
@@ -39,28 +39,26 @@ def split_object(object: str) -> Tuple[str, str]:


def list_objects(
bucket: str,
prefix: str,
partition: str,
max_objects: int = 1_000_000,
in_filter: Optional[str] = None,
) -> List[str]:
"""
Get list of S3 objects in 'bucket' starting with 'prefix'
Get list of S3 objects starting with 'partition'.
:param bucket: the name of the bucket with objects
:param prefix: prefix for objs to return
:param partition: S3 partition as "s3://bucket/prefix" or "bucket/prefix"
:param max_objects: maximum number of objects to return
:param in_filter: will filter for objects containing string
:return: List[s3://bucket/key, ...]
"""
logger = ProcessLog(
"list_objects",
bucket=bucket,
prefix=prefix,
partition=partition,
max_objects=max_objects,
in_filter=in_filter,
)
bucket, prefix = split_object(partition)
try:
client = get_client()
paginator = client.get_paginator("list_objects_v2")
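
The signature change above means callers now pass a single partition string instead of separate bucket and prefix arguments, with `split_object` doing the split internally. A sketch of the before/after call (bucket and prefix names are hypothetical):

```python
from odin.utils.aws.s3 import list_objects

# before this commit (hypothetical caller):
#   objects = list_objects(bucket="my-bucket", prefix="incoming/", in_filter=".csv")

# after this commit:
objects = list_objects("s3://my-bucket/incoming/", in_filter=".csv")
```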
@@ -111,7 +109,7 @@ def object_exists(object: str) -> bool:

def upload_file(file_name: str, object: str, extra_args: Optional[Dict] = None) -> bool:
"""
Upload a local file to an S3 Bucket
Upload a local file to S3 as an object.
:param file_name: local file path to upload
:param object: S3 object path as 's3://bucket/object' or 'bucket/object'
@@ -173,9 +171,9 @@ def download_object(object: str, local_path: str) -> bool:
return False


def get_object(object: str) -> StreamingBody:
def stream_object(object: str) -> StreamingBody:
"""
Get an S3 object as StreamingBody
Stream an S3 object as StreamingBody.
:param object: S3 object path as 's3://bucket/object' or 'bucket/object'
@@ -196,7 +194,7 @@ def get_object(object: str) -> StreamingBody:

def delete_object(object: str) -> bool:
"""
Delete s3 object
Delete an S3 object.
:param object: S3 object to delete as 's3://bucket/object' or 'bucket/object'
@@ -217,7 +215,7 @@ def rename_object(from_object: str, to_object: str) -> bool:

def rename_object(from_object: str, to_object: str) -> bool:
"""
Rename from_object TO to_object as copy and delete operation.
Rename an S3 object as copy and delete operation.
:param from_object: COPY from as 's3://bucket/object' or 'bucket/object'
:param to_object: COPY to as 's3://bucket/object' or 'bucket/object'
