First, install Poetry: it will create a dedicated virtual environment to sandbox the project's Python environment and manage dependency installation, among other things.
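If you do not have Poetry yet, one common way to install it is via its official installer (see the Poetry documentation for alternatives):

$ curl -sSL https://install.python-poetry.org | python3 -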
Start all dependent services using Docker Compose (this will start PostgreSQL, Elasticsearch 6, RabbitMQ and Redis):
$ docker compose up -d
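You can verify that all four services came up correctly, for example with:

$ docker compose ps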
Note
Make sure you have enough virtual memory for Elasticsearch in Docker:
# Linux
$ sysctl -w vm.max_map_count=262144
# macOS
$ screen ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/tty
# press <enter> to get a shell prompt inside the Docker VM, then run:
linut00001:~# sysctl -w vm.max_map_count=262144
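On Linux this setting does not survive a reboot. To make it persistent, you can for example write it to a sysctl.d file (the file name below is a conventional choice, not mandated by this project):

$ echo "vm.max_map_count=262144" | sudo tee /etc/sysctl.d/99-elasticsearch.conf
$ sudo sysctl --system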
Next, bootstrap the instance (this will install all Python dependencies and build all static assets):
$ poetry run poe bootstrap
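If you want to inspect the virtual environment that Poetry created for the project, you can run:

$ poetry env info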
Next, create database tables, search indexes and message queues:
$ ./scripts/setup
Start the web server and the Celery worker:
$ ./scripts/server
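Assuming the development server listens on the default Invenio address (https://localhost:5000; adjust if your configuration differs), you can check that it responds:

$ curl -sk https://localhost:5000/ | head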
Start a Python shell:
$ ./scripts/console
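As a quick sanity check, assuming the console opens an Invenio/Flask application shell with the application context pushed (so the app object is available, as in a standard flask shell), you can try for example:

>>> app.name
>>> app.config['SQLALCHEMY_DATABASE_URI']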
To upgrade an existing instance, simply run:
$ ./scripts/update
Run the test suite via the provided script:
$ poetry run poe run_tests
By default, end-to-end tests are skipped. You can include the E2E tests like this:
$ env E2E=yes poetry run poe run_tests
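To run only part of the suite, you can usually invoke pytest directly (assuming the poe task wraps pytest, as is typical for Invenio projects); the keyword expression below is just a placeholder:

$ poetry run pytest tests -k "my_keyword"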
For more information about end-to-end testing, see pytest-invenio.
You can build the documentation with:
$ poetry run build_sphinx
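Assuming Sphinx writes its HTML output to the conventional docs/_build/html directory (check your Sphinx configuration if not), you can then open the result:

$ xdg-open docs/_build/html/index.html  # Linux
$ open docs/_build/html/index.html      # macOS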
You can simulate a full production environment using docker-compose.full.yml. You can start it like this:
$ docker build --rm -t rero-mef-base:latest -f Dockerfile.base .
$ docker compose -f docker-compose.full.yml up -d
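As before, you can check the state of the full stack with:

$ docker compose -f docker-compose.full.yml ps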
In addition to the services in the normal docker-compose.yml, this one will also start:
- HAProxy (load balancer)
- Nginx (web frontend)
- uWSGI (application container)
- Celery (background task worker)
- Flower (Celery monitoring)
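To inspect one of these services you can tail its logs; the service name must match its definition in docker-compose.full.yml (the name below is an assumption):

$ docker compose -f docker-compose.full.yml logs -f celery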