Overview

  1. Prerequisites
  2. Configure credentials
  3. Set up Postgres (optional)
  4. Set up virtual environment
  5. Installation for development
  6. Run the integration tests
  7. Creating a new integration test

Prerequisites

  • python3
  • Docker (optional)

Configure credentials

Edit the env file for your TARGET in integration_tests/.env/[TARGET].env.
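For example, a Postgres env file supplies the connection details that the test profile reads. The variable names and values below are an illustrative sketch, not the repo's actual file; use whatever names already appear in integration_tests/.env/postgres.env:

# Sketch of integration_tests/.env/postgres.env -- variable names are assumptions
POSTGRES_TEST_HOST=localhost
POSTGRES_TEST_PORT=5432
POSTGRES_TEST_USER=root
POSTGRES_TEST_PASS=password
POSTGRES_TEST_DBNAME=codegen_test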

Load the environment variables:

set -a; source integration_tests/.env/[TARGET].env; set +a

or, more specifically:

set -a; source integration_tests/.env/postgres.env; set +a
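To confirm the variables were loaded into your shell, you can print them back out (the POSTGRES_TEST prefix matches the sketch above and may differ in your env file):

env | grep POSTGRES_TEST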

Set up Postgres (optional)

Docker and docker-compose are both used in testing. Installation instructions for your OS are available in the Docker documentation.

Postgres offers the easiest way to test most dbt-codegen functionality today. Its tests are the fastest to run, and the easiest to set up. To run the Postgres integration tests, you'll have to do one extra step of setting up the test database:

make setup-db

or, alternatively:

docker-compose up --detach postgres
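These commands start a local Postgres container from the repo's docker-compose.yml. A minimal sketch of what such a service definition might look like (the image tag, credentials, and port below are assumptions, not the repo's actual configuration):

# Sketch of a docker-compose.yml postgres service -- values are assumptions
version: "3"
services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: root
      POSTGRES_PASSWORD: password
      POSTGRES_DB: codegen_test
    ports:
      - "5432:5432"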

Set up virtual environment

We strongly recommend using a virtual environment when developing code in dbt-codegen, and creating it in the root of the dbt-codegen repository. To create a new virtualenv, run:

python3 -m venv env
source env/bin/activate

This will create and activate a new Python virtual environment.

Installation for development

First, make sure you have set up your virtual environment as described above. Also ensure you have the latest versions of pip and setuptools installed:

python -m pip install --upgrade pip setuptools

Next, install dbt-core and the adapter for your target (along with their dependencies) with:

make dev target=[postgres|redshift|...]
# or
pip install --pre dbt-core dbt-[postgres|redshift|...]

or, more specifically:

make dev target=postgres
# or
pip install --pre dbt-core dbt-postgres
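To confirm the installation, check that dbt runs and lists the adapter plugin you installed:

dbt --version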

Run the integration tests

To run all the integration tests on your local machine, the same way they run in CI (CircleCI):

make test target=[postgres|redshift|...]
# or
./run_test.sh [postgres|redshift|...]

or, more specifically:

make test target=postgres
# or
./run_test.sh postgres

Where possible, targets run in Docker containers (this works for Postgres today, and could work for Spark in the future). For managed services such as Snowflake, BigQuery, and Redshift this is not possible, so you must provide your own credentials for these services in the appropriate env file in integration_tests/.env/[TARGET].env.

Creating a new integration test

Set up profiles

Do one of the following (a sketch of the resulting profile follows the list):

  1. With dbt Core 1.3 and above, use the profiles.yml in the current working directory:
    cp integration_tests/ci/sample.profiles.yml integration_tests/profiles.yml
  2. Use DBT_PROFILES_DIR
    cp integration_tests/ci/sample.profiles.yml integration_tests/profiles.yml
    export DBT_PROFILES_DIR=$(cd integration_tests && pwd)
  3. Use ~/.dbt/profiles.yml
    • Copy contents from integration_tests/ci/sample.profiles.yml into ~/.dbt/profiles.yml.
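Whichever option you choose, the profile maps the environment variables you loaded earlier onto a dbt connection. The sketch below assumes a Postgres target; the profile name, schema, and env var names are illustrative, and the authoritative version is integration_tests/ci/sample.profiles.yml:

# Sketch of a Postgres profile -- names and env vars are assumptions
integration_tests:
  target: postgres
  outputs:
    postgres:
      type: postgres
      host: "{{ env_var('POSTGRES_TEST_HOST') }}"
      port: "{{ env_var('POSTGRES_TEST_PORT') | as_number }}"
      user: "{{ env_var('POSTGRES_TEST_USER') }}"
      password: "{{ env_var('POSTGRES_TEST_PASS') }}"
      dbname: "{{ env_var('POSTGRES_TEST_DBNAME') }}"
      schema: dbt_codegen_integration_tests
      threads: 4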

Add your integration test

This directory contains an example dbt project which tests the macros in the dbt-codegen package. An integration test typically involves making the following (a sketch follows the example links below):

  1. a new seed file
  2. a new model file
  3. a generic test to assert anticipated behaviour.

For an example of integration tests, check out the tests for the get_url_parameter macro in the dbt-utils project:

  1. Macro definition
  2. Seed file with fake data
  3. Model to test the macro
  4. A generic test to assert the macro works as expected
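As a minimal sketch of that same pattern applied to this project (every file, column, and macro name below is hypothetical, and the equality test assumes dbt-utils is available as a test dependency):

integration_tests/seeds/data_my_macro.csv -- input data:

id,some_column
1,foo
2,bar

integration_tests/seeds/data_my_macro_expected.csv -- expected output:

id,my_macro_result
1,FOO
2,BAR

integration_tests/models/test_my_macro.sql -- applies the hypothetical macro to the seed:

select
    id,
    {{ codegen.my_macro('some_column') }} as my_macro_result
from {{ ref('data_my_macro') }}

integration_tests/models/schema.yml -- generic test asserting the expected behaviour:

version: 2
models:
  - name: test_my_macro
    tests:
      - dbt_utils.equality:
          compare_model: ref('data_my_macro_expected')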

Once you've added all of these files, you should be able to run the following from the integration_tests folder:

dbt deps --target {your_target}
dbt seed --target {your_target}
dbt run --target {your_target} --model {your_model_name}
dbt test --target {your_target} --model {your_model_name}

Alternatively:

dbt deps --target {your_target}
dbt build --target {your_target} --select +{your_model_name}

If the tests all pass, then you're good to go! All tests will be run automatically when you create a PR against this repo.