Iterative Development Workflow
!!! This documentation is outdated. New version here.
One of the reasons developers avoid putting any logic in the database is that it's a pain to move that logic from a file (stored in git) to the environment where it is executed (the database). In many ways it's similar to the workflow you have when developing in a compiled language like C/Java/Go, where there is an additional step, compiling, before you can see the code in action. With databases it's a bit worse, because moving that code is not as easy as the one-line command that compiles a C program.
This is the reason we developed subzero-cli. The goal is to make the development workflow for code that lives in the database (tables/views/stored procedures/...) as close as possible to the workflow of a Ruby/PHP/Node developer: just save a (sql) file and have that logic immediately running in the database, ready for you to execute. While moving SQL code from files to the database is a core feature, subzero-cli does a lot more: it does the same thing for nginx configs and Lua code (running in nginx), gives you a nice interface to look at the logs of each component of the stack and see the result of a REST call to the api, and also creates and manages your database migrations.
Check out how the whole thing looks (although the gif doesn't show the process of saving the file).
Follow the install instructions. If you won't be using the migrations
subcommand for now (it's only needed when you start moving code to production), you can skip the dependencies step, just run npm install -g subzero-cli
and you are done.
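Assuming Node.js and npm are already installed, the quick install looks like this; running the CLI with `--help` is the usual way to sanity-check that a Node command-line tool landed on your PATH (the exact output depends on the subzero-cli version):

```shell
# Install the CLI globally (requires Node.js/npm)
npm install -g subzero-cli

# Sanity check: print usage and the available subcommands
# (dashboard, migrations, ...)
subzero --help
```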
In the root of your project (which is a clean version of the starter kit) execute this
docker-compose up -d db postgrest openresty
subzero dashboard
Notice how we chose to bring up only a subset of the stack components, since we won't be using RabbitMQ for now. In another terminal window, run the following command:
curl http://localhost:8080/rest/todos?select=id,todo
# result
[{"id":1,"todo":"item_1"},{"id":3,"todo":"item_3"},{"id":6,"todo":"item_6"}]
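PostgREST also lets you filter rows through the query string. As a quick variation on the call above, its standard `eq` operator fetches a single row (this assumes the same running stack and sample data; the response shape follows the result shown above):

```shell
# Fetch one todo by primary key using PostgREST's eq filter,
# still selecting only the id and todo columns
curl 'http://localhost:8080/rest/todos?select=id,todo&id=eq.1'
```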
Notice how each dashboard window/tab (OpenResty/PostgREST/PostgreSQL) shows a log line for the event it registered. Probably the most interesting window for now is PostgreSQL, where you can see the actual queries that were executed for that REST call, which in this case look something like this (with a bit of formatting compared to the real thing):
LOG: execute 0: BEGIN ISOLATION LEVEL READ COMMITTED READ ONLY
LOG: statement: set local role 'anonymous';set local "request.jwt.claim.role" = 'anonymous';set local "request.header.host" = 'postgrest';set local "request.header.user-agent" = 'curl/7.43.0';set local "request.header.accept" = '*/*';
LOG: execute <unnamed>:
WITH pg_source AS (
SELECT "api"."todos"."id", "api"."todos"."todo" FROM "api"."todos"
)
SELECT
null AS total_result_set,
pg_catalog.count(_postgrest_t) AS page_total,
array[]::text[] AS header,
coalesce(array_to_json(array_agg(row_to_json(_postgrest_t))), '[]')::character varying AS body
FROM ( SELECT * FROM pg_source) _postgrest_t
LOG: execute 1: COMMIT
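If you want to poke at the database directly, you can run the inner query from the log by hand inside the db container (the service name db comes from the docker-compose command above; the user and database names here are hypothetical and may differ in your starter kit):

```shell
# Open psql inside the "db" service and run the query the API executed.
# The -U user and database name are assumptions; check your docker-compose.yml.
docker-compose exec db psql -U superuser app \
  -c 'select id, todo from api.todos;'
```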
Now we'll change the todos view.
Change this line
#before
select id, todo, private, (owner_id = request.user_id()) as mine from data.todo;
#after
select ('#' || id::text) as id, ('do this: ' || todo) as todo, private, (owner_id = request.user_id()) as mine from data.todo;
Save the file, then run the previous curl command
curl http://localhost:8080/rest/todos?select=id,todo
#result
[{"id":"#1","todo":"do this: item_1"},{"id":"#3","todo":"do this: item_3"},{"id":"#6","todo":"do this: item_6"}]
And that's the gist of it. You save your code, be it SQL/Lua/Nginx config, and it's immediately running live in your dev stack.