feat: Publish API #22
Conversation
> This guide will help you set up and interact with the database.
>
> ## Installing Diesel
In quickstart(s) we should try to push for ephemeral tooling, which in most cases means containers.
E.g. Postgres in a container; the pgAdmin UI as a way to view the DB; and a docker-compose.yml which would spin up the DB and pgAdmin, and run a startup script (or inline command) to perform the migrations.
This would give external dev(s) a great quick-start point to get the whole infra running locally.
Can be in future PR(s) btw.
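To make the suggestion concrete, a docker-compose.yml along these lines could do it. This is only a sketch: the service names, image tags, ports, and credentials are assumptions, not values from this repo, and the one-shot `migrations` service is hypothetical.

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: app
    ports:
      - "5432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 2s
      retries: 10

  pgadmin:
    image: dpage/pgadmin4
    environment:
      PGADMIN_DEFAULT_EMAIL: admin@example.com
      PGADMIN_DEFAULT_PASSWORD: admin
    ports:
      - "8080:80"
    depends_on:
      - db

  # Hypothetical one-shot container that runs the Diesel migrations
  # against the db service once it is healthy.
  migrations:
    image: rust:1
    working_dir: /app
    volumes:
      - .:/app
    environment:
      DATABASE_URL: postgres://postgres:postgres@db:5432/app
    command: >
      sh -c "cargo install diesel_cli --no-default-features --features postgres
      && diesel migration run"
    depends_on:
      db:
        condition: service_healthy
```

With something like this, `docker compose up` brings up the DB and pgAdmin and applies migrations, and `docker compose down` leaves nothing installed on the machine.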
I haven't really used pgAdmin; I have used psql, but I prefer DBeaver. I think it's up to the developer which DB client they want to use. I could mention more of them here or just omit this section. Why do you think pgAdmin is better / should be recommended?
The migrations are performed automatically by Diesel when the server starts.
> This would give external dev(s) a great quick start point to get the whole infra running locally.
There are two scripts that serve this purpose:
- `start_local_db.sh`: starts the DB in Docker
- `start_local_server.sh`: starts both the DB and the server in Docker, in a shared network

docker-compose would be better, but I don't have much experience writing it and found it more convenient to write/use these bash scripts. We should circle back to it: #24
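For reference, a minimal sketch of what a `start_local_db.sh`-style script could look like. The container name, image tag, and credentials here are assumptions; the actual script in the repo may differ.

```shell
#!/usr/bin/env sh
# Sketch of a start_local_db.sh-style helper. Container name, image tag,
# and credentials are illustrative assumptions, not the repo's values.
set -eu

start_local_db() {
  name="${1:-publish-api-db}"
  port="${2:-5432}"

  # Run Postgres ephemerally: --rm removes the container on stop,
  # so nothing is left installed on the machine.
  docker run --rm -d \
    --name "$name" \
    -e POSTGRES_PASSWORD="${POSTGRES_PASSWORD:-postgres}" \
    -p "$port":5432 \
    postgres:16
}

# Only launch the container when invoked with RUN=1, so the function
# can be sourced without Docker being touched.
if [ "${RUN:-0}" = "1" ]; then
  start_local_db "$@"
fi
```

Usage would then be `RUN=1 ./start_local_db.sh mydb 5433`, and `docker stop mydb` tears everything down.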
pgAdmin is good just because everything can be run via docker-compose (it's all ephemeral).
Agree about the DB client being a developer opinion/option; I just think we should provide something ephemeral (not required on the machine), even better if it can just be provided by Docker (whatever it is; pgAdmin comes to mind first for me, at least).
Nice, makes sense; yep, an issue is a good starting point. Someone else (or even I) can look into it 😄
Great work! 🚀
Suggested a few improvements; core functionality is there though!
Great work on this.
Related #9
This PR exposes a `/publish` endpoint that accepts all the metadata about a package and stores it in the database. There is also a bit of refactoring and code cleanup.
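For illustration, a request to the `/publish` endpoint might carry a body along these lines. All field names here are assumptions for the sake of example, not the endpoint's actual schema.

```json
{
  "name": "example-package",
  "version": "0.1.0",
  "description": "A short description of the package",
  "license": "MIT",
  "repository": "https://github.com/example/example-package"
}
```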
This PR also:
- Uses `.env.local`, which @zees-dev had previously suggested
- Closes #21 ("Replace `SystemTime` with something timezone-aware")

There is still more to do on the publishing front: