Stop being API and website and just be dataset #45

Open
rufuspollock opened this issue Apr 18, 2015 · 20 comments

Comments

@rufuspollock
Member

This repo currently provides:

  • the licenses dataset itself (machine-readable license data)
  • an API and website built on that data (licenses.opendefinition.org)

This proposal is about dropping the second of these (and probably merging it into the main opendefinition.org site). This is the companion issue to okfn/opendefinition#7 on the opendefinition repo; we propose centralizing discussion there.

@jonschlinkert

I'm interested in helping with some of this. If the website and API are moved, it seems like it would make sense to split them into separate initiatives, since the website can be an API consumer like any other project. (I'm still catching up on the linked issues, so forgive me if I'm restating something that's already been discussed or isn't wanted.)

While I'm navigating through the issues, what are your thoughts about which things need to be done first?

@mlinksva
Contributor

Alright with me if this repo becomes dataset-only.

Assuming some people are using the API (which is really just static JSON), what do we do about them? Apparently they exist: #37
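
For context, "using the API" today really just means fetching static JSON files. A minimal sketch of what such a consumer might look like (the host and path below are assumptions about the current deployment, not verified endpoints):

```js
// Minimal sketch of an API consumer using only Node's built-in https module.
// The URL is an assumed example, not a verified endpoint.
const https = require('https');

https.get('https://licenses.opendefinition.org/licenses/groups/all.json', (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => {
    const licenses = JSON.parse(body);
    console.log(Object.keys(licenses).length, 'licenses available');
  });
}).on('error', (err) => console.error(err));
```

Whatever we decide, URLs like that need to keep resolving (or redirecting) for existing consumers.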

@rufuspollock
Member Author

For people already using the API, we will aim to provide redirects and backwards compatibility even if the API moves to the opendefinition.org site.

@jonschlinkert

we will aim to provide redirects and backwards compatibility

Makes sense. I'm going to schedule some time to get familiar with the project(s) later this week.

In the meantime, just to make sure we're thinking the same thing, how does the following sound to you:

  • simplify/refactor okfn/licenses repo:
    • dataset-only (including the necessary scripts for generating the dataset)
    • refactor the build/deployment scripts to JavaScript/Node.js (see the sketch after this list)
    • jettison the website code
    • proper unit tests
  • move the API code to a separate project. Since this is a pretty simple script, IMHO it should be maintained separately from the website code to lower the barriers to contributing to each repo and keep issues more focused, e.g. devs who want to patch the API code won't need to worry about touching HTML/CSS, and vice versa
  • update the website, which would be almost entirely design-oriented once the API code is moved out.
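
For the dataset-generation piece, the build step could be as small as a plain Node.js script along these lines (just a sketch — the `licenses/*.json` layout, the `id` field, and the `build/` output directory are placeholders, not the repo's actual structure):

```js
// Sketch: combine per-license JSON files into a single all.json.
// Directory names and the `id` field are placeholder assumptions.
const fs = require('fs');
const path = require('path');

const srcDir = path.join(__dirname, 'licenses');
const outDir = path.join(__dirname, 'build');

const all = {};
for (const file of fs.readdirSync(srcDir)) {
  if (path.extname(file) !== '.json') continue;
  const license = JSON.parse(fs.readFileSync(path.join(srcDir, file), 'utf8'));
  all[license.id || path.basename(file, '.json')] = license;
}

fs.mkdirSync(outDir, { recursive: true });
fs.writeFileSync(path.join(outDir, 'all.json'), JSON.stringify(all, null, 2) + '\n');
console.log(`Wrote ${Object.keys(all).length} licenses to build/all.json`);
```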

All of this should be done on branches in the respective repos so that once the work is complete we can deploy everything simultaneously, with as little impact on end-users as possible. This all seems pretty easy, but I might be missing some important details.

Sound close/reasonable?

@rufuspollock
Member Author

We won't need unit tests if the repo is data-only. As for the API and website, I think we either want to move them to the opendefinition repo or keep them here, perhaps just on the gh-pages branch.

Generally sounds very good. Can you also sync this up with #44?

@rufuspollock
Member Author

@jonschlinkert let's move forward here as per your suggestions. My sense is that we are going to strip this repo down to just the data and the data package - we might include a script for generating JSON versions of the data, but I'm not sure ... We then have two options:

  • we fork this repo to e.g. "licenses-api"; this could still run the licenses.opendefinition.org site
  • or we merge it into the opendefinition repo and site http://opendefinition.org/ (see the description of this issue above)

@rufuspollock
Member Author

@jonschlinkert do you have capacity to contribute here? It would be great to move this forward and complete it :-)

@anseljh

anseljh commented Nov 30, 2015

I've been following along for a while, and may be able to help out with the dataset-only repo. I'm a data nerd and open source lawyer, know Python, and have been dabbling in NodeJS/npm recently.

Seems like Gulp might be a good fit for retooling the build scripts, although I admit I haven't 100% wrapped my brain around it yet. See, for example, the json-combine plugin.
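
As a starting point, the gulpfile for that could be tiny. A rough sketch (I haven't verified json-combine's actual interface, so the require name and the call below are assumptions standing in for "merge many JSON files into one"):

```js
// gulpfile.js - rough sketch only; the json-combine plugin's real API may differ.
var gulp = require('gulp');
var combine = require('json-combine'); // assumed plugin interface

gulp.task('dataset', function () {
  return gulp.src('licenses/*.json')
    .pipe(combine('all.json')) // assumed: emit a single combined file
    .pipe(gulp.dest('build'));
});
```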

@jonschlinkert

@rgrp sorry, I've been focused on raising funding quite a bit recently. Perhaps @anseljh and I can collaborate in some capacity.

@anseljh if you want to take the ball and run with it, I'd be happy to help review or contribute to whatever you come up with. I have experience with Node.js and Gulp, so I can at least try to answer any questions you might have.

@anseljh

anseljh commented Nov 30, 2015

@jonschlinkert Sounds good. I will start tinkering and keep you all posted. Shall we break this into a handful of more discrete issues?

@rufuspollock
Member Author

@anseljh sounds good. Do you want to take an initial stab based on your understanding so far and then we refine?

anseljh added a commit to anseljh/licenses that referenced this issue Dec 2, 2015
@rufuspollock
Member Author

@anseljh how's it going? Want to get clear on the plan?

@akuckartz

Let the data be the API.

@anseljh

anseljh commented Dec 10, 2015

Hello! @rgrp I did get started, but haven't had time to continue yet. Hopefully over the weekend I will have some time.

If you want to take a peek, my work in progress is in the dataset-only branch on my fork: https://github.com/anseljh/licenses/tree/dataset-only

@jonschlinkert

@anseljh that looks like a great start! @doowb and I are the maintainers of gulp-convert, if you have any questions about the conversions.

@rufuspollock
Member Author

@anseljh great - would it be worth quickly listing out your "plan of attack" and planned endpoint here in a comment? That will make sure we are all on the same page :-)

@mlinksva
Contributor

@Stephen-Gates
Contributor

Given that there is now an up-to-date dataset via #57, what should the next step be? I was thinking that changes are maintained via the CSV, with the JSON generated whenever the CSV changes. Thoughts?
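
To make that concrete, a small Node.js script could regenerate the JSON whenever the CSV changes, e.g. as a CI step or pre-commit hook. A rough sketch (the `licenses.csv` filename, the `id` column, and the output layout are assumptions; csv-parse is just one possible parser):

```js
// Sketch: CSV is canonical, JSON is generated from it.
// Filenames and the `id` column are assumptions about the dataset layout.
const fs = require('fs');
const { parse } = require('csv-parse/sync'); // npm install csv-parse

const rows = parse(fs.readFileSync('licenses.csv', 'utf8'), { columns: true });

fs.mkdirSync('licenses', { recursive: true });
const byId = {};
for (const row of rows) {
  byId[row.id] = row;
  fs.writeFileSync(`licenses/${row.id}.json`, JSON.stringify(row, null, 2) + '\n');
}
fs.writeFileSync('licenses.json', JSON.stringify(byId, null, 2) + '\n');
```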

@rufuspollock
Member Author

@Stephen-Gates totally agree that the CSV should be central and the JSON generated from it. I think we should push this to datahub.io (as that will autogenerate the JSON), though we may also want to make sure we don't break any links here ...

@Stephen-Gates
Contributor

Here's my suggestion to progress this issue
