Update README.md: Ensembles #59

Open · wants to merge 1 commit into base: master

Conversation

@Zacmce (Collaborator) commented Feb 5, 2025

Updated to reflect the newly added ensemble capability, wherein the output of the LSTM module is the ensemble mean forecast across multiple trained LSTMs (a minimal sketch of this behavior follows the description below). I made the relevant changes through the "Trained LSTM Model" section; I have not (yet) changed the sections from Dependencies through Unit Test, though I expect much of them will remain unchanged. I had a few questions:

I think we are using AORC forcings, but I do not see them stored in the "Data" folder, despite the data section of the readme saying the necessary data lives there. I am not sure where the AORC data lives (in Jonathan's more up-to-date fork of LSTM, it appears to come from another repo entirely?). Please advise on where I can point a user to the relevant forcings data.

Once the final PR to merge jmframe/lstm onto owp/lstm is completed, I will want to do another update of the readme to reflect the latest links, so I can point the reader to e.g. the most recent notebooks and files that do not currently exist on owp/lstm because the merge has not yet been completed.
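
For context on the ensemble behavior described above, here is a minimal illustrative sketch of an ensemble mean forecast. It is not the module's actual API: the `ensemble_mean_forecast` function and the per-member `predict` method are hypothetical stand-ins for however the trained LSTMs are actually invoked in the repo.

```python
# Illustrative sketch only; the real LSTM module's interface may differ.
import numpy as np

def ensemble_mean_forecast(models, forcings):
    """Run each trained LSTM on the same forcings and average the results.

    `models`: iterable of trained LSTM objects, each assumed (hypothetically)
    to expose `predict(forcings) -> np.ndarray` of shape (n_timesteps,).
    Returns the ensemble mean forecast, shape (n_timesteps,).
    """
    member_forecasts = np.stack([m.predict(forcings) for m in models])
    return member_forecasts.mean(axis=0)  # mean across ensemble members
```

The key point the PR documents is that the module's output is this cross-member mean, rather than the forecast of any single trained LSTM.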

Checklist

  • PR has an informative and human-readable title
  • Changes are limited to a single goal (no scope creep)
  • Code can be automatically merged (no conflicts)
  • Code follows project standards (link if applicable)
  • Passes all existing automated tests
  • Any change in functionality is tested
  • New functions are documented (with a description, list of inputs, and expected output)
  • Placeholder code is flagged / future todos are captured in comments
  • Visually tested in supported browsers and devices (see checklist below 👇)
  • Project documentation has been updated (including the "Unreleased" section of the CHANGELOG)
  • Reviewers requested with the Reviewers tool ➡️

Testing checklist

Target Environment support

  • Windows
  • Linux
  • Browser

Accessibility

  • Keyboard friendly
  • Screen reader friendly

Other

  • Is useable without CSS
  • Is useable without JS
  • Flexible from small to large screens
  • No linting errors or warnings
  • JavaScript tests are passing

@Zacmce (Collaborator, Author) commented Feb 5, 2025

@aaraney

@jmframe (Contributor) commented Feb 6, 2025

Hi @Zacmce, the PR I made does include an updated readme with changes for the examples, including instructions on accessing the example AORC data from this link. Below is a link to my updated readme, and below that is a screenshot of the changed lines. Note, though, that @aaraney also has a PR to do the ensembling in the different manner we talked about last week.

https://github.com/jmframe/lstm/blob/c3a93c8368b9adcb695dd07e62484436aee4cd9a/README.md?plain=1#L17

[Screenshot: the changed README lines, 2025-02-06]

@Zacmce (Collaborator, Author) commented Feb 6, 2025

@jmframe ah, I see. Thanks for bringing this to my attention! Austin asked if I could do a readme update, but perhaps he was referring to his PR, not yours. The documentation I added outlines the implementation we did the other day, which is reflected in your PR.

Don't a few other sections of the readme need to be updated, though, to reflect the ensemble capability and the overall current implementation? For example, the Trained LSTM Models section, where we no longer have _all_attributes etc.

@aaraney (Member) commented Feb 6, 2025

I'm sorry, @jmframe and @Zacmce! I didn't communicate to @jmframe that I'd asked @Zacmce to make a PR to update the readme. To @Zacmce's point, can you both collaborate on this PR to update the readme, since #56 will not be merged?

For clarity, #56 will not be merged, but it is still a great place for conversation. All the content (code, etc.) in #56 is still included in #57: the commits from #56 have been condensed into a single commit and then overwritten by my contributions, but the history is still tracked by git.

@jmframe (Contributor) commented Feb 6, 2025

> @jmframe ah, I see. Thanks for bringing this to my attention! Austin asked if I could do a readme update, but perhaps he was referring to his PR, not yours. The documentation I added outlines the implementation we did the other day, which is reflected in your PR.
>
> Don't a few other sections of the readme need to be updated, though, to reflect the ensemble capability and the overall current implementation? For example, the Trained LSTM Models section, where we no longer have _all_attributes etc.

Yes, absolutely right. I've updated my README now to reflect those trained models:
https://github.com/jmframe/lstm/blob/master/README.md
