update QUEST and GenQuery classes for argo integration #441

Merged · 100 commits · Sep 25, 2023

Conversation

@JessicaS11 (Member) commented Aug 30, 2023

With lots of ongoing work to integrate argo into quest (see #427), some updates were needed to the Quest and GenQuery classes. This PR should be merged before #427:

  • moves spatial and temporal properties from Query into GenQuery so that they are accessible for all submodules (and not just ICESat-2 Query instances)
  • clarifies that quest/dataset_scripts/dataset.py is a template class for adding new datasets
  • sets up the QUEST class to register each dataset class that is currently or soon to be available and to search and download data for all of them (the search and download functions use if statements to handle ICESat-2 separately from other datasets, because the initial icepyx.core.query class used different method names; the dataset template requires that all datasets added in the future use the same methods)
  • adds docs around what datasets are currently available using quest
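The first and third bullets can be sketched roughly as follows. This is a minimal illustration, not the actual icepyx implementation: the class layout, the `datasets` dict, and the method names `avail_granules`/`search_data` are simplified placeholders standing in for the real API.

```python
# Hypothetical sketch: GenQuery holds the shared spatial/temporal criteria,
# and QUEST fans search out to every registered dataset, special-casing
# ICESat-2 because the original Query class used a different method name.

class GenQuery:
    """Base class holding search criteria shared by all QUEST datasets."""

    def __init__(self, spatial_extent, date_range):
        self._spatial = spatial_extent   # e.g. [lon_min, lat_min, lon_max, lat_max]
        self._temporal = date_range      # e.g. ("2022-04-12", "2022-04-26")

    @property
    def spatial_extent(self):
        return self._spatial

    @property
    def temporal(self):
        return self._temporal


class QUEST(GenQuery):
    """Registers dataset objects and runs a search across all of them."""

    def __init__(self, spatial_extent, date_range):
        super().__init__(spatial_extent, date_range)
        self.datasets = {}

    def search_all(self):
        results = {}
        for name, dset in self.datasets.items():
            if name == "icesat2":
                # the initial icepyx.core.query class used a different method name
                results[name] = dset.avail_granules()
            else:
                # the dataset template requires this method of all other datasets
                results[name] = dset.search_data()
        return results
```

The point of the if statement is backward compatibility: existing ICESat-2 Query objects keep their original interface, while the template enforces a uniform one for everything added later.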

TO DO:

  • the quest class needs docstrings
  • may need to add some quest functions to the API docs

kelseybisson and others added 30 commits February 23, 2022 09:26
Download the 'classic' argo data with physical variables only
@github-actions bot commented Aug 30, 2023

Binder 👈 Launch a binder notebook on this branch for commit 1f3f3da

I will automatically update this comment whenever this PR is modified

@JessicaS11 (Member, Author) commented:

@zachghiaccio @kelseybisson please add your additional docs and docstrings work to this shared_search branch/PR, not the argo one!

@JessicaS11 JessicaS11 changed the title update QUEST and GenQuery classes argo integration update QUEST and GenQuery classes for argo integration Aug 30, 2023
kelseybisson and others added 6 commits September 6, 2023 09:39
Added Docstrings to functions within quest.py and edited the primary docstring for the QUEST class here.

Note I did not add Docstrings to the implicit __self__ function.
Added comments (not Docstrings) to test functions
Minor edits to the doc strings
Edited docstrings
# Conflicts:
#	icepyx/core/query.py
#	icepyx/quest/dataset_scripts/argo.py
#	icepyx/quest/dataset_scripts/dataset.py
#	icepyx/quest/quest.py
@RomiP RomiP merged commit 1d53341 into development Sep 25, 2023
3 checks passed
@RomiP RomiP deleted the shared_search branch September 25, 2023 15:45
JessicaS11 added a commit that referenced this pull request Nov 15, 2023
* Adding argo search and download script

* Create get_argo.py

Download the 'classic' argo data with physical variables only

* begin implementing argo dataset

* 1st draft implementing argo dataset

* implement search_data for physical argo

* doctests and general cleanup for physical argo query

* beginning of BGC Argo download

* parse BGC profiles into DF

* plan to query BGC profiles

* validate BGC param input function

* order BGC params in order in which they should be queried

* fix bug in parse_into_df() - init blank df to take in union of params from all profiles
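The parse_into_df() fix described in this message can be sketched in pandas (an illustrative version, not the icepyx source): initializing a blank DataFrame means each per-profile frame concatenates into the union of all parameter columns, with NaN filling parameters a given profile lacks.

```python
import pandas as pd

def parse_into_df(profiles):
    # Blank init: the resulting columns become the union of the parameters
    # reported by every profile, rather than only those of the first one.
    df = pd.DataFrame()
    for prof in profiles:
        df = pd.concat([df, pd.DataFrame(prof)], ignore_index=True)
    return df
```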

* identify profiles from initial API request containing all required params

* creates df with only profiles that contain all user specified params
Need to dload additional params

* modified to populate prof df by querying individual profiles

* finished up BGC argo download!

* assert bounding box type in Argo init, begin framework for unit tests

* need to confirm spatial extent is bbox

* begin test case for available profiles

* add tests for argo.py

* add typing, add example json, and use it to test parsing

* update argo to submit successful api request (update keys and values submitted)

* first pass at porting argo over to metadata+per profile download (WIP)

* basic working argo script

* simplify parameter validation (ordered list no longer needed)

* add option to delete existing data before new download

* continue cleaning up argo.py

* fix download_by_profile to properly store all downloaded data

* remove old get_argo.py script

* remove _filter_profiles function in favor of submitting data kwarg in request

* start filling in docstrings

* clean up nearly duplicate functions

* add more docstrings

* get a few minimal argo tests working

* add bgc argo params. begin adding merge for second download runs

* some changes

* WIP test commit to see if can push to GH

* WIP handling argo merge issue

* update profile to df to return df and move merging to get_dataframe

* merge profiles with existing df
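A hedged sketch of the merge step these two messages describe (the function, column, and key names here are hypothetical, not the icepyx code): an outer merge keyed on profile ID and pressure level lets a second download add new parameter columns to the existing DataFrame instead of duplicating its rows.

```python
import pandas as pd

def merge_profiles(existing, new, keys=("profile_id", "pres")):
    # First download: nothing to merge with yet.
    if existing is None or existing.empty:
        return new
    # Outer merge on the identifying keys: rows line up by profile and
    # pressure level, and parameters unique to either frame become columns.
    return existing.merge(new, how="outer", on=list(keys))
```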

* clean up docstrings and code

* add test_argo.py

* add prelim test case for adding to Argo df

* remove sandbox files

* remove bgc argo test file

* update variables notebook from development

* simplify import statements

* quickfix for granules error

* draft subpage on available QUEST datasets

* small reference fix in text

* add reference to top of .rst file

* test argo df merge

* add functionality to Quest class to pass search criteria to all datasets

* add functionality to Quest class to pass search criteria to all datasets

* update dataset docstrings; reorder argo.py to match

* implement quest search+download for IS2

* move spatial and temporal properties from query to genquery

* add query docstring test for cycles,tracks to test file

* add quest test module

* standardize print outputs for quest search and download; is2 download needs auth updates

* remove extra files from this branch

* comment out argo portions of quest for PR

* remove argo-branch-only init file

* remove argo script from branch

* remove argo test file from branch

* comment out another line of argo stuff

* Update quest.py

Added Docstrings to functions within quest.py and edited the primary docstring for the QUEST class here.

Note I did not add Docstrings to the implicit __self__ function.

* Update test_quest.py

Added comments (not Docstrings) to test functions

* Update dataset.py

Minor edits to the doc strings

* Update quest.py

Edited docstrings

* catch error with downloading datasets in Quest; template test case for multi dataset query

---------

Co-authored-by: Kelsey Bisson <[email protected]>
Co-authored-by: Romina <[email protected]>
Co-authored-by: zachghiaccio <[email protected]>
Co-authored-by: Zach Fair <[email protected]>
JessicaS11 added a commit that referenced this pull request Jan 5, 2024
(commit message identical to the one above)
Labels: none
Projects: none
4 participants