Replies: 2 comments 6 replies
-
The input HLS data to our DSWX-HLS algorithm are already in Cloud Optimized GeoTIFF (COG) format. They are also delivered as multiple files (one file per band), which makes it easy to pull only the bands (files) that our algorithm needs. It turns out that the DSWX-HLS algorithm does not actually need all of the HLS bands. The HLS data average ~300 MB per tile, and we need to pull only ~160 MB per tile, so right there we get a significant reduction in the amount of data that we need to move to the instance where we run our SAS. As I mentioned, the HLS data are relatively small; in my opinion, for this product we don't need to pull spatial subsets of a given tile. We will pull the entire tile, but only the bands (i.e., files) that we need. Let me know if that is clear or you need more info.
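Since each HLS band lives in its own COG file, "pull only what we need" reduces to filtering the granule's file listing. Here is a minimal sketch of that idea; the band set and the granule naming pattern below are illustrative assumptions, not the actual DSWX-HLS band list:

```python
# Sketch: keep only the per-band COG files the algorithm needs from an
# HLS granule listing. NEEDED_BANDS is a hypothetical example set, not
# the real DSWX-HLS requirement.
NEEDED_BANDS = {"B02", "B03", "B04", "B05", "B06", "B07", "Fmask"}

def select_band_files(tile_files):
    """Return only the files whose band suffix is in NEEDED_BANDS.

    Assumes one COG per band, named like
    HLS.S30.T11SQA.2021001T181229.v2.0.B03.tif (pattern assumed here).
    """
    selected = []
    for name in tile_files:
        parts = name.rsplit(".", 2)  # [granule stem, band, "tif"]
        if len(parts) == 3 and parts[1] in NEEDED_BANDS:
            selected.append(name)
    return selected

tile = [
    "HLS.S30.T11SQA.2021001T181229.v2.0.B01.tif",
    "HLS.S30.T11SQA.2021001T181229.v2.0.B03.tif",
    "HLS.S30.T11SQA.2021001T181229.v2.0.Fmask.tif",
]
print(select_band_files(tile))
# B01 is skipped; only B03 and Fmask are fetched.
```

In a PCM workflow, the same filter would be applied to the granule's URL list before staging, so the unneeded band files are never downloaded at all.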
-
Hi @hfattahi - thank you for that clarification. Is the following summary accurate?
If so, here's what I'm getting as implications:
-
How does the use of cloud-optimized formats for HLS input data products affect the PCM? Specifically, for a given input data product that is available in HDF5 format and is also available in a cloud-optimized format (Cloud Optimized GeoTIFF), which enables pulling only specific spatial subsets of a given tile, what are the implications for:
@hfattahi @collinss-jpl @hhlee445 - thoughts?
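For context on why a COG enables spatial-subset pulls at all: a COG is internally tiled, so a client can read the header and then issue HTTP Range requests only for the internal tiles that intersect the window of interest. The sketch below illustrates just the tile-selection arithmetic; the 512-pixel internal tile size and the window coordinates are illustrative assumptions, not real HLS layout:

```python
# Sketch: which internal COG tiles intersect a requested pixel window.
# An HTTP client would then fetch only the byte ranges of these tiles.
import math

def tiles_for_window(col0, row0, width, height, tile_size=512):
    """Return (tile_col, tile_row) indices intersecting a pixel window."""
    tiles = set()
    for tc in range(col0 // tile_size, math.ceil((col0 + width) / tile_size)):
        for tr in range(row0 // tile_size, math.ceil((row0 + height) / tile_size)):
            tiles.add((tc, tr))
    return sorted(tiles)

# A 1024x1024 window starting at pixel (256, 256) touches a 3x3 block
# of 512-pixel internal tiles, so only 9 tiles are fetched rather than
# the whole image.
print(tiles_for_window(256, 256, 1024, 1024))
```

In practice a library such as GDAL/rasterio does this bookkeeping when reading a window from a COG over HTTP; the point for the PCM question is that subset reads change the access pattern from "stage whole files" to "many small ranged reads."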