Thought it might be useful to start a discussion here.
Using the NGD API as an example, from what I understand it reads GeoJSON directly from an API endpoint.
This works fine for small areas, but for larger areas it can take a while.
I appreciate we could create a custom OS Select+Build recipe on the data hub, but that can be quite time-consuming if someone needs to quickly query various areas.
Just wondering if there's any work underway to support GeoParquet reads directly from cloud storage?
More than happy to help contribute. For example, I've got a few approaches for spatially partitioning large datasets using a mixture of DuckDB, Rust/Python, and PostGIS in combination with GDAL.