I have been using Datashader with Matplotlib to create dynamic visualizations of large vector datasets (shapefiles). The approach works fine, but each time I start the program, the Numba-jitted functions for creating the canvas and the shaded image are recompiled, adding 6-7 seconds of overhead.
I wonder whether it is possible to enable caching of the compiled code (for instance, by passing `cache=True` to the `@njit` decorator), and whether there is a general solution to this problem.
Thank you in advance for your attention.
Best regards,
Michele Zucchelli.