https://www.youtube.com/watch?v=mjQ7tCQxYFQ
https://github.com/dask/dask

Dask seems quite interesting...

Pros - Dask is a Python distributed task scheduler.

Cons - Dask does parallelism, not distribution:
- Focus on performance, not on reliability (what if a server crashes?)
- Distribution is implicit, hidden from the user

So Dask can be useful for intuitively distributing code inside a zmp Node: the distribution stays transparent for the user.

By contrast, this helps define pyzmp's targets:
- pythonic
- fully distributed
- reliability focused
- explicit distribution

Dask therefore seems complementary, in the sense that it could fit inside one zmp "coprocess" to compute heavy tasks. Conversely, pyzmp could eventually be used to distribute the Dask scheduler itself.

However, it remains to be seen whether the way Dask defines and handles compute tasks could help us design an API for pyzmp usage...