Can't figure out how to use the `images2zarr` conversion. I get the following error; I guess it is related to package versions, but I am not sure what is required.
dask==2.3.0
zarr==2.3.2
TypeError                                 Traceback (most recent call last)
<ipython-input> in <module>
----> 1 images2zarr.convert_all(str(PATH), str(Path('D:/recursion')), str(PATH))
C:\StudioProjects\Recursion\rxrx1-utils\rxrx\preprocess\images2zarr.py in convert_all(raw_images, dest_path, metadata)
43 sites = metadata_df[['dataset', 'experiment', 'plate', 'well', 'site']].to_dict(orient='rows')
44 bag = dask.bag.from_sequence(sites)
---> 45 bag.map(convert_to_zarr(raw_images, dest_path)).compute()
46
47
~\Anaconda3\envs\fastai\lib\site-packages\dask\base.py in compute(self, **kwargs)
154 This turns a lazy Dask collection into its in-memory equivalent.
155 For example a Dask.array turns into a :func:
numpy.array
and a Dask.dataframe--> 156 turns into a Pandas dataframe. The entire dataset must fit into memory
157 before calling this operation.
158
~\Anaconda3\envs\fastai\lib\site-packages\dask\base.py in compute(*args, **kwargs)
393
394 Parameters
--> 395 ----------
396 args : object
397 Any number of objects. If it is a dask object, it's computed and the
~\Anaconda3\envs\fastai\lib\site-packages\dask\multiprocessing.py in get(dsk, keys, num_workers, func_loads, func_dumps, optimize_graph, **kwargs)
189 loads = func_loads or config.get("func_loads", None) or _loads
190 dumps = func_dumps or config.get("func_dumps", None) or _dumps
--> 191
192 # Note former versions used a multiprocessing Manager to share
193 # a Queue between parent and workers, but this is fragile on Windows
~\Anaconda3\envs\fastai\lib\site-packages\dask\local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs)
492 res, worker_id = loads(res_info)
493 state["cache"][key] = res
--> 494 finish_task(dsk, key, state, results, keyorder.get)
495 for f in posttask_cbs:
496 f(key, res, dsk, state, worker_id)
TypeError: __init__() missing 3 required positional arguments: 'node_def', 'op', and 'message'
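The missing-arguments `TypeError` at the end looks less like a bug in your code and more like an exception class that can't survive a pickle round-trip: dask's multiprocessing scheduler serializes worker exceptions back to the parent, and unpickling re-calls the class's `__init__` with only `self.args`. A minimal sketch of that failure mode, using a hypothetical `CustomError` in place of whatever is actually raised inside `convert_to_zarr` (the argument names mirror the ones in the traceback):

```python
import pickle

# Hypothetical exception standing in for the real worker-side error.
# Its __init__ requires positional arguments but does not forward them
# to Exception, so self.args ends up empty.
class CustomError(Exception):
    def __init__(self, node_def, op, message):
        super().__init__()
        self.node_def = node_def
        self.op = op
        self.message = message

err = CustomError("node", "op", "boom")
payload = pickle.dumps(err)  # pickling itself succeeds

# Unpickling calls CustomError(*err.args) == CustomError(), which fails
# with the same "missing 3 required positional arguments" TypeError.
try:
    pickle.loads(payload)
except TypeError as exc:
    print(exc)
```

So the `TypeError` in the traceback is likely masking the original exception raised in a worker process, not the root problem itself.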
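One way to surface the real underlying error is to rerun the bag computation on the synchronous scheduler, which executes everything in the main process so the original exception propagates with its full traceback instead of being shipped back from a multiprocessing worker. A sketch, with a hypothetical `process_site` standing in for the real `convert_to_zarr` step:

```python
import dask.bag

# Hypothetical per-site function standing in for convert_to_zarr(...);
# the point here is only the scheduler= argument below.
def process_site(site):
    return site["plate"] * 2

sites = [{"plate": i} for i in range(4)]
bag = dask.bag.from_sequence(sites)

# scheduler="synchronous" runs the graph single-threaded in the current
# process, so any exception raised inside process_site is raised directly.
result = bag.map(process_site).compute(scheduler="synchronous")
print(result)  # [0, 2, 4, 6]
```

Once the real exception is visible, it should point at the actual version or environment mismatch.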