Cannot use cachier with tqdm.contrib.concurrent.process_map or thread_map #92
Could you please share a simplified code example? :)
Hi, back with a code example. With pandarallel (pandas in parallel):

```python
"""Test pandarallel with cachier."""
from cachier import cachier
import pandas as pd
from pandarallel import pandarallel


@cachier(stale_after=86400)
def _worker(x):
    return x + 1


def worker(x):
    return _worker(x)


def main():
    """Main function."""
    df = pd.DataFrame({"x": range(100)})
    pandarallel.initialize(progress_bar=True)
    df["y"] = df["x"].parallel_apply(worker)
    print(df)


if __name__ == "__main__":
    # _worker.clear_cache()
    main()
```

The first time, it runs fine. But if I relaunch the script right after (hopefully with the cache), I get an error. If I replace
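A plain multiprocessing.Pool version of the same repro might help isolate the problem. This is a hypothetical sketch, not code posted in the thread; it passes stale_after as a datetime.timedelta, which is the form cachier's README documents:

```python
"""Hypothetical minimal repro with plain multiprocessing (no pandarallel or tqdm)."""
import datetime
from multiprocessing import Pool

from cachier import cachier


@cachier(stale_after=datetime.timedelta(days=1))
def _worker(x):
    return x + 1


def worker(x):
    # Module-level wrapper so it pickles cleanly for the worker processes.
    return _worker(x)


if __name__ == "__main__":
    with Pool(processes=4) as pool:
        result = pool.map(worker, range(100))
    print(result)
```

If this version also hangs or errors on the second run, the problem would lie in cachier's cache handling under concurrent access rather than in anything pandarallel-specific.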
With tqdm's process_map now:

```python
"""Test tqdm.process_map with cachier."""
from cachier import cachier
from tqdm.contrib.concurrent import process_map


@cachier(stale_after=86400)
def _worker(x):
    return x + 1


def worker(x):
    return _worker(x)


def main():
    """Main function."""
    data = list(range(100))
    result = process_map(worker, data, max_workers=4)
    print(result)


if __name__ == "__main__":
    # _worker.clear_cache()
    main()
```

The first time, it runs fine.
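The issue title also mentions thread_map; the thread-based variant, again a hypothetical sketch rather than code from the thread, differs only in the import (tqdm.contrib.concurrent.thread_map accepts the same max_workers keyword):

```python
"""Hypothetical thread-based variant of the same repro."""
import datetime

from cachier import cachier
from tqdm.contrib.concurrent import thread_map


@cachier(stale_after=datetime.timedelta(days=1))
def _worker(x):
    return x + 1


def worker(x):
    return _worker(x)


if __name__ == "__main__":
    result = thread_map(worker, list(range(100)), max_workers=4)
    print(result)
```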
@NicolasMICAUX not sure what I am doing now, but it does not run for me...
Hi,
When using @cachier with tqdm utilities for multiprocessing or multithreading, cachier always gets stuck in what seems to be an infinite loop. I tried playing with the decorator args, but without success.
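"Playing with the decorator args" could look something like the hypothetical sketch below. The arguments shown are the ones cachier's README documents (stale_after, next_time, pickle_reload); whether any combination avoids the hang is exactly what this issue is asking, and availability may depend on the installed cachier version:

```python
"""Hypothetical example of tweaking cachier's decorator arguments."""
import datetime

from cachier import cachier


@cachier(
    stale_after=datetime.timedelta(days=1),  # entries older than a day are recomputed
    next_time=True,       # return a stale value immediately and refresh in the background
    pickle_reload=False,  # keep an in-memory copy instead of reloading the pickle each call
)
def _worker(x):
    return x + 1
```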