Commit

Don't reset dynamo on upscaler compile
Disty0 committed Dec 31, 2024
1 parent 945c57c commit 8f35f09
Showing 1 changed file with 0 additions and 1 deletion.
1 change: 0 additions & 1 deletion modules/upscaler.py
@@ -217,7 +217,6 @@ def compile_upscaler(model)
     try:
         if "Upscaler" in shared.opts.cuda_compile and shared.opts.cuda_compile_backend != 'none':
             import torch._dynamo # pylint: disable=unused-import,redefined-outer-name
-            torch._dynamo.reset() # pylint: disable=protected-access
             if shared.opts.cuda_compile_backend not in torch._dynamo.list_backends(): # pylint: disable=protected-access
                 shared.log.warning(f"Upscaler compile not available: backend={shared.opts.cuda_compile_backend} available={torch._dynamo.list_backends()}") # pylint: disable=protected-access
                 return model
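The rationale behind the deletion: `torch._dynamo.reset()` clears Dynamo's compilation caches for the whole process, not just for the model being compiled, so calling it before compiling the upscaler would discard compiled graphs belonging to any other already-compiled models. A minimal sketch of the patched flow, with `backend` standing in for `shared.opts.cuda_compile_backend` and plain `logging` standing in for `shared.log` (both substitutions are assumptions for a self-contained example; assumes PyTorch 2.x):

```python
import logging

def compile_upscaler(model, backend: str = "inductor"):
    """Sketch of compile_upscaler after the patch (hypothetical signature)."""
    import torch
    import torch._dynamo  # pylint: disable=unused-import

    # The patch removed a torch._dynamo.reset() call at this point:
    # reset() wipes Dynamo's caches process-wide, forcing other compiled
    # models to recompile. Validating the backend does not require it.
    if backend not in torch._dynamo.list_backends():  # pylint: disable=protected-access
        logging.warning("Upscaler compile not available: backend=%s", backend)
        return model  # fall back to the uncompiled model
    return torch.compile(model, backend=backend)
```

With an unknown backend the function simply returns the model untouched, matching the early-return path in the diff.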
