[Issue]: FLUX qint load failed #3673
Comments
Please try to reproduce using the latest version; it was just released recently and has some relevant fixes.
Now it stops working on FLUX load without any error. And previously, debug messages were disabled by default. Happy New Year!
After I installed SD.Next from scratch, removed the previously downloaded model, and downloaded it again, I get the same error.
Not related, but with a 3000-series GPU, you shouldn't be using
@brknsoul
just don't use
Issue Description
Run SD.Next with
--use-cuda --use-xformers --models-dir e:\Models
I set up FLUX.1-dev-qint8 [fd65655d4d] using the model selection dialog.
Set "Model Offloading" to "model"; other settings unchanged.
If I try to run the model with any prompt, I get the following error.
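For context on why the qint8 variant is used here: with 12 GB of VRAM (per the device-detect line below), the full-precision FLUX transformer alone does not fit. A back-of-the-envelope sketch, assuming the FLUX.1-dev transformer has roughly 12 billion parameters (an approximation for illustration; real usage also includes the text encoders, VAE, and activations):

```python
# Rough weight footprint of a ~12B-parameter model at different precisions.
# The 12e9 parameter count is an assumption for illustration, not a
# measured value from this issue.

def footprint_gib(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB for a given precision."""
    return n_params * bytes_per_param / 1024**3

N_PARAMS = 12e9  # assumed transformer parameter count

bf16 = footprint_gib(N_PARAMS, 2)  # bfloat16: 2 bytes per weight
int8 = footprint_gib(N_PARAMS, 1)  # qint8: 1 byte per weight

print(f"bf16 weights: ~{bf16:.1f} GiB")  # ~22.4 GiB, exceeds a 12 GB card
print(f"qint8 weights: ~{int8:.1f} GiB")  # ~11.2 GiB, marginal fit
```

This is why qint8 quantization plus "model" offloading is the typical configuration for this card class; even then, headroom is thin.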
Version Platform Description
Python: version=3.10.6 platform=Windows
Version: app=sd.next updated=2024-12-24 hash=451eeab1 branch=master
url=https://github.com/vladmandic/automatic/tree/master ui=main
Platform: arch=AMD64 cpu=Intel64 Family 6 Model 165 Stepping 3, GenuineIntel system=Windows
release=Windows-10-10.0.19045-SP0 python=3.10.6 docker=False
Extensions: enabled=['Lora', 'sd-extension-chainner', 'sd-extension-system-info',
'sd-webui-agent-scheduler', 'sdnext-modernui', 'stable-diffusion-webui-rembg']
Device detect: memory=12.0 optimization=balanced
Engine: backend=Backend.DIFFUSERS compute=cuda device=cuda attention="xFormers" mode=no_grad
Torch parameters: backend=cuda device=cuda config=Auto dtype=torch.bfloat16 vae=torch.bfloat16
unet=torch.bfloat16 context=no_grad nohalf=False nohalfvae=False upcast=False
deterministic=False test-fp16=True test-bf16=True optimization="xFormers"
Device: device=NVIDIA GeForce RTX 3060 n=1 arch=sm_90 capability=(8, 6) cuda=12.4 cudnn=90100
driver=560.94
Torch: torch==2.5.1+cu124 torchvision==0.20.1+cu124
Packages: diffusers==0.33.0.dev0 transformers==4.47.1 accelerate==1.2.1 gradio==3.43.2
Relevant log output
No response
Backend
Diffusers
UI
Standard
Branch
Master
Model
FLUX.1
Acknowledgements