
Multi-ControlNet error for Flux when using 2 ControlNets with different layer lengths #9911

Open
PromeAIpro opened this issue Nov 12, 2024 · 3 comments
Labels
bug Something isn't working

Comments

@PromeAIpro
Contributor

PromeAIpro commented Nov 12, 2024

Describe the bug

In Flux multi-ControlNet, when using 2 ControlNets (https://huggingface.co/promeai/FLUX.1-controlnet-lineart-promeai and https://huggingface.co/InstantX/FLUX.1-dev-Controlnet-Canny/blob/main/config.json),
the lineart ControlNet has 4 double layers and the canny ControlNet has 5 double layers. We think the following code has a negative impact on the result.
[Figure 1 and Figure 2: screenshots of the relevant code]
Because the transformer calculation takes the length of the controlnet block samples directly, and due to the calculation logic in Figure 1, the promeai samples, whose length should be 4, end up handled as part of the list with length 5. A simplified sketch of the mismatch follows.
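
A minimal, self-contained sketch of the kind of misalignment described above (this is not the diffusers code; the block count and the ceil-interval indexing follow the pattern in Figure 1, everything else is hypothetical):

import math

NUM_TRANSFORMER_DOUBLE_BLOCKS = 19  # FLUX.1-dev double-stream blocks

# hypothetical per-block residuals from the two ControlNets
canny_samples = [f"canny_{i}" for i in range(5)]      # 5 double layers
lineart_samples = [f"lineart_{i}" for i in range(4)]  # 4 double layers

def covered_blocks(num_samples):
    """Map transformer block index -> ControlNet sample index for one net."""
    interval = math.ceil(NUM_TRANSFORMER_DOUBLE_BLOCKS / num_samples)
    return {b: b // interval for b in range(NUM_TRANSFORMER_DOUBLE_BLOCKS)}

# Each net on its own spreads its samples over the 19 blocks at its own interval:
print(covered_blocks(5))  # sample 3 covers transformer blocks 12-15
print(covered_blocks(4))  # sample 3 covers transformer blocks 15-18

# If the two lists are merged element-wise by index and the merged list is then
# indexed using the 5-sample interval, the 4 lineart samples land on the block
# ranges meant for the canny samples, i.e. on the wrong transformer depths.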

Reproduction

import torch
from diffusers.utils import load_image
from diffusers.pipelines.flux.pipeline_flux_controlnet import FluxControlNetPipeline
from diffusers.models.controlnet_flux import FluxControlNetModel, FluxMultiControlNetModel

base_model = 'black-forest-labs/FLUX.1-dev'

# load controlnet models

controlnet_model_canny = 'InstantX/FLUX.1-dev-Controlnet-Canny'
controlnet_canny = FluxControlNetModel.from_pretrained(controlnet_model_canny, torch_dtype=torch.bfloat16)
controlnet_model_lineart = 'promeai/FLUX.1-controlnet-lineart-promeai'
controlnet_lineart = FluxControlNetModel.from_pretrained(controlnet_model_lineart, torch_dtype=torch.bfloat16)

controlnet_canny_lineart = FluxMultiControlNetModel([controlnet_canny, controlnet_lineart])

pipe = FluxControlNetPipeline.from_pretrained(base_model, controlnet=controlnet_canny_lineart, torch_dtype=torch.bfloat16)
pipe.to("cuda")

control_image_canny = load_image("one canny image")
control_image_lineart = load_image("one lineart image")

prompt = "A girl in city, 25 years old, cool, futuristic"
image = pipe(
    prompt,
    control_image=[control_image_canny, control_image_lineart],
    controlnet_conditioning_scale=[0.6, 0.6],
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("image.jpg")

Logs

No response

System Info


  • 🤗 Diffusers version: 0.31.0
  • Platform: Linux-5.15.0-105-generic-x86_64-with-glibc2.31
  • Running on Google Colab?: No
  • Python version: 3.10.15
  • PyTorch version (GPU?): 2.5.1+cu124 (True)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Huggingface_hub version: 0.26.2
  • Transformers version: 4.46.2
  • Accelerate version: 1.1.1
  • PEFT version: 0.13.2
  • Bitsandbytes version: not installed
  • Safetensors version: 0.4.5
  • xFormers version: not installed
  • Accelerator: NVIDIA A100-SXM4-80GB, 81920 MiB
    NVIDIA A100-SXM4-80GB, 81920 MiB
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:

Who can help?

@sayakpaul

@PromeAIpro PromeAIpro added the bug Something isn't working label Nov 12, 2024
@PromeAIpro
Contributor Author

PromeAIpro commented Nov 12, 2024

Sure, I will do the fix. I am proposing to make the ControlNet return up to <NUM_DOUBLE_LAYERS> block samples and up to <NUM_SINGLE_BLOCKS> single-block samples, which requires giving the ControlNet access to the Flux-series model config.
The above should happen only when not all ControlNets have the same number of layers.
This needs more discussion. A rough sketch of the idea is below.
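
A rough, hypothetical sketch of one way this could look (the helper name and expansion strategy are mine, not from this issue or the diffusers API; it only makes the proposal concrete):

import math

def expand_to_transformer_layers(block_samples, num_transformer_blocks):
    """Repeat each ControlNet sample over the block range it already covers,
    so every ControlNet returns one residual per transformer double block."""
    interval = math.ceil(num_transformer_blocks / len(block_samples))
    return [block_samples[i // interval] for i in range(num_transformer_blocks)]

# With 19 double blocks, both a 4-sample and a 5-sample ControlNet expand to
# length-19 lists, so summing them index by index in the multi-ControlNet
# wrapper stays aligned with the transformer's own blocks.
print(expand_to_transformer_layers(list("ABCD"), 19))
print(expand_to_transformer_layers(list("ABCDE"), 19))

Passing the transformer's block count in explicitly reflects the point above that the ControlNet needs access to the Flux model config, and the expansion could be skipped when all ControlNets already share the same number of layers.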

@sayakpaul
Member

Cc: @yiyixuxu

@PromeAIpro
Contributor Author

#9920 @yiyixuxu
