import torch
from diffusers import PixArtAlphaPipeline

pipe = PixArtAlphaPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.load_lora_weights("xxx")
When I try to load a LoRA with PixArtAlphaPipeline, it throws this error:
AttributeError: 'PixArtAlphaPipeline' object has no attribute 'load_lora_weights'
Maybe LoRA support could be added to this pipeline? Using the PEFT method like this is inconvenient, and the results are not good:
import torch
from diffusers import Transformer2DModel
from peft import PeftModel

transformer = Transformer2DModel.from_pretrained("PixArt-alpha/PixArt-LCM-XL-2-1024-MS", subfolder="transformer", torch_dtype=torch.float16)
transformer = PeftModel.from_pretrained(transformer, "Your-LoRA-Model-Path")
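For context, here is a minimal sketch of what applying a LoRA adapter amounts to, independent of any pipeline API. A LoRA checkpoint stores two low-rank matrices (A, B) per target layer, and loading it adds `scale * (B @ A)` to the base weight. The layer sizes, rank, and randomly generated adapter weights below are illustrative placeholders, not values from any real checkpoint:

```python
import torch

torch.manual_seed(0)

# A stand-in for one linear layer inside the transformer.
base = torch.nn.Linear(8, 8, bias=False)
rank, scale = 4, 1.0

# Hypothetical adapter matrices; a real adapter would be loaded from disk.
lora_A = torch.randn(rank, 8) * 0.01  # down-projection
lora_B = torch.zeros(8, rank)         # up-projection (zero-init, as LoRA does at train start)

with torch.no_grad():
    # Merging the adapter into the base weight: W' = W + scale * (B @ A)
    merged = base.weight + scale * (lora_B @ lora_A)

x = torch.randn(1, 8)
# With B zero-initialized, the merged layer behaves identically to the base layer.
assert torch.allclose(base(x), x @ merged.T)
```

A built-in `load_lora_weights` would essentially do this (or inject the A/B pairs as extra modules) for every targeted layer of the transformer, which is why pipeline-level support is more convenient than wrapping the transformer manually.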