
Lack of support for loading LoRA weights in PixArtAlphaPipeline #9887

Open · DaaadShot opened this issue Nov 8, 2024 · 2 comments

@DaaadShot

When I try to load LoRA weights in PixArtAlphaPipeline, it throws an AttributeError:

import torch
from diffusers import PixArtAlphaPipeline

pipe = PixArtAlphaPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.load_lora_weights("xxx")

AttributeError: 'PixArtAlphaPipeline' object has no attribute 'load_lora_weights'
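
For what it's worth, the error is consistent with the pipeline class simply not inheriting diffusers' LoRA-loading mixin, unlike e.g. StableDiffusionPipeline. A quick check (nothing assumed here beyond the two pipeline classes):

from diffusers import PixArtAlphaPipeline, StableDiffusionPipeline

# Pipelines that mix in LoRA loading expose the method...
print(hasattr(StableDiffusionPipeline, "load_lora_weights"))  # True
# ...PixArtAlphaPipeline does not, hence the AttributeError above.
print(hasattr(PixArtAlphaPipeline, "load_lora_weights"))      # False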

Maybe we can add LoRA support to this pipeline? Going through PEFT like this is not convenient, and the results are not good:
import torch
from diffusers import Transformer2DModel
from peft import PeftModel

transformer = Transformer2DModel.from_pretrained("PixArt-alpha/PixArt-LCM-XL-2-1024-MS", subfolder="transformer", torch_dtype=torch.float16)
transformer = PeftModel.from_pretrained(transformer, "Your-LoRA-Model-Path")
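
For completeness, a minimal sketch of how this workaround plugs back into the pipeline, assuming the adapter at "Your-LoRA-Model-Path" targets the PixArt transformer (the prompt and step count are illustrative). PEFT's merge_and_unload() folds the LoRA deltas into the base weights, so an ordinary Transformer2DModel can be handed to from_pretrained:

import torch
from diffusers import PixArtAlphaPipeline, Transformer2DModel
from peft import PeftModel

# Load the base transformer and attach the LoRA adapter via PEFT.
transformer = Transformer2DModel.from_pretrained(
    "PixArt-alpha/PixArt-LCM-XL-2-1024-MS", subfolder="transformer", torch_dtype=torch.float16
)
transformer = PeftModel.from_pretrained(transformer, "Your-LoRA-Model-Path")

# Merge the adapter into the base weights and drop the PEFT wrapper.
transformer = transformer.merge_and_unload()

# Rebuild the pipeline around the merged transformer.
pipe = PixArtAlphaPipeline.from_pretrained(
    "PixArt-alpha/PixArt-LCM-XL-2-1024-MS",
    transformer=transformer,
    torch_dtype=torch.float16,
).to("cuda")

image = pipe("a small cactus with a happy face", num_inference_steps=4).images[0]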

@DN6 (Collaborator) commented Nov 8, 2024

cc: @sayakpaul

@sayakpaul (Member)

Is there any popular LoRA on PixArt?
