I've been generating images using this model, which is delightfully fast, but I've noticed that it produces images that are all alike. I tried generating the "null" image by doing:
# toks is the tokenized prompt, e.g. from clip.tokenize(...)
H = perceptor.encode_text(toks.to(device)).float()  # CLIP text embedding
z = net(0 * H)  # feed the zeroed-out embedding through the generator
This resulted in:
And indeed, everything I generated kind of matched that: you can see the fleshy protrusion on the left in "gold coin":
The same object, with a matching mini-object, in "tent":
And it always seems to try to caption the image with nonsense lettering ("lion"):
So I'm wondering if there's a way to "prime" the model and nudge it toward a different "zero" image on each run. Is there a variable I can set, or is this deeply ingrained in the training data?
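To make the question concrete, here's the kind of "priming" I have in mind. This is just a sketch, assuming the same perceptor, net, toks, and device as in the snippet above, and an arbitrary noise scale of 0.1 that I picked out of thin air:

import torch

# Hypothetical workaround: add a small per-run random offset to the
# embedding so each run starts from a different null point.
torch.manual_seed(1234)  # change the seed per run
H = perceptor.encode_text(toks.to(device)).float()
noise = 0.1 * torch.randn_like(H)
z = net(H + noise)  # or net(noise) for a perturbed "null" image

I have no idea whether this is a sensible thing to do with this model, or whether the noise scale matters; it's only meant to illustrate what I mean by a different zero image per run.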
Any advice would be appreciated, thank you!
(Apologies if this is a duplicate of #8, but it sounded like #8 was solved by using priors, which doesn't seem to help here.)