Hi! I am a bit confused about the training setup when using meshes. I am trying to use the normal map as a supervision signal for guiding the loss function. My goal is to build a dataset from .obj files and use the renderer to create the images as well as their normal maps.
You mentioned in #85 that the normal map was derived directly from the .obj file for training, but I am wondering how the supervision can work if the normal map produced by the renderer is smoothed while the one derived from the .obj file is sharp. When I render the normal map of a cube I get something like this:
When I would normally expect the normal map render to look like this:
Not sure if I am missing something...
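For reference, here is a minimal sketch of the distinction I mean, assuming trimesh is installed and using a placeholder "cube.obj" path (not the repo's actual data pipeline):

```python
# Minimal sketch: flat per-face normals vs. averaged per-vertex normals.
# The smoothed look in my render comes from interpolating vertex normals,
# whereas a faceted cube normal map would use one constant normal per face.
import trimesh

mesh = trimesh.load("cube.obj", process=False)  # placeholder path

# Flat shading: one normal per triangle, computed from the face geometry.
face_normals = mesh.face_normals      # shape (n_faces, 3)

# Smooth shading: one normal per vertex, averaged over adjacent faces.
# Interpolating these across each face blurs the cube's hard edges.
vertex_normals = mesh.vertex_normals  # shape (n_vertices, 3)

print("face normals (flat):\n", face_normals)
print("vertex normals (smoothed):\n", vertex_normals)
```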