Point normals at camera in maya?

TannedBatman polycounter lvl 6
I can control the vertex normals with 'Set Vertex Normal' and the 'Vertex Normal Edit Tool', but I want to point the normals at a specific location in the world. Those options only point them at the horizon.

For example, the Vertex Normal Edit Tool does this

But what if I want it to do this

This is so I can bake from the perspective of a player in a video game, to capture detail from their viewpoint so the high poly reads correctly. Is this possible in Maya, or might I need to use another program? Thank you.
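The underlying math is straightforward: for each vertex, the desired normal is the unit vector pointing from that vertex towards the target location (e.g. the camera). A minimal pure-Python sketch of that math (the function name and tuple-based representation are my own; in Maya you could then apply each direction per vertex, e.g. with `cmds.polyNormalPerVertex`, though that step is not shown here):

```python
import math

def aim_normals_at_point(vertices, target):
    """For each (x, y, z) vertex, return the unit vector pointing from
    that vertex towards the target location, to be used as its normal."""
    tx, ty, tz = target
    normals = []
    for vx, vy, vz in vertices:
        # Direction from the vertex to the target point.
        dx, dy, dz = tx - vx, ty - vy, tz - vz
        length = math.sqrt(dx * dx + dy * dy + dz * dz)
        normals.append((dx / length, dy / length, dz / length))
    return normals
```

For a vertex at (1, 0, 0) and a target at the origin, this yields the normal (-1, 0, 0), i.e. the vertex looks straight at the target rather than along its original surface direction.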

Replies

  • BlueFlytrap polycounter lvl 9
While I cannot speak for Maya, there are plugins for 3ds Max and a modifier in Blender that will project vertex normals from one model to another. Typically these take the normal from the nearest vertex of the target. If the target is an inverted sphere, that means the nearest point will always face the center of that sphere.


While this does work, it is extremely slow, prone to crashing on complex models, and gives jagged, imprecise results. You may also need to redo it all if you later make changes to your model.

    My suggestion is to instead skip all that and use a custom cage. Duplicate your model, set the origin to your desired point in space, and then scale the model down to around 1%. Use that as your cage. It will give better results than projected vertex normals with far less time and effort.
    By coincidence I've needed to do the same thing as you over the past two weeks. Much of my time could have been saved had I just used a cage earlier.
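The cage trick above (duplicate, move the origin to the chosen point, scale to ~1%) amounts to scaling every vertex towards a single pivot. A small pure-Python sketch of that transform, with a hypothetical function name and tuple-based vertices for illustration:

```python
def make_cage(vertices, pivot, scale=0.01):
    """Scale every (x, y, z) vertex towards the pivot point, mimicking
    duplicating the model, setting its origin to the pivot, and
    scaling it down to about 1%."""
    px, py, pz = pivot
    return [(px + (x - px) * scale,
             py + (y - py) * scale,
             pz + (z - pz) * scale)
            for x, y, z in vertices]
```

A vertex 10 units from the pivot ends up 0.1 units from it, so the whole cage collapses into a tiny copy sitting at the chosen point while keeping its vertex count and order intact.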
  • TannedBatman polycounter lvl 6
This works, thank you for the help. Just wondering, what's the reason for scaling the model down?
  • BlueFlytrap polycounter lvl 9
When a cage is used, the lowpoly vertex normals are ignored for projection; instead each vertex projects towards its twin on the cage model. Since we want them all looking at the same tiny point in space, we just shove the entire cage into that same spot.
Scaling to 1% instead of 0% avoids parts of the cage getting merged together on export. Cages need the same number of triangles and verts as their paired lowpoly or they won't work.