USD orientation leftHanded vs rightHanded

Hey everyone,

I am looking for someone who can clear up my confusion about the orientation topic.

A little background info:
We are, for instance, modeling assets in ZBrush, Blender, and Maya. We mostly still export Alembics or OBJs. We ingest the geometry into USD via Houdini (sopimports). At some point we go from USD back to Maya to animate, and finally export USD geometry caches from Maya back to USD.

→ Now somewhere along this process things go wrong. We noticed that if we simply import some of our USD assets into Maya, the UVs are either scrambled, or they look good but are actually not connected. Going back to USD then makes our asset unrenderable: the displacements are completely off, they render black, or there is some other issue. We noticed that those issues could be fixed by turning on the “Reverse Polygon Vertex Ordering” option in the Houdini sopimport node. But I am not sure why!

First of all the basics:
This link provides some info about orientation in USD:

  • Question 1: As far as I understand orientation is an attribute that indicates the orientation of the coordinate system that the mesh was created in?

It can either be leftHanded or rightHanded.

While rightHanded means: up is +Y, right is +X, and the forward viewing direction is -Z.

Correct me if I’m wrong!

  • Question 2: What does the orientation attribute actually affect?

In the link above it says:

That is, the normal computed from (e.g.) a polygon’s sequential vertices using the right handed winding rule determines the “front” or “outward” facing direction, that typically, when rendered will receive lighting calculations and shading.
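To make that winding rule concrete, here is a pure-Python sketch (not the USD API) of how a face normal follows from vertex order: with the right-handed rule, the cross product of two edge vectors gives the outward-facing normal, and reversing the vertex order flips it.

```python
# Sketch: computing a face normal with the right-handed
# (counterclockwise) winding rule via a cross product.

def face_normal(p0, p1, p2):
    """Normal of a triangle from three vertices in winding order."""
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    # Right-handed winding rule: normal = e1 x e2
    return [
        e1[1] * e2[2] - e1[2] * e2[1],
        e1[2] * e2[0] - e1[0] * e2[2],
        e1[0] * e2[1] - e1[1] * e2[0],
    ]

# A triangle in the XY plane, vertices ordered CCW when viewed from +Z:
a, b, c = [0, 0, 0], [1, 0, 0], [0, 1, 0]
print(face_normal(a, b, c))  # [0, 0, 1]  -> faces "out" along +Z
print(face_normal(a, c, b))  # [0, 0, -1] -> reversed order flips the normal
```

So "orientation" doesn't change the stored points at all; it only tells the consumer which interpretation of the vertex order yields the outward direction.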

  • Question 3: That means a leftHanded system has a different sequence of vertices, and therefore calculating normals the rightHanded way wouldn’t work? So the orientation is all about an algorithm that tries to calculate normals?

Reading from this link above, I understand that the default orientation in USD is rightHanded.

But it seems that different DCCs use different orientations.
I am wondering how this affects me as a user, especially when working with software that has no USD support.

  • Question 4: Do I have to know the orientation of each software the mesh was created in, and then import that mesh into USD with the correct settings?

For instance, we often first get our OBJ or Alembic into Houdini SOPs, clean up some attributes, name the meshes, and then go to USD via sopimport.

In the sopimport we can specify “Reverse Polygon Vertex Ordering”. The documentation says about this option:

…while SOP geometry is always left-handed ordering. When this option is on, the importer always reorders vertices (and associated primvars) to be right-handed.
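Conceptually, that reordering just reverses each face’s slice of the index array (and any faceVarying primvars indexed the same way). A minimal pure-Python sketch, with names borrowed from the UsdGeomMesh attributes but no actual USD calls:

```python
# Sketch of what a "reverse polygon vertex ordering" step conceptually
# does on USD-style mesh arrays. Names mirror UsdGeomMesh attributes
# (faceVertexCounts, faceVertexIndices), but this is plain Python.

def reverse_winding(face_vertex_counts, face_vertex_indices):
    """Reverse the vertex order of each face, flipping its winding.

    faceVarying primvars (e.g. UVs) are laid out face by face, the same
    way as faceVertexIndices, so they must be reordered with the same
    per-face slices, or the UVs end up on the wrong corners.
    """
    reversed_indices = []
    start = 0
    for count in face_vertex_counts:
        face = face_vertex_indices[start:start + count]
        reversed_indices.extend(reversed(face))
        start += count
    return reversed_indices

# Two faces: a quad (0,1,2,3) and a triangle (4,5,6)
counts = [4, 3]
indices = [0, 1, 2, 3, 4, 5, 6]
print(reverse_winding(counts, indices))  # [3, 2, 1, 0, 6, 5, 4]
```

This also illustrates why flipping only the `orientation` attribute is not enough when associated primvars exist: they index into the same per-face layout and must be reordered together.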

  • Question 5: From a workflow perspective, is it required / recommended to have all meshes in your USD stage in the rightHanded default orientation? Or is it only important that the orientation is set correctly, and renderers / viewports etc. will draw and calculate accordingly?

And last but not least, to prevent the issues we have currently in the future:

  • Question 6: Whose responsibility is it to handle the orientation of meshes when moving data from different DCCs? Shouldn’t all DCCs export this information in the mesh itself and also read this attribute on import and then optionally convert the orientation if it doesn’t match their own space?

  • Question 7: How is all of this handled when we talk about other formats like OBJ, Alembic, or FBX? Here we usually import into SOPs and then into LOPs. How does orientation handling work here? Is orientation also stored in those formats, or do I need to specifically set it as a user?

Sorry, I know these are a lot of questions, but I finally want to clear up my confusion about this.

1/2/3. orientation indicates whether vertex order is interpreted counterclockwise (the default) or clockwise when determining whether the surface normal points “out of” or “into” the mesh. CW and CCW change meaning according to the handedness of the coordinate system.

5/ If things are not working as expected, assuming an importer or exporter is otherwise correctly implemented, a typical cause is that a negative scale, which flips the normal, has been neglected. There is no requirement of one handedness or the other; it’s unambiguously indicated as described above.

6/ USD defines order and handedness unambiguously within the scene. An importer is responsible for conforming a mesh, or annotating it as necessary, to render properly in the DCC it supports. Similarly, an exporter is responsible for either conforming a mesh to USD conventions or annotating the scene unambiguously.

4/6/ It shouldn’t be a user’s responsibility to figure out why a mesh’s faces are oriented incorrectly after an import or export operation. If a DCC’s exporter has not annotated the scene properly on export to USD, there is no way to know how it introduced errors without inspecting the source code or asking on a support forum.

7/ This seems like a Houdini-specific question, which I will leave for a Houdini expert to answer 🙂


My two cents on 7:
Not sure about the other formats off the top of my head, but I’d recommend sticking to the orientation of the DCC that does most of your heavy data lifting and where import/export performance is key. If you are using Maya and Houdini, I’d stick to Houdini’s face winding orientation, because it will affect the LOP import/export times. (Since Maya-to-USD conversions are usually done at export time and not for a live USD preview, the export overhead is negligible.)


Hey @nporcino

Thanks for your answer. Already helped me to understand it a whole lot better!

I have some short follow up questions:

  1. Can you talk a bit more about what CW and CCW mean and how they are relevant for USD?

a typical cause is that negative scale, which flips the normal, has been neglected

  2. Do you mean that the importer or exporter doesn’t take negative scale into account and is therefore not fully implemented? Or do you mean users should have applied/frozen the scale before using the exporter?

And finally:

  3. So in a USD stage there can be some primitives that are rightHanded and some that are leftHanded at the same time, right?
    Let’s say I load this stage natively in the USD viewer and start a render delegate. Is it the render delegate’s job to interpret the mesh correctly according to its orientation?
    And does the same go for the OpenGL Hydra viewer?

Hey @LucaScheller

Thanks, that’s actually a good point, I didn’t think about that. So each sopimport that has the “Reverse Polygon Vertex Ordering” option enabled will incur a performance hit on the live preview.

Is there a way to reverse the winding on export only from Houdini? I only found the reverse option on a sopimport so far. And simply changing the orientation attribute to rightHanded won’t do the trick either, as you have to change the actual order of certain attributes as well, right?

But if I understood @nporcino correctly, I shouldn’t have to do any orientation conversions, as the Maya importer should be smart enough to handle it. So maybe it actually has a bug and destroys our UVs because of that.

Clockwise and counterclockwise aren’t relevant to USD per se, but rather to anything that needs to understand whether a polygon faces “inwards” or “outwards” with respect to (1) the handedness of the coordinate system, and (2) the winding of the face.

So that means it’s relevant to rendering, mesh editing, and so on. USD just carries the data to the place where it’s needed.

An infrequent error in importers is that they don’t take negative scale into account. The usual case where you see this is that someone made a symmetrical face by mirroring one half onto the other, and upon import into a DCC, the reflected side is unshaded and simply black.
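A simple way to see why negative scale matters: a mirroring transform has a negative determinant, and baking it into the points reverses each face’s winding unless the importer compensates. A pure-Python sketch (`det3` and `flips_winding` are illustrative helpers, not library functions):

```python
# Sketch: a mirrored (negatively scaled) transform has a negative
# determinant, which flips face winding. A tool that bakes such a
# transform into points must also reverse the winding, or the
# mirrored half ends up facing "inwards" and renders black.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def flips_winding(scale):
    """True if an (sx, sy, sz) scale mirrors the geometry."""
    sx, sy, sz = scale
    return det3([[sx, 0, 0], [0, sy, 0], [0, 0, sz]]) < 0

print(flips_winding((1, 1, 1)))    # False
print(flips_winding((-1, 1, 1)))   # True  -> mirrored across X
print(flips_winding((-1, -1, 1)))  # False -> two mirrors cancel out
```

Note the last case: an even number of negative scale axes is a rotation, not a mirror, so the winding is preserved.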

3/ Yes. When the render delegate reads the mesh data, its responsibility is to orient the faces as it needs. For example, if you are in OpenGL and you’ve set the winding to CCW, and the mesh is CW, you need to reverse the order of the indices.
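As a minimal sketch of that last step (plain Python, not any renderer’s actual API): for a triangle index buffer, swapping the last two indices of each triangle flips its winding while keeping the same starting vertex.

```python
# Sketch: flipping the winding of a flat triangle index buffer, as a
# renderer might when the mesh's winding doesn't match its configured
# front-face convention (e.g. CCW front faces in OpenGL).

def flip_triangle_winding(indices):
    """Swap the last two indices of every triangle: (a, b, c) -> (a, c, b)."""
    flipped = list(indices)
    for i in range(0, len(flipped), 3):
        flipped[i + 1], flipped[i + 2] = flipped[i + 2], flipped[i + 1]
    return flipped

# Two triangles forming a quad:
print(flip_triangle_winding([0, 1, 2, 2, 3, 0]))  # [0, 2, 1, 2, 0, 3]
```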

In order to progress with your question about texture coordinates, I have an idea.

Over here: is a simple asset for testing texture coordinates. The idea is to take this asset through your pipeline and see if your problem is demonstrated. That will make it easier to diagnose what is going on. If it doesn’t show your problem, then the asset could be modified slightly, perhaps by subdividing the mesh one level to generate more quads, changing the winding, and so on. Another advantage of that asset is that there are a bunch of people in the ASWF wg-usd-assets Slack who would probably jump at the chance to make that asset more useful for testing and diagnosis.

You could trigger it to be active only on export via context options, for example, but I don’t think that’s the way to go, because your live exports need to/should always match 1:1 with what is exported to disk.
