Hi,
I have been asked to implement a feature in our USD/Solaris pipeline at Dneg. Before I go too far with this I thought I should ask for advice here. Maybe somebody has done something similar.
I need to display a rectangle in the viewport with a texture applied. However, I need to do a couple of things with that texture that I don’t think I can do with a preview surface shader.
I need to be able to display data from the texture outside of the crop window.
I need to apply artistic color adjustments available as OCIO “looks”.
After thinking about this for a bit, it seems like what would work is a custom texture sampler that implements the behaviors I’m looking for.
This brings up a few questions:
Is it possible to register a custom texture sampler in a plugin?
The GLSL code for the UsdUVTexture node is intentionally left blank. I assume that some internal code detects the existence of a UsdUVTexture node and wires up a sampler object such that HdGet_diffuseColor does the right thing. I think what I am trying to do is to create a new node, similar to UsdUVTexture, that gets wired up to a sampler implementing the behaviour I need. Is that possible? Do you have any clues on how it might be done?
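For context, the kind of network I mean is the standard preview-surface setup. A minimal sketch (the prim names and plate.exr path are just placeholders), where my understanding is that Storm sees the UsdUVTexture node, allocates a sampler for it, and generates an HdGet_diffuseColor() that reads from that sampler:

```
#usda 1.0

def Material "imagePlaneMat"
{
    token outputs:surface.connect = </imagePlaneMat/previewSurface.outputs:surface>

    def Shader "previewSurface"
    {
        uniform token info:id = "UsdPreviewSurface"
        # Storm generates HdGet_diffuseColor() for this connected input.
        color3f inputs:diffuseColor.connect = </imagePlaneMat/uvTexture.outputs:rgb>
        token outputs:surface
    }

    def Shader "uvTexture"
    {
        uniform token info:id = "UsdUVTexture"
        asset inputs:file = @./plate.exr@
        float2 inputs:st.connect = </imagePlaneMat/stReader.outputs:result>
        float3 outputs:rgb
    }

    def Shader "stReader"
    {
        uniform token info:id = "UsdPrimvarReader_float2"
        token inputs:varname = "st"
        float2 outputs:result
    }
}
```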
I just wanted to make sure that it’s clear that any changes you make involving glsl shaders are only going to apply to Storm, not HoudiniGL… Hopefully this isn’t a problem for what you’re trying to accomplish.
I didn’t really understand the description of what you’re trying to do, but have you looked at using Mtlx to describe the appearance of your rectangle? Storm and Houdini GL both support Mtlx shaders, at least in theory. You mention creating a custom Hydra prim (like hdTri?), which implies you are targeting only your own custom render delegate, in which case I guess you have a lot more options and control…
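In case it helps, here is a rough, untested sketch of what I mean by describing the rectangle’s appearance with a MaterialX network authored directly in USD shade prims (the node ids assume the standard MaterialX library definitions are available, and the file path is just a placeholder):

```
#usda 1.0

def Material "imagePlaneMtlx"
{
    # The mtlx render context output is what gets resolved for a MaterialX network.
    token outputs:mtlx:surface.connect = </imagePlaneMtlx/mtlxSurface.outputs:out>

    def Shader "mtlxSurface"
    {
        uniform token info:id = "ND_standard_surface_surfaceshader"
        color3f inputs:base_color.connect = </imagePlaneMtlx/mtlxImage.outputs:out>
        token outputs:out
    }

    def Shader "mtlxImage"
    {
        uniform token info:id = "ND_image_color3"
        asset inputs:file = @./plate.exr@
        color3f outputs:out
    }
}
```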
When I started investigating this I thought a custom Hydra prim might be a way forward but I realize now that is not really workable.
I always thought that Solaris was using Storm under the hood. Doh!
I think the main thing I need to accomplish is to build a custom texture node. I think a normal texture node will just read the pixels that are inside the crop rectangle stored in the image. In our workflow for camera image planes, artists want to be able to see “overscan” pixels.
I didn’t realize HoudiniGL supported MaterialX. I’ll give that a shot.
Sounds like you have a direction already, and MaterialX is a great thing to try, but just to clarify: you’re correct that hdStorm generates code for UsdUVTexture, as well as the other texture nodes. You can edit sampling parameters on the texture node, and those will be set in the corresponding shader sampler.
I don’t 100% understand your use case, but if you’re just looking to load up an image whose UV space is larger than [0, 1]^2 (e.g. the texture is mapped to [-0.2, 1.2]^2 but [0, 1]^2 represents the “viewport” part), do note that you can transform UV space with the UsdTransform2d node (see the UsdPreviewSurface Specification in the Universal Scene Description 23.08 documentation).
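Something roughly like this, where the scale/translation and wrap values are purely illustrative (it also shows the sampling parameters I mentioned being authored on the texture node):

```
#usda 1.0

def Material "overscanMat"
{
    token outputs:surface.connect = </overscanMat/previewSurface.outputs:surface>

    def Shader "previewSurface"
    {
        uniform token info:id = "UsdPreviewSurface"
        color3f inputs:diffuseColor.connect = </overscanMat/uvTexture.outputs:rgb>
        token outputs:surface
    }

    def Shader "uvTexture"
    {
        uniform token info:id = "UsdUVTexture"
        asset inputs:file = @./plate.exr@
        float2 inputs:st.connect = </overscanMat/stTransform.outputs:result>
        token inputs:wrapS = "black"
        token inputs:wrapT = "black"
        float3 outputs:rgb
    }

    # Remap the 0..1 st primvar so the card sees -0.2..1.2 of texture space.
    def Shader "stTransform"
    {
        uniform token info:id = "UsdTransform2d"
        float2 inputs:in.connect = </overscanMat/stReader.outputs:result>
        float2 inputs:scale = (1.4, 1.4)
        float2 inputs:translation = (-0.2, -0.2)
        float2 outputs:result
    }

    def Shader "stReader"
    {
        uniform token info:id = "UsdPrimvarReader_float2"
        token inputs:varname = "st"
        float2 outputs:result
    }
}
```

UsdTransform2d computes result = in * scale (then rotation, then + translation), so st = 0 maps to -0.2 and st = 1 maps to 1.2 in this sketch.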