Hello,
from the experience we have so far with USD at dneg, we are running into a major issue: getting a breakdown of USD memory use. Things like layer memory, instancing memory, number of instances, number of prototypes in use, etc.
After some time investigating, we cannot find a proper way to know what is taking memory in a scene loaded into a regular 3D package or a third-party renderer. From our experience with two of the main packages, they are only able to report 15-25% of the full memory actually used by the render, completely missing the USD data. This makes any attempt to optimize a scene a real burden.
I looked into the API, the Git repo issues, proposals, and wiki, and I cannot find anything on the subject. So I wonder if I am missing something, whether others have already reported the problem, and whether there is already some work going on for this?
So any information would be greatly appreciated!
cheers
Hi @laurenth, USD can, with some work and caveats, provide fairly granular memory statistics (at a steep performance cost) using the TfMallocTag infrastructure, with which we have instrumented most of the code. Here is a usd-interest thread that discussed how you might be able to get it working on Linux.
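For reference, a minimal sketch of the workflow in Python (assuming a Linux build whose allocator the tag system can hook; the paths are placeholders, and the binding names are from memory, so worth double-checking against your build):

```python
from pxr import Tf, Usd

# Tagging must be enabled before the allocations you want attributed,
# i.e. before the stage is opened.
Tf.MallocTag.Initialize()
if not Tf.MallocTag.IsInitialized():
    raise RuntimeError("MallocTag could not hook the active malloc; "
                       "stats will not be collected")

stage = Usd.Stage.Open("/path/to/shot.usda")  # placeholder path

totalMB = Tf.MallocTag.GetTotalBytes() / (1024.0 * 1024.0)
print("Tagged allocations after stage open: %.1f MB" % totalMB)

# Write the per-tag call-tree breakdown to a file for offline inspection.
Tf.MallocTag.GetCallTree().Report("/tmp/shot.mallocTag")  # placeholder path
```

usdview's --memstats option drives the same machinery, which can be handy for comparison.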
We have, at various points, used this offline to profile the memory usage of a shot stage in its default state, and put some summary stats in layer metadata on the stage, but mostly it’s used for debugging on very large shots.
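As a rough illustration of the “summary stats in layer metadata” part (not our exact pipeline code; the key names are made up), something along these lines stores the numbers in customLayerData on the root layer:

```python
from pxr import Sdf, Tf

layer = Sdf.Layer.FindOrOpen("/path/to/shot.usda")  # placeholder path

# customLayerData round-trips as a dictionary; stash the profiled numbers
# under a made-up key so downstream tools can read them without re-profiling.
data = dict(layer.customLayerData)
data["memoryStats"] = {
    "taggedBytes": Tf.MallocTag.GetTotalBytes(),
}
layer.customLayerData = data
layer.Save()
```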
One caveat is that TBB’s memory consumption is not captured by MallocTags, and we do use TBB containers in several places.
I think a very interesting line of investigation would be to use ML to derive an estimator for USD memory consumption, based on some core stats like number of prims, number of layers, and perhaps depth of hierarchy. Arun Rao did something like this for our previous scenegraph, TidScene, and it was used by our renderer to estimate TidScene’s memory use in production.
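To make that concrete, a toy version of such an estimator (all features and numbers invented, nothing from production) could start as a simple least-squares fit against measured malloc-tag totals:

```python
import numpy as np

# Each row: (prim count, layer count, max hierarchy depth) for a profiled stage,
# paired with the tagged bytes measured via TfMallocTag.  All numbers invented.
features = np.array([
    [120000.0, 340.0, 18.0],
    [450000.0, 910.0, 22.0],
    [80000.0,  150.0, 12.0],
    [300000.0, 620.0, 20.0],
])
measuredBytes = np.array([2.1e9, 7.8e9, 1.3e9, 5.2e9])

# Least-squares fit with an intercept term.
X = np.hstack([features, np.ones((len(features), 1))])
coeffs, *_ = np.linalg.lstsq(X, measuredBytes, rcond=None)

def estimateBytes(numPrims, numLayers, depth):
    """Predict memory for a stage we have not profiled."""
    return float(np.array([numPrims, numLayers, depth, 1.0]) @ coeffs)

print("Estimated: %.1f GB" % (estimateBytes(200000, 500, 19) / 1e9))
```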