Renderman

From cgwiki


Volume Density for Emission

Prman density for emit.png

File:prman_vdb_density_as_emit.hipnc

I'd seen a few people get great results by mapping density directly into emission for volumes. Tests inside Mantra looked great, so I pushed that workflow to the studio. I then totally forgot we'd have to match it in Renderman for the lighting students using Katana; here are the results of that test.

The high level idea is to take the incoming density (using a PxrPrimvar node, NOT a default Houdini bind!), and remap it twice: once to sit in a nice 0-1 range for density, and again into values that correspond to blackbody temperatures (roughly 1000 to 6000 kelvin). The second result is fed to the pxr blackbody node, and then to emission.
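The two remaps behave like Houdini's fit(). Here's a throwaway Python sketch of the idea; the 0-5 source range and the example density are assumptions, tune them to your volume:

```python
def fit(x, omin, omax, nmin, nmax):
    """Houdini-style fit: clamp x to [omin, omax], remap into [nmin, nmax]."""
    x = max(omin, min(omax, x))
    return nmin + (x - omin) / (omax - omin) * (nmax - nmin)

raw_density = 2.5                           # example incoming density value
d = fit(raw_density, 0.0, 5.0, 0.0, 1.0)    # remap 1: density into a 0-1 range
kelvin = fit(d, 0.0, 1.0, 1000.0, 6000.0)   # remap 2: 0-1 into blackbody kelvin
```

The kelvin result is what gets plugged into the blackbody node.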

Unfortunately this made all the low-density areas of the volume glow red; the blackbody node doesn't seem to want to go down to pure black. I figured if I could multiply the emission values against density, that would clip it back and look correct.

To my surprise there are no pxr mult, pxr add etc. nodes. As far as I can tell, you're meant to use SeExpr for this sort of thing. The node was easy enough to make, but I spent 45 minutes shouting at it when it didn't do what I wanted. In the end the answer was embarrassingly simple: don't use semicolons, and the final line of your expression becomes the assigned value. So here, with the remapped density and remapped emission plugged into the inputs, the expression is simply

colorInput1 * colorInput2
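To sanity-check what that multiply achieves, here's a hedged sketch in Python (not SeExpr; the names are made up): the blackbody colour never quite reaches black, but density does, so the product goes to black wherever density does.

```python
def clipped_emission(blackbody_rgb, density):
    # density is assumed to already be remapped into a 0-1 range
    return tuple(c * density for c in blackbody_rgb)

zero = clipped_emission((0.8, 0.1, 0.0), 0.0)   # near-empty voxel -> black
half = clipped_emission((1.0, 0.5, 0.25), 0.5)  # mid density -> dimmed colour
```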

A nice feature of this setup is there's an exposure multiplier on the blackbody node, so it's very easy to dial the look up and down. And my word, it's fast, and does full GI emission against other objects with no extra effort. And this is prman21, can't wait to try prman22...

Prman emit exposure adjust.gif

Hair and rixlate

Pigfurprman.JPG
600,000 hairs, PxrHairColor and PxrMarschnerHair, 3 seconds to export the geo to renderman, 12 seconds to render at 1280x720

Rixlate? An anti dandruff treatment? Maybe...

The renderman docs are pretty clear about how to render curves, hair and fur. Create an Attribute Rename SOP, switch to the renderman tab I'd never noticed before, rename width to width (yes, to itself), and set the mode to 'varying float'. This tells Houdini to export the width attribute to Renderman, and to make sure it understands the width can change along the primitive length.

I did this, then applied an Attribute Delete SOP to tidy up the stuff I didn't need. To my surprise the renders went haywire, with hair curves now the thickness of tree trunks. What happened?

I'd kept the width attribute, but had also deleted an easily missed detail attribute, rixlate. DON'T DELETE THIS! Turns out that's the tag renderman uses to know what attributes to export and in what format. I lost a week to that. Now you don't have to.

Hair random colour and hair id

Fur rand wipe2.gif

The docs for the hair color shading node describe some nice randomization features, which rely on having a per-hair @id. The Houdini hair and fur tools generate this for free, but it wouldn't work in my render tests.

After trying a few things I realised that @id has to be promoted from prims to points, and then exported using the same method as @width, shown above.
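What that promote is doing, sketched in plain Python (illustrative dicts standing in for geometry, not the hou API):

```python
# Promoting a per-primitive id down to points, as Attribute Promote does.
# The data below is made up for illustration: two hairs of three points each.
prim_id = {0: 101, 1: 102}                   # hair primitive -> its @id
prim_points = {0: [0, 1, 2], 1: [3, 4, 5]}   # hair primitive -> its point numbers

point_id = {}
for prim, pts in prim_points.items():
    for pt in pts:
        point_id[pt] = prim_id[prim]         # each point inherits its hair's id
```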

RIS render quality

Ris settings.gif

The docs point to pixel variance threshold and min/max samples as the way to drive quality, matching how most pathtracers do their thing these days. The variance number was easy to find under the samples tab of the RIS ROP, but I couldn't see the min/max samples anywhere. I only found them via a screenshot in the renderman-for-houdini docs talking about something else; they're under the hider tab.

The same screenshot gave a sense of what 'production' settings might be (the defaults are min -1 and max 0). Min 16 / max 1024 makes fur look pretty good; a HD frame takes about 20 mins to render, which feels reasonable.