Some remarks on scale and hair

hair_test_01
Where ‘scale’ here stands for the relative size of things rather than the plates that make up the skin of a dragon.

During work and planning for my short ‘Ara’s Tale’ I explored some of the inner workings of Blender’s hair implementation with regard to the actual size of models in combination with animation. What seemed rather simple in the beginning turned out to be quite tricky to apply to my needs.

But first things first.

I remembered from my previous projects that the old hair system used pixels as the measurement for the width of single hair strands. This had bad effects when changing the render resolution or moving the object farther away, as the strands kept their thickness in pixels and therefore looked thicker at lower resolutions.

The new hair system came with the option to define the strand thickness in Blender units, which seems like a very good idea, as it is now independent of the actual resolution at which the hair gets rendered.

The lowest value to use is 0.001 Blender units. In Ara’s Tale I have a ratio of ~1:2, i.e. 2 Blender units correspond to 1 meter. It should have been obvious that this would not give the desired results, as one hair is still 0.5 mm thick in this scenario.
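
For those who like to set this from a script: a minimal sketch of switching the strand size to Blender units, assuming the new 2.5 bpy Python API and the Blender Internal strand settings (the material name is made up), could look like this:

import bpy

mat = bpy.data.materials["AraHair"]   # hypothetical hair material name

mat.strand.use_blender_units = True   # interpret root/tip size as Blender units instead of pixels
mat.strand.root_size = 0.001          # lowest usable value, as mentioned above
mat.strand.tip_size = 0.001

# At a ratio of 1:2 (2 Blender units per meter) this is still a 0.5 mm thick hair:
meters_per_bu = 1.0 / 2.0
print(mat.strand.root_size * meters_per_bu * 1000.0, "mm")  # prints: 0.5 mm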

See here for a render with these settings.

Thick hair

Well, the obvious idea is to increase the scale, let’s say by a factor of 10, which makes a hair 0.05 mm thick and therefore more correct.

See here for a render with these settings.

hair_bu_01

This closeup looks better of course, but there is no headroom for lowering the thickness further if necessary, so it might be better to increase the scale by another factor of 10, arriving at a ratio of 1:200, meaning that 200 Blender units represent 1 meter.
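
To keep track of those numbers, a tiny helper in plain Python (no Blender needed) checks the physical thickness for the ratios and strand sizes used here:

def strand_thickness_mm(size_bu, bu_per_meter):
    # physical strand thickness in millimeters for a given scene scale
    return size_bu / bu_per_meter * 1000.0

print(strand_thickness_mm(0.001, 2.0))    # 1:2   -> 0.5 mm, far too thick
print(strand_thickness_mm(0.001, 20.0))   # 1:20  -> 0.05 mm, about right, but no headroom
print(strand_thickness_mm(0.008, 200.0))  # 1:200 -> 0.04 mm, with room to go thinner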

See here for a render with these settings, where the hair thickness is set to 0.008 Blender units.

hair_bu_02

This immense scale gave me an uneasy feeling. I knew there are limits in the other direction, as a myriad of parameters all correlating to size have maximum values. What if I ran into a wall there? How does the whole system react to such a huge scale?

Before getting too much of a headache I did some test renders where the hair takes up only a fraction of the available frame, which gave me something to think about again.

hair_bu_03

As you see, the size is taken into account correctly, but with the effect that the hair starts to appear too thin and ‘disconnected/pixelated’ when viewed from far away.

Well, back to specifying the hair thickness in pixels then.
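
Scripted, that switch back is the same material setting again, this time interpreted as pixels (still assuming the 2.5 bpy API and a made-up material name):

import bpy

mat = bpy.data.materials["AraHair"]   # hypothetical hair material name

mat.strand.use_blender_units = False  # root/tip size are measured in screen pixels again
mat.strand.root_size = 0.4            # the value used for the closeup below
mat.strand.tip_size = 0.4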

See here for a closeup (now with scale ratio 1:2) and the hair thickness set to 0.4 pixels. (The original render was at 1920×1080, scaled down for better viewing.)

hair_px_01

That is fine so far. Now let’s go farther away to see the pixel effect.

hair_px_02

It’s not so bad after all, but the hair now looks a little bit too full and thick. This is where I started experimenting with the strand simplification settings. I observed that it removes strands as the object gets smaller in the render and also seems to render the remaining strands a little thicker the more strands are removed.
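
For reference, the simplification settings live on the particle settings; a sketch of turning them on from a script, again assuming the 2.5 bpy API (object and particle system names are made up, and the values are just a starting point for experimenting):

import bpy

psys = bpy.data.objects["Ara"].particle_systems["HairMain"]
pset = psys.settings

pset.use_simplify = True        # remove child strands as the object gets smaller on screen
pset.simplify_refsize = 1920    # reference size in pixels before simplification kicks in
pset.simplify_rate = 1.0        # how quickly strands are removed
pset.simplify_transition = 0.1  # transition period while strands fade out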

See here now for a new render, this time with strand simplification turned on.

hair_px_03

Now this looks quite OK … so far.

I then asked myself how smooth the effect of removing strands is during animation. I have a shot where I start from an extreme closeup on Ara’s face and move the camera away until Ara is only a quarter of the frame’s height. Any discontinuities in the hair should be immediately visible.

See here for two quick test animations simulating the shot, the first one rendered normally, the other one with simplification turned on.

hair animation test 01 from loramel on Vimeo.

hair animation test 02 from loramel on Vimeo.

And just as feared, the simplification is plainly visible at the frame boundaries 0.14/0.15 and 0.20/0.21 on the left side and 0.22/0.23, 1.03/1.04, 1.09/1.10 and 1.21/1.22 on the right side, with the last one extremely obvious. What is strange here, though, is that the simplification not only gradually removes strands but also switches the child density of single parent strands.

I did a last test and added some wind to get the hair moving, hoping to cover up the all-too-obvious strand removals. See below for the same shots, now with wind.

hair animation test 03 from loramel on Vimeo.

hair animation test 04 from loramel on Vimeo.
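
The wind itself is just a regular wind force field; for completeness, a sketch of adding one via script, assuming the 2.5 bpy API (strength and noise values here are placeholders, not the ones I ended up using):

import bpy

bpy.ops.object.effector_add(type='WIND', location=(0.0, -5.0, 1.0))
wind = bpy.context.object

wind.field.strength = 1.0  # overall wind force
wind.field.noise = 0.5     # some variation so the hair does not move uniformly
wind.field.flow = 0.2      # adds a bit of air-drag behaviour to the field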

It seems the wind helps, and I will have to wait until the animation is all done to decide which option is viable for my purpose.

—–

I had a similarly extensive research effort going on regarding cloth simulation on linked assets and may prepare another post to share my findings on that topic.

Author: loramel


8 thoughts on “Some remarks on scale and hair”

  1. Wow, lots to learn about Blender’s hair system… I didn’t even know it changed due to scale and camera positioning 🙁 No wonder I was fiddling around with this like an idiot! At least I got a hedgehog’s fakir-fur look right accidentally. Good I didn’t try any animation! Many thanks for the insights, always a good read!

  2. But luckily in 2.5 basically everything will be animatable, so it should be possible to compensate by animating parameters according to distance/scale; that could even be integrated… Hmm…

  3. I contacted Jahka because of this issue, and he said that’d be Brecht’s job – so I’ll pass it on to him 🙂
