Render Times at 16-bit or Higher (3D Models)

edited December 2017 in Pro Support

Let me start by saying I know all the factors that cause slow rendering.

This question has to do with color depth in the new Hitfilm Pro.

In past versions (3, 4, and 2017), changing the color depth to 16- or even 32-bit would only increase render times by about 10%. Now I have noticed that if I try to render anything above 8-bit, the render times increase massively: a 9.21-second clip (no motion blur) goes from 5 minutes at 8-bit to 6.54 hours at 16-bit, for example.
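To put that jump in perspective, here is a quick back-of-the-envelope check using the times quoted above (the times themselves are from the post; the calculation is mine):

```python
# Compare the quoted 8-bit and 16-bit render times for the same clip.
render_8bit_s = 5 * 60        # 5 minutes at 8-bit, in seconds
render_16bit_s = 6.54 * 3600  # 6.54 hours at 16-bit, in seconds

slowdown = render_16bit_s / render_8bit_s
print(f"{slowdown:.0f}x slower")  # roughly 78x -- far beyond the ~10% seen in older versions
```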

I'm working only with 3D models (no video footage) and a background. The model texture maps are 24-bit 4K, and the background image is 24-bit 2K.

I would love to share the project file, but I can't because it contains a Jetstrike model and textures.

Aside: I sent the project file to support, and they (Axel) returned a verdict that motion blur was causing crashes during compositing, which I knew had caused problems in the past. I'm fairly sure I leave it off while working and only turn it on when rendering, though. But I'll keep note of it in the future.

This is a side note pertaining to 16-bit and crashes while working with split views. I have another project, containing Krammer's Star Wars pack (corridor and R2D2), that I built to test the new spherical and planar reflections; it crashes if I try to split the views while in 16-bit or higher. I get the same issue when working with the Jetstrike models.

So... question: is anyone else experiencing any issues like those I have described above?

EDIT: The additional effects are levels, curves, blur, and letterbox. All of them are disabled, and they have no effect on the render time at 16-bit+. Everything points to motion blur being the culprit in the Jetstrike render.


  • I've never actually looked at that, since I've used 16-bit so little. Mostly with the particle sim, using particles with a very low alpha and building up the image with a crud load of particles.

    I did try one project: 1080p30, 3D models, particle sim on the models, and motion blur, plus some fractal noise with a sphere for planets.

    I rendered to two different formats, one of which (Cineform) can store more than 8-bit.

    The first two renders are 8-bit; the second two are 16-bit. Noticing a little difference in the MP4/AVC export, I ran those a second time: the 5th is 8-bit and the last is 16-bit.

    So there is a bit of variation in the MP4/AVC output; maybe some breathing in and out of the AVC encoder. It's almost like a step function, and it seems to be independent of 8/16-bit. I would need a large number of runs to see if that variation is repeatable.

    The result of this test run is that 8-bit and 16-bit are statistically a wash.

    4GHz i7 4770K, 16GB RAM, GTX 1080 8GB (boost clock ~2GHz)

    One thing we can say is that a 16-bit project going to an 8-bit encoder has to convert frame data from 16-bit float to 8-bit integer for the encoder. This work is fly poop relative to the other business this project is doing.
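    As a rough illustration of how small that conversion step is, here's a minimal sketch (my own illustration, not HitFilm's actual code) of mapping a normalized float sample to an 8-bit integer, and how many such conversions a single 1080p frame involves:

    ```python
    def float_to_8bit(value):
        """Map a normalized float sample in [0.0, 1.0] to an 8-bit integer 0..255."""
        clamped = min(max(value, 0.0), 1.0)  # guard against out-of-range float values
        return int(round(clamped * 255))

    # One 1080p RGB frame is about 6.2 million of these trivial conversions --
    # cheap compared to actually rendering the frame.
    samples_per_frame = 1920 * 1080 * 3
    print(float_to_8bit(0.5), samples_per_frame)  # 128 6220800
    ```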

  • Um. Same machine, basically, except I have 64GB RAM and a GTX 1070 (8GB). Not hip on the boost clock.

    I guess I'm going to have to back up a little bit and do some troubleshooting. Thanks for the test/reply, sir.

  • One more thing, if I may, @NormanPNC.

    Nvidia driver: I see I actually updated the driver to version 388.43 (11-30-2017).

    Is your driver older, by chance?

  • 388.13 

    I usually update about every other driver version.
