Is my computer still valid, or a Flintstones fossil?

When I built the machine it was extremely fast. What's the jury opinion today?

Windows 10 64bit
ASUS Z79c motherboard
ASUS GeForce GTX 760 video card
SSD main drive
SSD drive for my work files, cache & database 

A big case for good airflow
A fridge and a snack shelf, fully stocked for excellent nutrition

When I edit, I use a clean, separate user account -- highly advised.  It did speed things up.
Where's the bottleneck?   4K footage is only a dream for me until Hitfilm gives us proxies.


  • What's your CPU? 

    Drive model #'s?

    I give you my honest opinion as always.

    Hitfilm does have a basic proxy workflow; it was renamed to pre-render.

    However, that should be reserved for effects, not strictly playback, if it can be avoided.

  • @Davlon Looks pretty good all around, except maybe the GPU. But as soon as you upgrade, here comes newer tech that jumps higher and runs farther. When you do pull the trigger, get as much beef as you can afford.

  • Hard to say without knowing the CPU. It depends on what you're doing in Hitfilm. 4K will struggle on a GPU with 2 GB, but otherwise it should work.

  • "Hitfilm does have a basic proxy workflow, it was renamed to pre-render. "

    Disagree. Hitfilm pre-renders are not at all like a proxy as the term is commonly used.

    Proxies are typically lower-rez and/or lower-quality versions of a source file, the lower rez/quality providing for lower edit overhead.

    Hitfilm pre-renders are lossless, full rez files. While they will likely decode fast they have a very high I/O burden due to the lossless compression used.

    As for lower-rez edit overhead, we can easily get some of that simply by setting the viewer to Half. The timeline then runs at half rez, but decode is still full rez. Cineform is an exception here: the nature of Cineform allows for faster decode to a lower rez, like half. Still, full-rez 4K Cineform can have an I/O burden above your storage's abilities, depending on circumstance of course.
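    To put numbers on that half-rez saving, a quick sketch (plain arithmetic, nothing Hitfilm-specific):

```python
# Rough arithmetic behind setting the viewer to Half: the timeline renders
# a quarter of the pixels, though (Cineform aside) decode still touches all.

def pixels(width: int, height: int) -> int:
    return width * height

full = pixels(3840, 2160)   # UHD "4K" frame
half = pixels(1920, 1080)   # viewer set to Half

print(full // half)  # 4 -- half rez in each dimension = 1/4 the pixels
```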

  • I forgot the processor -- full specs:

    Windows 10 64bit
    i7 on an ASUS Z79c motherboard
    ASUS GeForce GTX 760 video card
    32 GB RAM
    SSD main drive
    SSD drive for my work files, cache & database 

  • I forgot the processor -- full specs: ... i7 on an ASUS Z79c motherboard

    i7 does not say much. It is like saying I have a GTX video card. The actual model number is what matters.

    I think you mean your MB is a Z97c and not Z79c.

  •  It does tell us that it's an i7 4770/4770K/4790/4790K though. Any of these is good, and if you got a K version then you should definitely overclock, as that motherboard has great VRMs.

    As stated earlier, your graphics card could use an update, but only if it no longer does what you need it to. There were 4 GB versions as well, and even if yours is a 2 GB card it's still plenty fast for HitFilm. Lots of people use integrated graphics in HitFilm, as we can tell by the number of threads about them, so upgrade if you feel you need to, not because you must -- it's still sitting way above the minimum requirements for 1080 and likely 4K if it's a 4 GB card.

  • I have similar spec but with i7 3770K (not overclocked).
    It struggles at times, but I just needed to get a little more clever with the quality settings and think about my workflow a little more: completing a certain 'layer' of a scene, rendering it, and then bringing it back in as an mp4 for more work. Hope that makes sense.

  • Okay, here it is again -- my bad for the omissions:

    Windows 10 64bit

    i7 on an ASUS Z97c motherboard   
    Intel Core i7-4770K Quad-Core Desktop Processor (3.5 GHz, 8 MB cache, Intel HD graphics)
    (not overclocked -- Norman, would this help?)

    ASUS GeForce GTX 760 video card
    ASUS GTX760-DC2OC-2GD5 GeForce GTX760 2GB GDDR5 256-bit
    DVI-I/DVI-D/ HDMI/DP PCI-Express 3.0 SLI ready Graphic Card OC-selected 1072 MHz core

    32 GB RAM
    Kingston HyperX FURY 16GB Kit (2x8GB) 1600MHz DDR3 CL10 DIMM
    SSD main drive
    (SanDisk Something)

    SSD drive for my work files, cache & database 
    (SanDisk again)

  • That's a better system than mine (better CPU and I don't use SSD for everything) and I manage.
    I don't use 4K and my projects are small. I tend to do 20-30 second scenes and then gradually stitch them all together to make a 4-5 min video.

  • "Intel Core i7-4770K Quad-Core Desktop Processor"

    That is the exact processor I had for years until I built my current machine about 7 months ago (9900K CPU). I just moved my GPU, a GTX 1080, to the new machine.

    It is a good CPU, even today. I did have mine overclocked. Sort of. The CPU has a max turbo clock of 3.9 GHz with a single core active, and 3.7 GHz max turbo with 4 cores active. What I initially did was just remove the whole turbo thing and run the CPU at a flat clock rate of 3.9 GHz. 4.0 is a nice round number, so I shortly just bumped it to that. A pretty minor 100 MHz overclock, since 3.9 was certified. Many cores at high(er) clock is about keeping the CPU cool. I had a very good 120mm tower CPU cooler, so heat was not an issue for what I was doing.

    I would say your GPU is on the weaker side of things. The 2 GB of GPU RAM is probably limiting.

    For 4K work I would say an 8-core CPU is desired, for CPU decoding of video media. If the media is 8-bit 4:2:0 AVC, then the hardware decode in Hitfilm removes a lot of the need/desire for a lot of cores, on the media input side of things.

    Here is a very old thread I posted on a quick and dirty test of 4K performance on my old 4770K machine with Hitfilm 2017. I did a harsh composite test: 4 simultaneous streams.

    Some notes on changes since that video. Current Hitfilm is slower than the version tested, Hitfilm 2017 (aka version 5) -- certainly for Cineform and maybe ProRes. Last I tested this was version 12/12.1. With AVC 8-bit 4:2:0 media, Hitfilm performs dramatically better due to the hardware decode (version ?). AVC HW decode gives the best timeline performance of any codec.

    Other differences: your SSD work drive eliminates the I/O bandwidth issues I had in my test (spindle HD). The spindle HD is why I forced the media into disk cache. With my 16GB system and 10GB of data, I had to take special measures to get the dataset cached. On that note, your 32GB RAM makes it *much* easier for a group of media files to be cached. Hitfilm really, really does not use much RAM, but having excess RAM allows the OS to cache the data, removing potential I/O bottlenecks.
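    The caching headroom is simple arithmetic. A tiny sketch (the ~8 GB reserved for Windows and running apps is my assumption, not a measured figure):

```python
# Back-of-envelope: can the OS keep the whole media set in its file cache?
# Numbers from the post: 10 GB of media, 16 GB vs 32 GB total RAM.

def fits_in_cache(media_gb, total_ram_gb, reserved_gb=8):
    """True if the media set fits in RAM left over after the OS and apps.
    The reserved_gb default is an assumed figure, not a measurement."""
    return media_gb <= total_ram_gb - reserved_gb

print(fits_in_cache(10, 16))  # False -- tight; needed "special measures"
print(fits_in_cache(10, 32))  # True -- plenty of headroom
```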


    So I would say your machine is still relevant. SSD I/O: very good. 32 GB RAM: very good. CPU: pretty good, depending on projects. A mild overclock, which is easy to do, can't hurt. A better GPU might be in order. As usual, it always depends on *exactly* what you are doing.

    As for the CPU overclock: it won't give you a seat-of-your-pants speed feeling, but it is measurable. No need to mess with manual voltage changes; leave the CPU in dynamic/variable voltage. Actually, you can offset it negative, since the Intel numbers are conservative (unless you lost the silicon lottery and your chip is a stinker). Just have a decent 120mm tower cooler. I played with this very briefly on my 4770K just before I built my current machine. I did not try pushing my CPU to very high clocks; rather, I was pushing the voltage lower (negative offset) at the same 4 GHz clock.
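    To put a number on it, a quick sketch using the turbo clocks mentioned above:

```python
# Stock 4770K all-core turbo vs the flat 4.0 GHz overclock described above.
stock_all_core = 3.7   # GHz, max turbo with 4 cores active
flat_overclock = 4.0   # GHz, flat clock with turbo removed

gain_pct = (flat_overclock / stock_all_core - 1) * 100
print(f"{gain_pct:.0f}% more all-core clock")  # ~8% -- measurable, not dramatic
```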


    Remember if you buy a $400+ GPU and then later decide you want a new MB+CPU combo that GPU easily moves to the new machine.

  • @NormanPCN

    Thank you for your excellent help. We're all grateful, and it brings the forum tremendous benefit.

    I hadn't planned on buying a new GPU -- let's see what overclocking can do.

    Can I rely on the Intel fan that came with the chip, or do I need a better heat sink?

  • @Davlon "Can I rely on the Intel fan that came with the chip, or do I need a better heat sink?"

    Wow, I have no idea. I would prefer to talk from experience or what I read from reputable sources. The tech community really does not care about "stock" CPU coolers. Frankly because they have always been pretty weak. The only exception is the recent AMD Prism stock cooler which from what I read is not too shabby...for a stock cooler. Frankly you can get a decent 120mm tower cooler for $30. Something like a Coolermaster Hyper 212. The mount is not as clean/easy/solid as something like a Noctua. I had a Coolermaster 212 on my machine when I had an i7 860. My 4770k had/has a Thermalright true spirit 120 with a Noctua fan. I have since become something of a Noctua fanboy. Frankly a tower cooler will cool better and likely be quieter as well.

    Wow, that was a bit wordy, even for me. About your stock cooler and overclocking: it takes zero effort to give it a go, so do it. Just realize that with a better cooler you can get much more performance out of your CPU. But it costs nothing to try with your stock cooler.

    To get your feet wet and make things as easy as possible to test some overclocking, I would suggest you DL and install the Intel XTU (Extreme Tuning Utility). No need to mess with the BIOS for basics. XTU is a Windows program where you can set up an OC without all the rebooting to BIOS and back. The BIOS can be more cryptic but also gives more control. I played with XTU a little on my 4770K machine; my MB was an ASUS Z87 Deluxe. Once you have basic settings ironed out, you can uninstall XTU and then move those settings into the BIOS. The BIOS may have some VRM settings which could be useful. Honestly, with a stock cooler you are going to want to control heat, and therefore control voltage. As I said, I would stick to dynamic voltage control and figure out how much *negative* offset you can get away with. XTU can properly monitor CPU temps. A good utility to have for such things is HWINFO64. It does not even need to be installed; a portable version in ZIP form exists. I use the portable version. I have a utils folder with a lot of portable (no-install) utilities.

    As I said, I had (heck, I still have) a 4770K system, and I played with XTU on that system. With a GPU it should still boot -- or just use the integrated GPU in the 4770K. Heck, I still have a GTX 980 on the shelf collecting dust. I should be able to help, or re-aim you in the right (or at least a better) direction, if you have a problem trying things on your own.

    Just remember what the CPU can do for you in Hitfilm. It only helps video media decode and export/encode. Overclock can help part(s) of the particle simulator. The CPU will not help effects and compositing and such. That is much of the GPU's job in Hitfilm. With media decode, if your media is AVC 8-bit 4:2:0 then hardware AVC decode takes the CPU out of much of the decode process -- as long as Hitfilm supports your GPU for HW decode, which is likely with a GTX 760.
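    If you want to check whether your footage fits that profile, a small hypothetical Python helper (the function name is mine; the stream info would come from a tool like ffprobe):

```python
# Hypothetical helper (not part of HitFilm or ffprobe) to sanity-check whether
# footage matches the "8-bit 4:2:0 AVC" profile that GPU hardware decode
# handles. Codec/pixel-format strings follow ffprobe naming conventions.

def is_hw_decode_friendly(codec_name: str, pix_fmt: str) -> bool:
    """True if the stream is AVC (H.264) in 8-bit 4:2:0, the combination
    discussed above as the safe case for hardware decode."""
    return codec_name == "h264" and pix_fmt in ("yuv420p", "yuvj420p")

# A typical consumer-camera MP4 qualifies...
print(is_hw_decode_friendly("h264", "yuv420p"))       # True
# ...but HEVC or 10-bit 4:2:2 AVC does not.
print(is_hw_decode_friendly("hevc", "yuv420p"))       # False
print(is_hw_decode_friendly("h264", "yuv422p10le"))   # False
```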

  • Okay -- if I'm buying a new graphics card, what's the best card for me? The GTX 760 cost me $280 in 2014. Given my mobo and setup, which GPU is an excellent value today?

    Maybe a card with 6GB?  8GB cards are way high.
    (Wait a minute -- are these prices crazy high because people are buying them for bitcoin whatever?)

    I'm mostly concerned with increased renders and compression.  I'm not doing much SFX work.  But I do get into a bottleneck when I'm taping a day of classes at a 3-day event, and I need to edit a piece, render and upload quickly.  It's extremely uncomfortable with my partner (working on one of those round Macs in Final Cut) finishing his videos and uploading while my machine is still working.

  • @Davlon Don't worry, it is. PCI Express (the card slot) is quite generic; effectively everything works with everything the same.

    "I'm mostly concerned with increased renders and compression. I'm not doing much SFX work."

    This statement in your previous post makes me want to enquire more. By renders I assume you mean export. A GPU does not change export speed independent of anything else. The GPU affects the performance of Hitfilm effects and compositing; the GPU is not used for file encoding, AVC or Cineform. The CPU does that specific work. This is not to say that a faster GPU cannot affect final encode speed, because effects and compositing are of course done during final encodes. Those are always done, for both playback and encode/export.
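    A toy way to picture it: export runs as a decode -> effects -> encode pipeline, and throughput is set by the slowest stage. Illustrative numbers only, not benchmarks:

```python
# Sketch of why a faster GPU only speeds up export when effects/compositing
# are the bottleneck: overall throughput is limited by the slowest stage.

def export_fps(decode_fps: float, effects_fps: float, encode_fps: float) -> float:
    """Overall frames/sec for a decode -> effects -> encode pipeline."""
    return min(decode_fps, effects_fps, encode_fps)

# Assumed illustrative stage rates:
print(export_fps(120, 30, 60))  # 30 -- GPU-bound, a faster GPU helps
print(export_fps(120, 90, 60))  # 60 -- encode-bound, a faster GPU changes nothing
```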

    Let's consider my previous post this thread.

    "Just remember what the CPU can do for you in Hitfilm. It only helps video media decode and export/encode. Overclock can help part(s) of the particle simulator. The CPU will not help effects and compositing and such. That is much of the GPUs job in Hitfilm. With media decode, if your media is AVC 8-bit 4:2:0 then hardware AVC decode takes the CPU out of much of the decode process."

    So "increased renders" I take to mean export speed. I do not understand what you mean by "compression".


  • I got a 1660 Super this Christmas and it's really powerful for the price. In the current market it bridges the gap between low end and mid range. Mid-range cards start at the RTX 2060/S, Vega 56, and 5600 XT, going up to the 2070/2070S, with high end at 2080+.

    The difference between 6 and 8 GB of VRAM could be noticeable in HitFilm if you work with heavy effects or models. VRAM is fastest; once that's out it goes to RAM, and once that's out it goes to disk. The more you can fit in VRAM, the faster it'll render, which is just one of many factors for VFX processing.
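    A rough sketch of why that tiering matters (the bandwidth figures are assumed ballpark numbers, not benchmarks):

```python
# Toy model of the VRAM -> RAM -> disk fallback: each tier is roughly an
# order of magnitude slower to feed the GPU. Numbers are illustrative.

BANDWIDTH_GBPS = {
    "vram": 300.0,        # GDDR-class memory on a mid-range card (assumed)
    "system_ram": 12.0,   # over PCIe 3.0 x16 (assumed)
    "ssd": 2.0,           # NVMe-ish; SATA SSDs are slower still (assumed)
}

def transfer_time_ms(size_gb: float, tier: str) -> float:
    """Milliseconds to move size_gb of data from the given tier."""
    return size_gb / BANDWIDTH_GBPS[tier] * 1000.0

for tier in ("vram", "system_ram", "ssd"):
    print(f"{tier}: {transfer_time_ms(1.0, tier):.1f} ms per GB")
```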

  • Okay -- let's try this:

    1. What's the best Hitfilm graphics card in the $300-$400 price range?

    2.  What's the best Hitfilm graphics card in the $400-$600 price range?

    3.  What are the features to look for?  @NormanPCN mentioned AVC decode.
    What other features make a great Hitfilm card?

    Also, is Hitfilm optimized (does it support) a dual-CPU graphics card?

  • Nope.  No dual card support for HF currently.

  • 1. RX 5700 XT

    2. RTX 2070 Super

    3. Norman will correct me if I'm wrong, but HW decode is about it. You just want a fast gaming GPU with lots of VRAM.

    "Also, is Hitfilm optimized (does it support) a dual-CPU graphics card?"

    I'll elaborate on @Stargazer54's response by adding: a single card with 2 GPUs would still operate via CrossFire or SLI, and that's what HitFilm doesn't support.

  • @NormanPCN

    I found this page/video in which the reviewer describes Vegas rendering issues.

    Can you give us your expert analysis?

  • That's probably a discussion for a new thread; Vegas is not HitFilm. @Davlon

    I answered your questions. Are you looking to upgrade from your GTX 760?

  • Yes, an upgrade. At present I'm looking at


    It's the same chipset.  Why does the ASUS need an extra fan?   My case can handle it, but it will mean major detangling and rewiring.

    "3.  What are the features to look for?  @NormanPCN mentioned AVC decode.
    What other features make a great Hitfilm card?"

    What makes a great card? Big and bad and fast, and of course that means $$$.

    If you say "I do X and I want just enough GPU to have that work in smooth real-time playback" (assuming the timeline is smooth before the effect(s)), I can't answer such a 'just enough' question.

    An Nvidia card may be a better choice, WRT brand, if only because FxHome has much more experience with the Nvidia platform/driver.

    AVC decode: your old 760 has this, as per the chart fragment I showed you. Here is the link to that full chart for reference. A newer card will have more codec support and maybe a faster AVC implementation.

    Newer cards will have HEVC decode support. You may not have a camera that generates HEVC right now, but soon you might; a phone probably already does. Hitfilm does not support HEVC, but they have stated they want to.

    I don't know the situation of the royalty fees WRT the hardware HEVC decoder, but it is very likely the cheapest way to support HEVC. And honestly, you really do not want to do software HEVC decode anyway -- in general, probably not even with fast editors, and absolutely not in Hitfilm. If the HW decoder's royalty is not passed through to you, then you are covered. Hitfilm currently does not support any codec via hardware only, even when you have it, and maybe they don't like the taste of that.

    "I found this page/video in which the reviewer describes Vegas rendering issues. "

    What Vegas does has no direct relationship with Hitfilm. Completely different animals. What helps Vegas may not mean much for Hitfilm. My history with Vegas is that it has not performed well on Nvidia GPUs; last I checked, at V16, it still sucked. Vegas has/had a very AMD-biased implementation. It happens. Vegas uses OpenCL, and while OpenCL is hardware agnostic, that does not mean the code will perform similarly.

    A very public view of this happened with the Luxmark OpenCL benchmark software. This benchmark was often used to state/show that Nvidia had a poor OpenCL implementation. LuxMark is the benchmark of the open-source Luxrender 3D render engine. Nvidia probably got sick of getting slammed, and since it was open source they looked at the code and effectively told the developers, hey, you are doing it wrong/badly. Suddenly Nvidia did well in LuxMark. You cannot compare the numbers from Luxmark 3.0 and 3.1 due to these changes. Both the AMD and Nvidia implementations got faster in the benchmark, but Nvidia more so.

    A similar thing happened in a public Youtube video with an Intel OpenCL engineer and Davinci Resolve OpenCL code. Yes, even a wimpy Intel integrated GPU can do more/well with efficient code.

  • Okay -- done deal. I now have an RTX 2060 with 6GB in the machine (from Gigabyte -- no raves / no rants, please).

    Opening the Nvidia control panel shows a list of options you can set for each program in the 3D section.

    1.  Even though Hitfilm makes minimal use of a GPU, can I ask the forum for the best settings?

    2. I'm currently doing the majority of my editing in Vegas 15 -- can anyone tell me the best settings for Vegas? Granted, this is not a Vegas forum, but it is the friendly forum.

    3. I'm assuming in the Video section, the color adjustments should be set to "video player" as opposed to Nvidia?

  • "Even though Hitfilm make minimal use of a GPU, can I ask the forum for best settings?"


    You may change the power management from the default (Optimal) to Max, but I have not seen any negatives from Optimal. Noticing the variable GPU clock when Hitfilm is playing, I did try some tests, and Nvidia seemed to be doing a good job adjusting the clock.

    "I'm assuming in the Video section, the color adjustments should be set to "video player" as opposed to Nvidia?"

    Yes. That should be the default these days.
