Video Card Memory

So, I was watching a HitFilm tutorial (the one on RAM and proxies) and decided to download the accompanying project file. However, when I opened the project, this appeared:

"Your video card does not have enough memory to support a resolution of 2048 x 1152 with 8-bit color and 4x MSAA antialiasing mode.

Please lower the resolution, reduce the bit depth or change the antialiasing mode.

You are very likely to experience visual problems if you continue to work with the project at its current resolution."

What do I do? It still buffers even when I turn down the resolution, and I want to have it at 100%. Do I need to upgrade my video card's memory, or something else? Here is some information about my computer, if it helps:

2.6 GHz Intel Core i5

16 GB 1600 MHz DDR3

Intel Iris 1536 MB

 

Comments

  • If you are on a notebook/laptop, there might not be much that can be done.

    I'm using an AMD A10 APU here with 2 GB of VRAM (all of it comes from system memory). HitFilm lets it max out at 4096 x 4096 (it won't let the digit boxes go any higher). No 8K here. If it could go to 8K, then 24-megapixel images could be imported at 6000 x 4000 full resolution. Not here.

    Even though the video card properties say the AMD GPU can use up to 9 GB of RAM, that doesn't matter. HitFilm is very strict about actual VRAM.

    The spec to get 4K requires 2 GB of VRAM. Some notebook PCs have a discrete video card with 2 GB of dedicated VRAM; that would work for 4K.

    Intel Iris, at 1.5 GB... well, it seems HitFilm is really specific with its specs. You're going to need that 2 GB of VRAM.

    To get to 8K, I think it requires more than 4 GB of VRAM; 8 GB would nail it.

    The 2 GB of VRAM here lets it run at 4096 x 2304 with 32-bit linear color.

    Your machine is good enough; you just have to have that 2 GB of VRAM to get to 4K. (Some rough buffer-size arithmetic is sketched below.)
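
    To put rough numbers on those figures, here is a back-of-the-envelope sketch in plain Python. It only counts one RGBA frame buffer at the named settings; it is not HitFilm's actual accounting, which keeps many buffers and textures resident at once, so real usage is much higher.

        # Rough VRAM footprint of a single RGBA frame buffer, in megabytes.
        # A lower bound for illustration only, not HitFilm's published spec.
        def frame_buffer_mb(width, height, bytes_per_channel, channels=4, msaa_samples=1):
            return width * height * channels * bytes_per_channel * msaa_samples / (1024 ** 2)

        print(frame_buffer_mb(1920, 1080, 1))                  # 1080p, 8-bit color: ~8 MB
        print(frame_buffer_mb(4096, 2304, 4))                  # 4096 x 2304, 32-bit float linear: ~144 MB
        print(frame_buffer_mb(2048, 1152, 1, msaa_samples=4))  # the error message's settings: ~36 MB

    A single buffer is small; the squeeze comes from needing many of them at once, which is presumably why HitFilm asks for the full 2 GB for 4K rather than the Iris's 1.5 GB.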

  • Triem23 Moderator

    If you're on a laptop there's probably NOTHING that can be done, unless your laptop happens to have a Thunderbolt port, which would allow you to use an external GPU in an enclosure (and, based on your specs, yours doesn't).

    Otherwise, laptops are built differently from desktops to cram components into a smaller space. Laptop CPUs and GPUs are basically impossible to upgrade.

    The Intel Iris you have is built into your CPU. It is not possible to upgrade the memory of that chip.

    @humptylumpty you MIGHT be able to load your stills at full res, even if you can't create a timeline greater than 4096x4096. Also, there's a processing-power factor to timeline sizes, not just RAM. My laptop's GPU is a 980m with 8GB of VRAM, but I can't create 8K timelines. I have more than enough VRAM, but the card doesn't seem to have the raw power (note that a laptop 980m is about equal to a desktop 770).

  • Hmm. Because I recently read on the web that you can actually just add RAM. I don't know about that; I'm going to get help from a tech pro over the weekend. Hopefully I don't have to get a new computer, because I don't have that kind of money. I'll just have to wait and see.

  • edited February 2

    @Triem23, the "Project" settings' digit boxes max out at 4096 x 4096 on this PC. You say to make a timeline greater than 4096 x 4096? Not sure what that means. HitFilm has lots of tweaks and settings; I'm still trying to get used to it. I did make a sample test .png image at 8192 x 8192 using Microsoft Paint. The imported image had a zoomed look to it, so I guess that says it was being viewed at 4096 x 4096 (or something less than 8K).

    If I "Make New Comp", those digit boxes are also clamped at a 4096 x 4096 maximum.

    HitFilm did import it, but the rendering still maxes out at 4096 x 4096 via the "Project" digit boxes.

    Your 8 GB of VRAM won't let your digit boxes go to 8K? Whoa! Puzzling.

    The HitFilm .pdf says that 1 GB of VRAM should be able to support 1920 x 1080.

    Not sure how Intel built Iris and the BIOS. The AMD A10 FM2+ motherboard BIOS has graphics-RAM settings, so it is set to use 2 GB of system memory; I could set VRAM to 1 GB to test with.

    I am surprised the Iris laptop can't do 1920 x 1080 using 1.5 GB of VRAM.

    I am using W10 version 10.0.16299.214.

  • Triem23 Moderator

    @humptylumpy right, so HitFilm is loading your 8K image into a 4K comp. This is normal. Editors default to scaling media to fit the frame, whereas compositors/VFX programs default to actual size (often an oversize image is desired).

    Not being able to do 8K on my 980m is also normal. That 8GB of VRAM is tons of memory for game textures and for 3D models in HitFilm, but the GPU itself has a max resolution of 4K. It's not just memory, it's how much resolution the card can output. Probably only current-generation cards might be 8K capable, since there really aren't any 8K displays out there (other than expensive prototypes). 8K in HitFilm is more future-proofing than currently usable. A lot of the older cards might have enough memory for 4K but not actually have 4K resolution output. HitFilm will not create a timeline larger than the GPU can display, no matter the VRAM.

    The size of an asset that can be loaded also depends on the GPU. Some GPUs might max at 6K for an asset, others 8K, some up to 16K. (There's a quick way to read those caps off the driver, sketched at the end of this post.)

    Max resolution for your cards can be looked up online. I think the Iris chips with 1.5 GB were built for the Mac market. Apple has that odd 1440p resolution that nothing else uses, 2560x1440 or thereabouts. That's about 2.7K, which would sit perfectly between 1080p and 4K.

    Anything "Intel HD"? It's in the name; those are designed for 1080p. "Intel UHD"? Those are 4K. Iris depends on the generation.

  • @Triem23 HitFilm can work with a timeline larger than the GPU can display, though.

    At least, I can create a 4K timeline and edit it on a normal 1080p monitor, and my card's max output resolution is 2560x1600 (GTX 580).

    Or is that not what you meant? I recently used it to export some 4000x3000 GoPro images after I'd imported them as an image sequence and used the Action Camera effect to remove the fisheye look from them and did a bit of colour correction.

    It only has 1.5 GB of RAM, but it can share 2.5 GB with the system to a max of 4 GB, so... clever swapping about?

  • @Triem23, you sure have some interesting things to say. I never thought of or heard it that way.

    I do know this AMD A10 APU supports 4K on HDMI, from reading the manual.

    Wow! You just said someone needs a video card that supports 8K on the HDMI? Wow!

    I am still not sure what that has to do with an upscale render in software. Why can't HitFilm allow the digit boxes to go to 8K for rendering 2K/4K upscaled to 8K? Well... OK. Photoshop seems to import/output all sorts of resolutions on mainstream video cards and mainstream PCs.

    When I manually type 819 into the digit box, it prevents me from typing the 2. OK, I have heard enough. No 8K here.

    P.S. HitFilm's .pdf manual talks about VRAM requirements, but I can't recall any mention of the video card needing to support 8K.

  • OK, so. I went to a tech expert, and I had enough RAM, but my video card's memory was the problem. So I went back to my composite shot, proxied it, and then RAM previewed the beginning to check. I then turned the resolution up to full and exported it to my desktop as a QuickTime file. In the RAM preview it looked good, but the exported file was in low resolution. Thoughts?

  • @Mercesoul Double-check the settings of the export preset you used.  Not all presets default to match the timeline's resolution.

  • I have only skimmed the thread, but this is how the timeline limitation works: HitFilm looks at how much memory the GPU has and then figures out the maximum resolution you can use, taking into account other settings like the color depth.

    At first glance it may look like your GPU should be able to support more than what HitFilm says is the maximum, but you have to remember that, depending on the project, each frame may be using multiple textures at the same time (think lights, shadows, environment maps, etc.), and all of those need to fit in GPU memory as well for the frame to render successfully (I am wildly oversimplifying here, but this is the gist of it; a toy illustration follows below).
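
    As a toy illustration only, with made-up buffer counts and plain Python arithmetic (not HitFilm's real internals), you can see how quickly full-resolution working buffers eat VRAM:

        # Each layer / effect pass needs roughly one full-resolution RGBA buffer; they all share VRAM.
        def comp_vram_gb(width, height, bytes_per_channel, buffers):
            return width * height * 4 * bytes_per_channel * buffers / (1024 ** 3)

        print(round(comp_vram_gb(4096, 4096, 4, 12), 2))   # 4096 x 4096 comp, 32-bit float color, 12 buffers: ~3.0 GB
        print(round(comp_vram_gb(1920, 1080, 1, 12), 2))   # 1080p comp, 8-bit color, 12 buffers: ~0.09 GB

    Even twelve float-color buffers at 4096 x 4096 come to about 3 GB, so a comp at that size can overwhelm a 2 GB card with very little in it.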

  • I am starting to believe what you folks have been saying.

    I finally looked at the GPU in Device Manager while I had a very simple scrap project set at 4096 x 4096, with next to nothing in the comps, but GPU memory usage peaked around 3.7 GB before HFP crashed.

    The PC is running an AMD A10 APU with 2 GB of VRAM as set in the BIOS (all of it comes from system memory). This APU says it can access up to 9 GB, but it seems 4 GB stresses this APU.

    Typically, I have seen GPU usage at about 1.5 GB, give or take, when watching Device Manager with the project set to 1920 x 1080.

    I think this APU maxes out at 2560 x 1440, and the motherboard HDMI port can support 4096 x 2160. I see 4K presets in HFP.
