4K vs 1080p?

Which is better? And can HitFilm handle 4K video smoothly?

Comments

  • If you mean more pixels are better than fewer pixels, then yeah, 4K is nicer. But you need a beefy machine to push that around. So the question of smoothness depends on your rig.

  • @RexLiew To answer your second question, HitFilm can handle 4K smoothly IF your hardware is up to the task. There are also other factors to consider, such as how many effects you plan to use and how optimized your source footage is for editing. As a simple (though I should mention, not exact) test, if your system can handle 4K gaming, it will probably be able to handle at least some basic 4K work without too many issues. It is worth noting that (barring some very unusual circumstances) 1080p will ALWAYS be less stress on your computer than 4K.

    Now for the first question. "Better" is relative. 4K has 4x the pixels of 1080p which means you'll generally see more detail and get a sharper image than you would with 1080p. As mentioned earlier, this comes at the cost of needing more processing power to work with it, especially when it comes to compositing and special FX. Not to mention that from an artistic perspective, not everyone likes to have their film look super sharp and crystal clear (though that gets into resolution vs pixel count which I'm not gonna talk about).
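
    A quick back-of-the-envelope sketch of that pixel math (assuming a 1920x1080 frame for 1080p and UHD 3840x2160 for "4K"; DCI 4K is slightly wider):

    ```python
    # Pixel math for 1080p vs UHD "4K" (3840x2160 assumed here).
    px_1080 = 1920 * 1080            # 2,073,600 pixels per frame
    px_4k   = 3840 * 2160            # 8,294,400 pixels per frame

    print(px_4k / px_1080)           # 4.0 -> four times the pixels to decode, composite and render

    # Rough uncompressed frame sizes at 8 bits per RGB channel (3 bytes per pixel):
    print(px_1080 * 3 / 1e6, "MB per 1080p frame")   # ~6.2 MB
    print(px_4k * 3 / 1e6, "MB per 4K frame")        # ~24.9 MB
    ```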

    How about Hollywood? Surely they use 4K for everything? As you might have guessed from the sarcasm, they don't. If you're making a movie or show for Netflix, you're actually required to deliver it in 4K, largely because Netflix really wants to push its 4K content to customers. That's great, but let's talk about Marvel and Star Wars films... These are literally some of (and in Endgame's case, THE) highest grossing, biggest budget films of all time. But all their editing and FX are done in 2K (which is only slightly wider than 1080p). Why? Well for one (and probably the biggest one), some of their effects are so complex that even at only 2K, and even with render farms many times more powerful than the most expensive consumer desktops, many of the effects scenes take DAYS or even WEEKS to render. Sometimes a SINGLE FRAME of footage can take more than a day all by itself. That's insane. And when you have the tight production schedule of one of those films to work with, there's just not enough time to do it in 4K. Also keep in mind that many of these movies still shoot largely on film cameras, not digital. This means that beyond 2K, there's not much of a difference in visual fidelity when you digitize the footage at higher resolutions. Probably not worth the massive difference in performance cost.

    There are other things to talk about, but I'm hardly qualified to cover them. I will note that on YouTube, compression is a terrible thing, and the better the quality of the footage you feed it, the better it will look, even when viewed at lower resolutions.

    What it comes down to for your purposes is this: If your camera (if you use one) is able to do 4K, it handles well on your system for your use case, and you like how it looks, then go for 4K. But don't feel bad if you need to drop it down to 1080p or, heck, even 720p to have a good experience. Your content will sell your content, not your resolution.

    I recently tried editing 4K for the first time on my system. It was a mess (my computer simply isn't up to editing 4K). So I downscaled all my footage and made my film in 1080p instead. While I'm sad that my computer isn't as powerful as I wish it was, I don't feel like I lost anything after dropping down to the lower resolution. So don't feel defeated if 4K isn't a good fit for you.

    Good luck!

  • " (though that gets into resolution vs pixel count which I'm not gonna talk about)"

    Resolution by definition is, essentially, the pixel count.

    Sharpness vs resolution is what you're actually talking about, and while a lot of people believe that more resolution = sharper image, that's false. More resolution means more pixels... sharpness is determined primarily by how accurately the image is focused and secondarily on factors like the strength of the OLPF and the nature of the image processing in the camera (yes, even with raw).

    Just to note, most of the Marvel films are not filmed on celluloid; that's become a rarity even in higher-end productions. Most of the FX-heavy films are shot on RED cameras, and have been for a long time; the main reason is the ultra-efficient 16-bit raw combined with more resolution than they need, so the finishing artists have more detail to key with.

    They're still finished in 2K because, as @triforcefx described, the CG elements are insanely expensive in terms of both labor and rendering time.

    Do you need 4K? If you have to ask, then you definitely don't. You'd know it if you did. It won't make your film any better... but it does require more hardware to work with.

    That said, I have worked with 4K in HitFilm, and it handles it pretty well if you're using an appropriate mezzanine codec and/or have enough computing power. For 4K using a heavier codec like H.264/5, you'll need a beefier machine due to the nature of the codec.

    2K is still enough for most purposes. I tend to be a tad lazy and lump 2K and HD together because the only difference between the two amounts to a slight letterbox, so it boils down almost entirely to preference (slight letterbox vs no letterbox). Both work fine for theatrical projection...


  • '"Better" is relative.'

    I like that phrasing. "Better" must always be strictly defined.

    "4K has 4x the pixels of 1080p which means you'll generally see more detail and get a sharper image than you would with 1080p."

    Since we are in the world of compressed video, the claim that 4K has more detail needs a proviso. It can have more detail if it has enough bitrate to encode that detail. Bitrate is effectively everything. What is going to deliver that bitrate? Of course, the same is true for 1080p. Is the bitrate good enough for it to hold high detail?

    Today's big source of video is cable/satellite TV delivery. They seriously (over)compress video these days, much like online services such as YouTube. My cable is FiOS fiber-optic delivery. These days they deliver almost half the bitrate they did when I first got it. If I take a DVD, which is standard def (480), it will have more detail and quality than my 1080 cable service. This is sad. It happens because the DVD has a high bitrate for its resolution, while the cable delivery has only a modest bitrate for its resolution. Blu-ray is very high bitrate.

    So if 4K interests you because of the extra detail, then you must consider your content delivery mechanism and whether it will provide the bitrate to encode that extra detail. Then there is wanting 4K simply because... it's 4K!
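
    A rough sketch of that bits-per-pixel idea (the bitrates and frame rates below are assumptions for illustration, not measurements of any particular service or disc):

    ```python
    # Average encoded bits available per pixel per frame.
    def bits_per_pixel(bitrate_mbps, width, height, fps):
        return (bitrate_mbps * 1_000_000) / (width * height * fps)

    dvd   = bits_per_pixel(6.0, 720, 480, 30)    # standard-def DVD, ~6 Mbps assumed
    cable = bits_per_pixel(8.0, 1920, 1080, 30)  # heavily compressed 1080 channel, ~8 Mbps assumed

    print(f"DVD:   {dvd:.2f} bits/pixel")    # ~0.58
    print(f"Cable: {cable:.2f} bits/pixel")  # ~0.13
    # The SD disc gets roughly 4-5x the bits per pixel, which is why it can hold more detail.
    # (Codec efficiency differs between the two, but the gap in bits per pixel is the point here.)
    ```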

    'Better' can lead you down the rabbit hole.


  • One advantage to 4K is shooting in 4K, but editing in 1080p.  That way (if you are lazy like me), you can do a zoom or move on the 4K footage and have room to maneuver in a 1080p window.  Plus (and maybe I am just imagining it) 4K rendered out as 1080p just looks better.
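
    If it helps, here's the reframing headroom that workflow gives you, assuming a UHD source in a 1080p timeline:

    ```python
    # How far you can punch in on 4K footage in a 1080p timeline before upscaling.
    src_w, src_h = 3840, 2160            # UHD source (assumed)
    tl_w, tl_h   = 1920, 1080            # 1080p timeline

    max_zoom = min(src_w / tl_w, src_h / tl_h)
    print(max_zoom)   # 2.0 -> up to a 2x zoom (or an equivalent pan/crop) with no upscaling
    ```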

  • That last part you're either imagining or seeing the results of downscaling on soft shots. 

  • "That way (if you are lazy like me), you can do a zoom or move on the 4K footage and have room to maneuver in a 1080p window."

    My old business partner has told me some stories from his daughter's boyfriend who does contract video work. As Rich once said to me at beers, Rafa(sp) came back early from Berlin...Bono lost his voice.

    His big thing about 4K+ is post-shot crop flexibility. The extra rez is not terribly important. But then he also tells me that in that world, the client mostly tells you what you need to shoot on. It's often whatever has the cool/hot factor. Interesting, the tail wagging the dog. Not sure, but I think Rafa has a RED Helium, which is 8K raw AFAIK. You can get 4K RGB from 8K Bayer raw.

  • Thanks for the detailed reply. I read through it all. I'm sure this issue has been beaten to death, but oddly I still have my doubts. I am not sold on 4K. I just got a 1080p projector. I sit 11.5 ft away from a 100" screen.
    The picture looks phenomenal. Could it be better? Sure. My friend told me that it'd be a totally different experience when you lay your eyes on a 4K projector. TBH, 4K is quite new to me. I found several articles comparing 4K and 1080p, and what they claimed is that 4K is a big breakthrough. It delivers a better-looking image, better color handling, saturation, and whatnot. It has smoother edges and more depth. When combined with faster screen refresh rates, 4K has the potential to deliver almost as much depth as 3D, without the need for glasses. Sounds really appealing to me. I should give it a thought.

    As to HitFilm, I just purchased HitFilm 3 Pro to meet my needs for editing short films and videos. So I'm wondering if HitFilm has stability issues when you do large projects with it. Well, thanks again for that timely reply. It's reassuring.

  • Triem23 Moderator

    Unfortunately, at least the first article you linked to is factually incorrect (4K does not have TWICE the data of 1080p, it has FOUR TIMES the data, and that's verifiable with a very basic area calculation), and rather misleading.

    4K has the POTENTIAL for "better color handling" IF and only IF the monitor actually has the capability for HDR playback (hint: most monitors still for sale don't) and IF and only IF said video is encoded in an "HDR" format (hint: most video you're going to find on online services isn't, and Netflix, Hulu, etc. aren't streaming in "HDR").

    Side note: "HDR" in video has come to mean any video encoding that uses more than 8 bits per channel of color. That doesn't tell you whether the video is 10-bit, 12-bit, or 14-bit. To make matters more confusing, HDR comes out of still photography, where an HDR image contained 32 bits per channel of color information (which cannot be displayed on any monitor, anywhere in the world) and would then be processed before being downsampled to a 16-bit or 8-bit image. And in photography a 16-bit image isn't "HDR," as we've had 16-bit images for decades.

    Norman brought up bitrate. Assuming your output is to YouTube or Vimeo, we can check whether the recommended encode rates scale correctly. YouTube wants 8 Mbps for 1080p (30 fps) but 35-45 Mbps for 4K. That ratio is greater than 4:1, i.e. greater than the resolution ratio between 1080p and 4K. This means YouTube is pretty much encoding correctly to hold an identical level of detail (and YouTube has changed this in the last year; last year they recommended about 20 Mbps for 4K). Vimeo, on the other hand, recommends 10-20 Mbps for 1080p but 30-60 Mbps for 4K. That's a 3:1 ratio, which is less than the 4:1 resolution ratio. This could indicate that Vimeo's 4K holds less detail per pixel than its 1080p. One could argue here about the relative efficiency of H.264 and H.265, but Vimeo's guidelines don't discuss that, so I throw out that variable.
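
    For anyone who wants to redo that ratio check themselves, here is a tiny sketch using the numbers quoted above (platform guidelines change over time, so treat them as examples):

    ```python
    # Resolution ratio vs recommended-bitrate ratio, using the figures in this post.
    res_ratio = (3840 * 2160) / (1920 * 1080)   # 4.0

    youtube_ratio = 35 / 8      # 4K Mbps / 1080p Mbps at 30 fps -> ~4.4, at or above 4:1
    vimeo_ratio   = 30 / 10     # low end of Vimeo's ranges      -> 3.0, below 4:1

    print(res_ratio, youtube_ratio, vimeo_ratio)
    # YouTube's ratio meets the pixel ratio, so bits per pixel roughly hold at 4K.
    # Vimeo's 3:1 ratio gives its 4K fewer bits per pixel than its 1080p.
    ```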

    Otherwise, as has been noted already, the biggest blockbusters in the world, with VFX from ILM and other world-class effects houses are all mastered at 2K. If the biggest VFX houses in the world aren't doing 4K then we really don't need to, either.

    Cynical note: TVs can last for decades. What's been happening for the last few years is TV makers frantically rolling out new tech to try and get you to buy a new TV/monitor you don't need. High refresh rate was a test run for 3D (a 240 Hz refresh rate does nothing for 30 fps video, but it's great for flickering LCD shutter glasses between left/right frames). 3D failed because, once you account for people with monocular vision and the significant number of people who get headaches and eyestrain from fake 3D, you end up with under 50% potential market penetration. Now it's all about resolution (again, you don't need 8K when blockbuster movies are done at 2K) and "HDR" (which no movies or commercial TV shows are produced in). The stupidest thing is pushing for 4K on a smartphone screen, where pixel densities end up being a good 10 times greater than what a human eye with 20/20 vision can resolve. Yup, 4K on a phone doesn't look any better than 720p, but it uses a lot more resources and memory...

    No one NEEDS 4K. Its best use is still for what Stargazer brought up: having extra resolution to play around with in your 1080p projects.

  • "You can get 4K RGB from 8K bayer raw."

    This canard is silly and ridiculous. We've been PRINTING images captured with Bayer pattern sensors for years, and it's fine as long as you don't up-rez in post, so the 8K for 4K because of de-Bayer overhead is and always will be a myth.

    HDR isn't tied to bit depth. I know there are a lot of people who think that bit depth determines dynamic range, but it's false. Length of the ladder doesn't have to change when you add more rungs. Bit depth determines how many rungs, the length of the ladder is the dynamic range.
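
    To put the ladder analogy in numbers, here's a tiny sketch that fixes the dynamic range and only varies the bit depth (the 12-stop range is just an assumed example, and transfer curves that spread values non-linearly are ignored):

    ```python
    # Same ladder length (dynamic range), different number of rungs (bit depth).
    stops = 12  # assumed dynamic range, identical in both cases

    for bits in (8, 10):
        code_values = 2 ** bits
        print(f"{bits}-bit: {code_values} code values over {stops} stops "
              f"-> about {code_values / stops:.0f} values per stop")
    # 8-bit:  256 code values  -> ~21 per stop
    # 10-bit: 1024 code values -> ~85 per stop
    # More bits means finer gradation (less banding), not a longer ladder.
    ```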

    4K and better color are also not directly related, but they're being marketed that way to sell more TVs. You can't really get a TV that supports HDR that isn't also 4K, but "MOAR Ks!" is easier to market than HDR which the average consumer doesn't understand, and which the average salesdroid understands even less.

    That said, once you've seen HDR in person, you can't unsee it. The difference is glaringly obvious -- and when used well, it's stunning. When used poorly, it's no prettier than SDR and the antiquated Rec709 analog standard. 


  • "This canard is silly and ridiculous. We've been PRINTING images captured with Bayer pattern sensors for years, and it's fine as long as you don't up-rez in post, so the 8K for 4K because of de-Bayer overhead is and always will be a myth."

    Myth. That is total BS and we will have to disagree. Native pixel output from Bayer is interpolated, plain and simple; that is a mathematical fact. Period. Humans see RGB, and a Bayer sensor only measures one of the three components per pixel when output at native res. In Bayer output at native rez, 2 of the 3 RGB components are interpolated from neighboring pixels.

    I remember way back in the day when DPReview compared the luma resolution of the Foveon sensor (a true RGB measurement) to the typical Bayer Canon/Nikon sensors. This is so old it was a 3MP Foveon compared to a 6MP Bayer. The 3MP Foveon had every bit the luma resolution of the higher-MP Bayer sensors, and the reason is obvious as stated. Now, Foveon has never had the chroma quality of the Bayer sensors used by others, but that is about implementation quality, not about the fact that Bayer is mostly interpolated mathematical data (at native res).

    The Canon C series cameras started out with a 4K sensor long before they supported 4K output. Why? To me it's obvious. A 1080 16:9 output can be true RGB from a 16:9 UHD Bayer sensor, meaning that each output pixel has a *physically* measured value from the sensor. Now, whether you bin or interpolate down is another discussion. You cannot have a "quality" 4:2:2 output from a non-4:4:4 source, and Bayer at native res is not even remotely 4:4:4.

    To this day, broadcast cameras use prism splitters and three monochrome sensors to make up their image. Broadcast cameras use tiny sensors, so Bayer would be hard to work with at that size, and you would not get good low-light ability. There are, or can be, very good and real reasons to have a camera really and honestly measure a true RGB image at capture, subjectivity ignored. Every application is of course specific to its task, and it is okay for broadcast cameras to be big.

    I am not going to argue de-Bayer quality. That is subjective; to each his/her own. My point is that to have a physically measured RGB value per pixel, a Bayer pattern sensor needs its resolution cut in half. Physically measured is "better", IMO, than mathematically derived.
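
    A minimal sketch of that half-resolution idea, assuming an RGGB layout (real demosaicing in cameras and NLEs is far more sophisticated than this):

    ```python
    import numpy as np

    def bayer_rggb_to_half_res_rgb(mosaic: np.ndarray) -> np.ndarray:
        """Collapse each 2x2 RGGB quad into one RGB pixel: every channel is a
        physically sampled value, at half the mosaic's resolution per axis."""
        r  = mosaic[0::2, 0::2]
        g1 = mosaic[0::2, 1::2]
        g2 = mosaic[1::2, 0::2]
        b  = mosaic[1::2, 1::2]
        return np.stack([r, (g1 + g2) / 2.0, b], axis=-1)

    mosaic = np.random.rand(2160, 3840)              # stand-in for a UHD Bayer mosaic
    print(bayer_rggb_to_half_res_rgb(mosaic).shape)  # (1080, 1920, 3)
    ```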

    If anyone can ever develop a good-quality RGB sensor, Bayer will be history faster than you can blink. Patent exclusions of course apply. Easier said than done.

  • edited September 18

    "Myth. That is total BS and we will have to disagree. Native pixel output from Bayer is interpolated plain and simple and math fact. Period. Humans see RGB and a Bayer only measures one of the three components per pixel if output at native res. In bayer at native rez output 2 of 3 RGB components are interpolated from neighboring pixels."

    I'm just going to come out and  say it, because I go with reality: you're wrong, along with all of the pixel peepers who obsess about "true 4K" and how you need subsampling to get "true 4K." 

    It is a statement of fact that millions of photographers print Bayer pattern images all the time. It's true that the Bayer pattern sensors require interpolation, but are you more interested in complaining or in creating movies?

    I've printed Bayer pattern images from a 36 megapixel camera at 24x36 and had them shown in galleries and curated shows, including a print exhibition in the Louvre. 

    So once again, you're wrong. You're wasting bandwidth, and you should be focusing on your craft and not on the silliness about whether or not you're capturing a "true 4K" image, because it really doesn't matter.

    And don't forget... most movies that are shown in 4K IMAX theaters are finished in 2K. More than half are FILMED in 2K -- the most used camera in the high end is STILL the Arri Alexa -- which is only 3.2K when in open gate, and 2.8K in Super 35.

    So if 2.8K is good enough for Blade Runner 2049, then you're obviously wasting your time obsessing over it. 

    And yes -- Arri's Alev III sensor is also a Bayer pattern sensor, yet more proof that you're barking up the wrong tree. The limitation indies have isn't technology, it's craftsmanship.

  • edited September 23

    I have just made the jump to 4K everything. I tend to create 32-bit OpenEXR files in Blender (animation, but I do film too). I composite these in HitFilm with visual effects, and I use particle effects too... lots, sometimes. I then composite, colour grade, and render in Vegas (I'd like the option to render 10-bit uncompressed AVIs, guys at HitFilm, if possible - I know, what sort of fruit-and-nutbar would want 10-bit AVIs?). I'm obsessed with the best quality through the production and storage phase!

    Now, I don't have a Hollywood supercomputer - I use an Acer Predator Helios 500 laptop... it's basically a desktop in a laptop, which is adequate for creating effects-laden 3D animations. I could do 6 minutes of animation at 4K in roughly a month. I will double and triple my rendering capacity over the next year with a dedicated render machine and an eGPU for the laptop. My point being: anyone can make a Hollywood-style effects production these days, so long as you know how - and you can be patient.

    To answer your question (which I kinda did above in regards to HitFilm being able to handle 4K): the difference between 1080p and 4K should come down to a choice in aesthetics. It's that simple! I went to 4K because, as our moving-image panels have improved (which includes pixel counts), 1080p doesn't cut it anymore. But if you wanted a lo-fi effect, then it would be fine. You can also change the look of digital film - even more so than with pixel count - by changing the fps. 24 fps looks dreamy, 50 fps can look dreamy too but has a smoother realness to it, 60 fps looks very video-like, etc.

    Finally, think about the distribution medium. If you are delivering a product that will largely be viewed on a 1080p panel, then stick with 1080p. The same goes for 4K.

    Ben 
