Affinity Designer/Photo + Hitfilm

So, when I'm not subscribed to Adobe, I tend to use Serif's Affinity products. I know there are some of us in this community who do the same, which is why I decided to bring this up:

@Triem23 I believe you're one of them:

 

Think of the ability to bring a PSD into AE. (We know we can't do this in Hitfilm yet, which is killing me.)

Do you or anyone else know if there's a way, in either program, to automatically export a project layer by layer so I can stack it all up in Hitfilm? Sometimes a project has A LOT of layers, so exporting them individually can be a pain.

Now, I know there's something called Slices in Affinity, but the issue with that tool is that it does NOT (as far as I can see) maintain the position of each layer.


So say I make a complex lower third in Affinity Photo and use the Slices tool. If I bring all those exported JPEGs into Hitfilm, I'll see a bunch of layers all aligned to the center, resulting in a mess.

Comments

  • edited May 21

    @Hictor Total shot in the dark, but have you tried exporting each layer of, say, a PSD file as a separate PNG with transparency? Seems to me that when I did this with "hand drawn" animation once upon a time, I was able to build the animation in one PSD with 15 layers, then repeat it to make it cycle without the alignment getting wonky. It was a simple ball bounce-and-squash for a FutureLearn animation course, done with GIMP.

  • Triem23Triem23 Moderator

    @tddavis What he's looking for is a way to automate the process so Affinity just exports every layer, "solo'd", as a PNG, so you don't have to keep enabling/disabling layers and typing filenames.

    Incidentally, Hictor, I've asked for this exact function in Imerge. 

  • edited May 21

    @Triem23 Ah, mea culpa. I misunderstood what he was saying. I saw "series of JPEGs" and "all misaligned to center" and didn't follow.

  • Triem23Triem23 Moderator

    @tddavis Well, the function Hictor is using is meant for web designers. It cuts the image in a way that trims the layer and centers it so it can be positioned and scaled in browser code. So it's a specific tool, but it means Hictor has to reposition everything in Hitfilm.

    That said, slices can be saved as PNG, which is superior to JPEG overall and can save alpha.

  • Ah, interesting to note. I hadn't heard of something like that except the old .wmf, but I never knew how to separate that. It would just build slowly on my old 'puter and I could see the underlayers. :)

  • @Triem23

    I found a way! 

    In the Export Persona, after you've converted all layers to "Slices", move the cursor up to the tab that says "Defaults" and enable "Convert Clips to Paths". That saves the position of each slice!

    While this is going to eat the crap out of my hard drive, I'm quite happy - although, I'm still begging FXHOME to sort out a PSD reading function.

    Hell, I'll be ecstatic if they partner up with Affinity on this one. That would be a true release from the Adobe machine.

  • edited June 16

    Yes. For hobbyists who don't want to use solutions with good PSD support that already cost $0, this will be a release. But almost everything on the planet supports PSD. Even consumer editors ingest PSD, and often EPS as well.

    You don't want your assets in Affinity's [also proprietary] format, for that reason.

    This stuff always sounds nice, when you are convinced you have found the solution for the ages. Then, when you have to use something else, it becomes a massive headache. 

    Will Title Studio open those files, for example, or will you be duplicating everything in different formats so different apps and plugins can open them?

    It just isn't worth it, IMO.

    Affinity competes with PS as well as PSP. It looks better and definitely performs better. But it doesn't get you much closer to Adobe if you are a PS power user.

    Audio is still going to be a big reason to stay on CC. HitFilm is not Audition and without OMF or AAF support it is hard to round trip to more appropriate tooling for Audio Post (or interchange with sound editors).

  • Triem23Triem23 Moderator

    @iNate there's a Hitfilm "cheat" to export WAV on Windows, but it sucks drive space. Set up an uncompressed AVI and turn off "Render Video", which speeds everything up... You end up with an all-black file that's massive, but you can demux to extract/convert the audio to something else. That's 16-bit uncompressed PCM at 44.1 or 48 kHz.

    Didn't say it was the best solution. But I like Hitfilm's Doppler tool better than Doppler plug-ins in other software, and sometimes I wanna save the sounds.
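    A sketch of that demux step, assuming ffmpeg is the extraction tool (the post doesn't name one) and placeholder filenames:

```shell
# Hitfilm's "cheat" export: an uncompressed AVI with "Render Video" off,
# so the video stream is black filler but the LPCM audio is intact.
# -vn drops the video stream; -c:a copy keeps the PCM untouched as a WAV.
ffmpeg -i hitfilm_export.avi -vn -c:a copy audio_track.wav
```

    Swapping `-c:a copy` for an encoder (e.g. `-c:a libmp3lame`) would transcode in the same pass, at the cost of leaving lossless territory.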

    For format support, for Hitfilm purposes I would love Imerge to be able to render all masks/effects to the layers, save each layer in a lossless format (PNG/TIFF/EXR), then write a Hitfilm Composite Shot with the layers lined up and ready. It's another "proprietary" pipeline, yes, but it instantly makes Imerge more competitive for Hitfilm owners as a streamlined way to prepare matte paintings, 2.5D compositing/pseudo-depth, mograph layers, etc.

    Hitfilm is no Audition, but Audition is a dedicated audio program in a suite. Totally agree Hitfilm's audio is merely functional and missing time/pitch stretch (IMHO the last two "essential" effects I'd need for sound-shaping). But it's honestly better to only rough in a temp mix in the NLE and use a dedicated audio editor to mix. Audacity is free. I use Vegas Pro, but I'm a long-time Vegas user.

    Hitfilm's Doppler shift is awesome, though.

  • edited June 17

    Quote: "there's a Hitfilm "cheat" to export WAV on Windows, but it sucks drive space. Set up an uncompressed AVI and turn off "Render Video", which speeds everything up... You end up with an all-black file that's massive, but you can demux to extract/convert the audio to something else. That's 16-bit uncompressed PCM at 44.1 or 48 kHz."

    This is not how it works in the real world. When you do audio post-production on video, you don't demux one huge rendered audio stream from the video; you generate an OMF, AAF, or XML (sometimes a translation tool like Vordio or AATranslator can be used) and the sound editor (or you) imports it into a DAW session.

    That way, the audio clips appear as they do on the NLE timeline, except you can use the DAW's superior tooling, plugins, and functionality to do their thing. Also, sound editors have suites built to accommodate this type of work. Video editors often don't, because that's not their focus. A color-calibrated display is more important to an editor or colorist than good reference monitors, and vice versa for sound editors.

    You can render out a video for timing (i.e. 720p DNxHD/XAVC-I/H.264 at low quality/bitrate).  DAWs can load the video, and then they work while using the video as a reference (i.e. for timing foley, SFX, etc.).

    Rendering out one WAV file is a completely unworkable workflow. A sound mixer will laugh at you if you send them that. It's useless.

  • edited June 25

    @iNate

    I'm fairly certain that @Triem23 is aware how the "Real World" works.

    Many of us Hitfilm users have worked for broadcast institutions, and we know that the post-production ecosystem demands a seamless environment of different software packages.

    I myself have been a CC & Avid user for quite some time. Albeit more CC than Avid, since few companies seek to own the hardware required for the rest of the solutions their suites offer.

    The fact that we propose temporary solutions isn't the same as claiming they are ideal. We speak only for independent users who cannot afford a bundled, seamless post-production package. For me, because I budget my everyday life and get some jobs that don't require "the entire treatment", I use these suites sparingly.

    Also, by having these conversations, we give the devs ideas and feedback. And so far, for a team of only 7 people... I think where we are at the moment is fair.

  • Triem23Triem23 Moderator

    @Hictor I worked in broadcast and, oh yes, audio production for nigh on 20 years. OMF and AAF are, of course, container formats that happen to hold LPCM audio (the same codec as WAV files), and, of course, XML is a web markup language hacked for video/audio editing to point at the original media file.

    Also, in the "real world," of course I wouldn't be editing audio in the NLE at all. I'd have two tracks to alternate camera audio for dialog, and another track or two for temp FX. Any REAL audio editing would be done directly in a DAW on the first-candidate "lock" edit, where I'd be recording ADR directly into the DAW and tossing most of the NLE audio, which was basically reference to begin with.

    So, while dumping audio out of HF as an AVI and demuxing it is a nasty, clunky workaround, I could still dump each track quickly, demux quickly, and have two sample-accurate tracks to drop on the DAW timeline.

    Although, one of my favorite audio "masters" was dumping out the entire movie to minidiscs, bouncing those "track discs" through some outboard gear for compression and spatial enhancement, plus a pass through a BBE, to more minidiscs which were then redigitized back into my D+vision timeline.

    Amazingly enough not a single frame of drift in the whole film and multiple bounces. This was before Pro Tools existed, so, whatcha gonna do? Janky-ass workaround, but it worked.

    @iNate in the real world I ain't gonna snob on file formats. Give me decent audio in any format and I'll make it work. The tool doesn't matter. The artist does. WAV has limitations as a container (limited metadata, and a hard 4 GB cap - roughly 6.75 hours of CD-quality stereo audio - in a single file), but the "superior" formats are using the same LPCM codec from the 1980s.

    OMF, despite the "Open" in the name, is a proprietary format that works with limited tools. AAF is also a proprietary format. WAV works with anything. WAV is also - wait for it - the BROADCAST STANDARD for BBC Radio and Australian Broadcasting Corporation radio, and the DRM digital radio standard uses WAV for testing and quality benchmarks.

    AAF, OMF and XML can make life easier, but ain't nothing wrong with a WAV.

    Also, why would I dump a WAV of the entire movie anyway? In the real world, each scene is a timeline later assembled into a main timeline, so if I need to trim Sc 34 I don't risk shifting or de-syncing scenes 33 or 35.
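    For reference, the WAV 4 GB cap works out like this for CD-quality stereo LPCM (16-bit, 44.1 kHz, 2 channels); a quick shell sketch:

```shell
# Bytes per second of CD-quality stereo LPCM: 44100 samples/s * 2 bytes * 2 channels
bps=$((44100 * 2 * 2))                           # 176400 B/s
# Seconds of audio that fit in a classic 4 GiB WAV
max_seconds=$((4 * 1024 * 1024 * 1024 / bps))
echo "$((max_seconds / 3600))h $(((max_seconds % 3600) / 60))m"   # prints 6h 45m
```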

  • edited June 25

    @Triem23 Case in point about making it work regardless of file formats:

    Many news channels today will gladly take your shaky horizontal phone footage, throw a blurred 1080p duplicate of it behind as a background, and broadcast it, even on live TV.

    I would honestly put the blame on YouTubers who continuously inflate the idea that you "need to have everything".

  • Triem23Triem23 Moderator

    @Hictor Avengers: Endgame's VFX were rendered at, and the movie mastered at, 2K. Why is YouTube pushing 4K 60p again?

  • @Triem23 I KNOW RIGHT?!

  • edited June 19

    Hictor
    In the Export Persona, after you've converted all layers to "Slices", move the cursor up to the tab that says "Defaults" and enable "Convert Clips to Paths". That saves the position of each slice!

    I tried your method. Each slice is getting exported as a PNG file, but when I bring them into HF, they are all centered on top of each other.

    I am bringing in the individual PNGs and dropping them into a comp.
    Is that what you were doing?

    Thanks.
    Tom

  • edited June 19

    It isn't about making a different format work - AIFF works, not just WAV. OMF works, not just AAF. If you can spit out an XML that I can import into other software, that would also work - those are actual workarounds.

    It's about handing someone a 1+ hour long stereo audio file demuxed out of an AVI video and telling them to get their work done.

    That's not snubbing.  That's just the realization that the proposed solution is a complete non-starter.  Excuse me for saying that's not how the real world works when you make that suggestion with any level of seriousness...

    -----

    60 FPS is needed because different types of content call for different solutions. Avengers is a feature film, not the Winter Olympic Games. YouTube streams all types of footage. People who create content also create all types of footage. 60 FPS is there because it's necessary; 4K is there because things move forward. That is where the future is, even though a lot of stuff is still done in HD.

    Additionally, many people don't like watching low res footage on big, high resolution displays.  In many cases, it looks awful.  Some are used to this, because so much is still in HD.  However, there is a marked visual difference between something mastered at 2K and something mastered at 4K on a 50" UHD screen.  The 4K looks much better, the same way HDR media on an HDR screen looks better than SDR media.

    They're pushing it because they're forward thinking and looking towards the future, not just sitting in the here and now and thinking "things are good now, we don't really need anything better."

  • edited June 20

    @iNate You completely missed @Triem23's point.

    Nothing he stated denies the advantages of the formats you speak of. At the same time, he acknowledges their limitation of being proprietary, while WAV is generally well received by all post-production software and premier broadcast agencies, in spite of better formats.

    As an editor, in the real world, you do not have the luxury to always choose what format you want. 

    A client (company/broadcaster/indie director) assigns you to edit something and pays you to do it; he will hand you the material at his disposal. You will not always have the luxury to pick and choose what format you edit, whether it was pumped out of Hitfilm as an AVI file, captured from an ancient cassette tape, or whatever outdated format he provides. He assigns you a job, you get the job done. It's what you're paid for. It's how the industry works. You make the best of what you get and you end up paying rent.

    Unless Triem is given SD footage and asked to scale it up to a crisp 4K cinematic masterpiece (a scenario in which a professional simply explains that it cannot be done), he has no reason to be picky about formats.

    With regard to YouTube's 4K 60 fps, none of us here are denying it. In fact, in terms of "thinking ahead", 8K monitors have already made it to many high-end showrooms, and some phone manufacturers have already announced specs for future 8K footage.

    The issue we take is with how it has become a major influence on upcoming generations of filmmakers and aspiring YouTubers, convincing them that they need to start with the bar set that high, or that the consuming market will only pay to see such content.

    He brings up Avengers as a spectacular example of how, despite its ginormous budget and prestige, a film does not need to subscribe to these specs.


  • @Tkane18

    Not sure what the hell happened, but I think the updates might have ruined the workflow.

    The slices I was exporting kept the overall resolution of the canvas intact, and indeed the position they were placed in.

    Now it's just not working at all.

    I'll speak with support. 

  • edited June 25

    Quote: "As an editor, in the real world, you do not have the luxury to always choose what format you want."

    Correct:  Which is why you use an NLE that supports the necessary CODECs and interchange formats - basically... CYOA.  You don't base your business on an NLE that can't accept certain widely-used CODECs, or a DAW that cannot accept AAF or OMF, when you know you have to send projects from NLE to DAW on the regular.  Or, when you know your clients will hand you HDDs with media recorded straight to ProRes, DNxHD/R, or XAVC-I to ingest.

    No one in this business sends anything off to Audio with nothing but stems rendered out of an NLE.  Don't even attempt to pretend this is normal, or something any professional in this industry would consider acceptable.  You cannot properly mix and master those stems.  The stems get sent from Audio to Picture, not the other way around... after the Audio guys work their magic on the sound.  This is completely backwards.

    The editor's inability to spit out an AAF or OMF (or at least an XML) is their own personal problem, and not someone else's problem to fix.

    This is something you expect grandma to do; if she thinks her audio is bad, and wants you to "fix" it for her... not any professional editor.

    HitFilm is not the choice of professionals. I get it. I just thought adding this capability would be a good way to get it going in that direction. Considering the half-broken state of VEGAS' interchange options, the door is sort of open there.

    Whatever.
