
In an ideal world you and I should be able to slap any old collection of mixed assets on the timeline of our video editing software and life should just be excellent from that point forward!
Unfortunately you and I don’t actually live in an ideal world and in fact, when it comes to video editing, it is a world that is way less than ideal!
So just to clarify, mixed assets or mixed footage or even mismatched footage refers to a situation where you have video footage captured at varying specifications added to your project timeline.
Those differences could be in resolution, frame rate, bitrate or file types.
The reason doing this becomes problematic basically comes down to the complexity of editing videos in general and the way editing software has to try and deal with those complexities.
So at the core of the problem is the fact that in order to deal with video files of any kind, modern video editing software has to be calibrated to what could best be described as a baseline set of specifications.
These settings are contained within the Project Settings of most video editing programs and from that point forward everything added is either too much, not enough or just right.
The “too much’s” and “not enough’s” are marked for handling of some kind in order to be made “just right!”
Sometimes those project settings are created when the first video is added to the timeline so the software assumes that the first clip added will be the baseline setting.
Most software already has a default project setting and expects you to be adjusting that according to what you will be working with.
So if your project is 4K at 30fps the software now knows what the height and width is going to be as well as the rate (time) at which individual images are going to be displayed to create the video.
Anything different to those baseline settings can be dealt with by either changing the size or compensating for a different frame rate.
These two things affect the final output depending on how well the software can handle increasing or decreasing the values of assets to be included in the video you are working on.
Similarly it depends on how well the software can eliminate or add frames for any footage that is at a different frame rate.
Finally, if you add footage that was created using a different codec or container, for example some MP4 footage mixed with .AVI footage, then the software has to keep two separate decoders active at the same time in order to deal with your project.
Bear in mind that editing software cannot export a final video file at variable resolution or frame rate.
So unlike many other computing tasks, video editing that strays even slightly outside of routine actions, such as the one we are talking about here, requires at least some technical understanding to manage the situation effectively.
Understanding Mixed Assets in Video Editing
Footage Specifications
The first thing you really need to know from the outset is the exact specifications of the footage you're dealing with.
How it was recorded, its resolution, frame rate and format are all essential for predicting its behavior on the timeline.
Using that information you can align the project settings with these specifications in mind to avoid issues such as pixelation, audio-sync problems or poor representation of motion.
Frame Rates
Frame rate indicates the number of frames displayed per second and as such is a representation of time and motion.
Beyond the actual frames per second there is also embedded within any digital video file a time code that controls the rate at which the video displays.
Variations in frame rates within a project require a process called either frame interpolation or frame decimation, depending on the situation.
If the frame rate of the footage is higher than the project frame rate then the number of frames has to be reduced – decimated.
If the frame rate is lower than the project frame rate then the number of frames has to be increased – interpolated.
This very often results in a degraded visual experience especially when it comes to reproducing smooth motion.
Decimating frames can make the motion appear slightly jumpy or jarring to watch while interpolation usually results in a strange or even unrealistic representation of motion.
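To make the two conversions concrete, here is how they might look with a command-line tool like ffmpeg. Note that ffmpeg is just one possible option (it isn't the tool discussed in this article) and the clip names are placeholders:

```shell
# Decimation: take a 60fps source down to a 30fps project rate.
# ffmpeg simply drops frames, so motion stays real but a touch less smooth.
ffmpeg -i clip_60fps.mp4 -r 30 clip_30fps.mp4

# Interpolation: go from 24fps up to 60fps by synthesizing in-between frames.
# The minterpolate filter estimates motion, which can look odd on complex scenes.
ffmpeg -i clip_24fps.mp4 -vf "minterpolate=fps=60" clip_60fps.mp4
```

Either way, preview the result carefully before committing it to your timeline.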
Resolution
Resolution is the number of pixels in an image and determines its clarity and quality.
Expressed as width-by-height in pixels (e.g. 3840px x 2160px for 4K), resolution affects how footage is scaled within a project timeline and in preview.
So on a 1080p project timeline, any 4K footage has to be scaled down, which is far preferable to scaling 1080p footage up to a 4K project timeline without some kind of A.I. intervention.
The best way to conceptualize this is to understand that one frame of a 4K image has four times the number of pixels that a 1080p image frame has.
When the editing software is converting 4K down to 1080p it uses the information from four pixels and combines them into one.
Because it has lots of data to do that usually the quality is maintained.
On the other hand, for the software to scale 1080p up to 4K it has to take each individual pixel and produce four pixels for each.
It has no real way of knowing how to do that accurately and almost invariably leads to image quality degradation.
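You can verify that four-to-one relationship with some simple arithmetic:

```shell
# Count the pixels per frame at each resolution.
uhd=$((3840 * 2160))   # 4K UHD
fhd=$((1920 * 1080))   # 1080p Full HD
echo "4K:    ${uhd} pixels"    # 8294400
echo "1080p: ${fhd} pixels"    # 2073600
echo "ratio: $((uhd / fhd))"   # exactly 4 source pixels per output pixel when downscaling
```

So every 1080p pixel in a downscale is built from four real 4K pixels, while every 4K pixel in an upscale has to be invented from a quarter of a 1080p pixel's worth of data.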
Mixed Codecs
It may occur that you find yourself having to deal with footage that has been created by different devices and as a result different codecs were used to create that footage.
It could be a totally different codec such as .AVI and H.264 or it could even be as subtle as two different iterations of the H.264 codec.
Remember, even though the H.264 codec is probably the most common one you will encounter, there is no strict standard that all H.264 codecs must adhere to.
H.264 or H.265 footage from a DJI drone device is going to be vastly different from the equivalent footage from an MILC camera shooting video.
From a quality standpoint this is usually not much of a problem, but it does require the software you are using to bring two or more different codecs into play in order to work on the project.
If you are having glitches or freezes while you are working like this it will usually be because your computer is having trouble keeping up with the calls to the other codec.
Best Practices for Managing Mixed Assets
So that’s the basic situation with mixing assets on the timeline within a project, so let’s get into how to manage this effectively.
Consistent Project Framework
The first thing to do is to plan out how you can maintain consistent video file parameters from shooting through to rendering your final file.
Capture all primary footage (A-roll) at the same resolution and frame rate and because this will be the majority of what is going to be on the timeline, set the project settings to match these specifications.
For B-roll and other secondary footage, try to capture or use stock footage that matches the project’s main settings.
If you cannot find footage at the same resolution or frame rate remember it is best to get footage at a higher resolution and take it down rather than the other way around.
If you are forced into a situation where you simply cannot match frame rates, test the secondary footage in your video editing software and see if it handles it OK.
Often short clips occupy so little actual screen time that the difference may not be noticeable.
Finally, if you are absolutely stuck with footage at either a different resolution or a different frame rate that you really want to use then the final solution may involve third party specialist software or even A.I.
Handling Variable Frame Rates
Mobile devices, due to power, processor and space restrictions, often record with variable frame rates, causing syncing issues during editing.
Rather than placing the pressure on your software during the actual editing process it is always better to get that footage to a constant frame rate before you add it.
There are two ways you can do this.
The first is to just start a new project, set the project settings at the correct resolution so that there is no increasing or decreasing of actual image size, then render a new file at a constant frame rate.
If the results are fine then you can add it to your project.
If not you can try tools like Handbrake (free) to transcode variable frame rate (VFR) footage into a constant frame rate format, mitigating potential editing difficulties.
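As a sketch of that second option, the command-line version of HandBrake can force a constant frame rate directly; the equivalent ffmpeg command is shown for comparison. The filenames are placeholders and the flags assume reasonably recent builds of each tool:

```shell
# HandBrakeCLI: re-encode variable frame rate phone footage at a constant 30fps.
HandBrakeCLI -i phone_clip_vfr.mp4 -o phone_clip_cfr.mp4 --cfr --rate 30

# The same fix with ffmpeg (newer ffmpeg versions spell -vsync as -fps_mode).
ffmpeg -i phone_clip_vfr.mp4 -vsync cfr -r 30 phone_clip_cfr.mp4
```

Match the target rate to your project settings so the new file drops onto the timeline without any further conversion.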
Try To Get Your Ducks in a Row
Probably the most effective strategy for dealing with mixed assets is to standardize all of your footage as best you can before you even add it to the library of your project.
If you have resolutions that need to be increased, do that separately in your editing software to see what the results are like.
If it looks OK, fine; if not then you may have to go with some kind of A.I. upscaling solution to get it in shape.
Either way treat it as a separate action so that your editing software and computer are not trying to deal with all that processing on the fly.
If your frame rates are off or variable then standardize them before you add them to a project.
This will result in way less stress on your editing computer and will enormously cut down on the chances of crashes or glitches in the process because it all got a bit hard!
Export and Compression – A Note on YouTube
When you upload a video to just about any online platform you don’t control like YouTube or Instagram etc., be aware that your footage will be re-processed.
They do this for a number of reasons but the two main ones are:
- A standardized set of video parameters across the platform to simplify things for them.
- Having higher to lower resolution versions of each video to enable a smooth streaming experience.
Just because you uploaded at 4K does not mean everyone is going to see your video at 4K.
The version they will see will depend on the speed of their internet connection and size of their display screen.
Having said that there is one unseen advantage to uploading at 4K.
This will prompt YouTube to use more recent and more advanced re-rendering models for your video and will maintain a higher level of quality for the lower resolution versions.
Audio Sync Issues – A Note on MP3
Changes in frame rates can lead to audio drift, where the audio goes slowly out of sync with the visuals as the video progresses, as I have already mentioned.
In addition to this direct result audio drift can also occur when the computer you are using is being pushed too hard to keep up with everything it has to do.
Too many resource heavy processes being called for at the same time can cause this to happen.
For example let’s say you have a cut from one clip to the next with a cross-fade transition in the middle.
That in itself is a pretty intensive action but then let’s say that those two clips are at different resolutions and different frame rates as well.
You win no prizes for guessing that there is a very good chance something is going to go wrong at that point!
Aside from that there is one other potential problem that is confined only to using MP3 audio files.
Usually when you add an audio file to the timeline it will play back at the speed determined by the editing software.
However, certain (not all) MP3 files have their own embedded timing information, which sometimes conflicts with the time code the software is working to.
The result will be out of sync audio or audio that slowly drifts out of sync over time.
The solution to this is to pre-process MP3 files through something like Handbrake to get them standardized before using them.
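One common way to do that pre-processing, shown here as a sketch with ffmpeg rather than the tool named above, is simply to decode the MP3 to uncompressed WAV, which leaves the editor looking at plain audio samples with no compressed-stream timing metadata (the filename is a placeholder):

```shell
# Decode the MP3 to WAV; the editor then times the plain samples itself.
ffmpeg -i voiceover.mp3 voiceover.wav
```

The WAV file will be larger, but for editing purposes that is a worthwhile trade against drifting audio.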
Software Resource Usage
Any upscaling or complex frame interpolations will consume significant system resources, potentially slowing down workflow.
Use these tools sparingly and prioritize capturing footage at the desired resolutions and frame rates, or, as I said earlier, pre-process the footage separately before using it.
A Word on Bitrates
The bitrate is the amount of digital information being used to represent each frame in a video and has direct bearing on the quality of the image you see.
Additionally, when the bitrate is very high you have the ability to successfully apply more aggressive color correction and manipulation to the video you are working on while maintaining quality.
Low bitrate footage cannot be adjusted too much or have effects applied without it all falling over!
The downside is that at very high bitrates you are dealing with files that are very large while the sheer amount of data can place a great deal of pressure on your editing computer.
The general rule is that you never want to try to increase the bitrate above what your files were already recorded at.
Essentially you are trying to make something out of nothing and there is no great advantage.
Conversely you can reduce the bitrate to bring your file sizes down and most editing software is pretty good at it.
Work out what your computer can handle and try to maintain the highest bitrate you can throughout the editing process.
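A quick back-of-the-envelope calculation shows why high bitrates balloon file sizes; the 50 Mbps figure here is just an illustrative assumption:

```shell
# Rough size of a 10-minute clip recorded at 50 Mbps (video stream only).
bitrate_mbps=50
seconds=$((10 * 60))
megabytes=$((bitrate_mbps * seconds / 8))   # divide by 8 to convert bits to bytes
echo "${megabytes} MB"                       # 3750 MB, roughly 3.7 GB
```

Double the bitrate and the file doubles too, which is exactly the pressure on storage and playback that the paragraph above describes.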
Conclusion
Managing mixed resolutions and frame rates is a handy skill in video editing, requiring attention to detail and knowledge of editing software capabilities.
By maintaining consistency in footage specifications, employing advanced software features wisely and understanding the implications of mixed assets you can deliver high-quality content.
Resources
Handbrake
For standardizing frame rates, file parameters and file types.
Nero Video Upscaler AI
For upscaling content to higher resolutions with minimal loss of quality.