Here's pretty much everything you need to know about frame rates when it comes to both shooting and editing videos.
Fortunately, these days the long-running argument over which frame rates should be used for which purposes has settled into an uneasy truce.
Having said that, it is still a subject worth understanding and getting right every time.
The Evolution of Frame Rates
Before delving into the technical aspects of frame rates, it’s essential to understand their historical context and significance.
Early Cinema and the Emergence of Frame Rates
The idea of frame rates originated from the basic principle of creating the illusion of motion by displaying a sequence of still images in rapid succession.
In the late 19th century, filmmakers and inventors sought to understand how humans perceive movement and then replicate that perception realistically.
In other words, how many images per second would it take to believably convey motion to the human eye?
A significant advancement occurred with the invention of the kinetoscope by Thomas Edison and William Kennedy Dickson in the 1890s.
While not a projector in the modern sense, this device introduced the concept of a film strip displaying consecutive images to create the illusion of motion.
This innovation ultimately led to the development of film as we understand it today.
The Silent Film Era Standard
By the early 1900s, with the rise of silent films, the first frame rate conventions emerged.
These early films typically ran at around 16-18 frames per second (fps), a rate determined by the hand-cranked cameras of the time.
This frame rate was deemed sufficient to create a smooth motion illusion while being economical in terms of film stock usage.
The Transition to Sound and the Adoption of 24 fps
The late 1920s saw the introduction of synchronized sound which, along with the advent of motor-driven cameras, significantly altered frame rate standards.
The integration of sound into films necessitated a more rigidly consistent frame rate to ensure audio synchronization.
This was especially important for dialogue: the variable recording and playback speeds of the old hand-cranked footage would produce strange shifts in the pitch and tone of speaking voices.
In 1928, the film industry established 24 fps as the standard for sound films.
Technical Considerations for 24 fps
The selection of 24 fps was a balanced decision aimed at:
- Ensuring an adequate frame rate for smooth audio reproduction using the audio technology of the time.
- Reducing film stock costs.
- Achieving a visually appealing motion effect.
This standard was largely influenced by the mechanical limitations of film cameras and sound recording technology of that era.
Television and Frame Rate Adaptation
The transition of frame rates to television introduced additional complexities, as different regions developed distinct standards based on their electrical systems.
NTSC (North America and Japan at 60Hz power)
The standard of 30 fps (technically 29.97 fps) was adopted to accommodate color television technologies.
Initially, the frame rate was 30 fps for black and white broadcasts, but the need to incorporate color information necessitated a slight reduction to 29.97 fps, allowing existing systems to remain unchanged.
PAL (Europe and many other regions at 50Hz power)
A frame rate of 25 fps was developed as an enhancement over NTSC, offering improved color reproduction.
Although this frame rate is slightly slower, it is considered more stable.
The Interlacing Solution
Both NTSC and PAL utilized interlaced scanning, where each frame was divided into two fields, further contributing to the evolution of frame rates in visual media.
This technique effectively doubled the perceived frame rate and reduced flicker, allowing for smoother motion at lower actual frame rates.
Interlaced scanning exploited the duration that a phosphor dot on the screen remained illuminated after being struck by the electron beam fired from the gun at the back of the tube.
Digital Video and Contemporary Frame Rates
The digital revolution has significantly expanded the possibilities for frame rates.
Here's a video exploring frame rates in general and, more importantly, some of the effects you can achieve by mastering the subject.
Digital Standards in Cinema
24 frames per second has remained the standard for cinema since 1927 following the successful introduction and subsequent dominance of the Warner Brothers Vitaphone system, which added sound to film.
As previously mentioned there were three primary reasons why 24 frames per second became the industry standard.
First, shooting a movie at 24 frames per second made the film stock a manageable expense for the production.
Second, the Vitaphone engineers were not thinking in frames per second; instead, they worked in feet of film per minute.
Since 35mm film carries 16 frames per foot, 90 feet per minute works out to exactly 24 frames per second (90 × 16 = 1,440 frames per minute, or 24 per second).
And finally, more by good luck than good judgment, 24fps captures motion in a way that closely matches how the human eye perceives movement.
It provides the closest approximation to what is known as “motion blur.”
Motion blur is the subtle blurring effect that occurs when we perceive motion in real life.
Without this effect, our visual experience can appear artificial.
Digital Video Standards
Before we get too far into the next part of this article relating to frame rates and bitrates, here's a video to give you a better understanding of what they are and how they apply to your video projects.
Currently, we have established a relatively new set of frame rate standards that reflect changes in the digital landscape and the equipment at our disposal.
First of all, interlacing has all but disappeared from modern standards, as today's screens no longer use the old Cathode Ray Tube technology.
All video these days is captured and presented as “Progressive,” meaning that each image in the sequence is presented in its entirety.
30 frames per second has become the standard for web video and broadcast content intended for consumption on digital devices.
60 frames per second is becoming increasingly common for capturing and presenting gaming, sports, and high-motion content.
24 fps has maintained its status as the preferred frame rate for achieving a “cinematic” look.
Higher frame rates, from 60fps all the way up to 240fps, are used for footage destined for slow-motion or even extreme slow-motion sequences whilst retaining image quality.
The evolution of frame rates has been a continuous process governed by technological limitations, economics, artistic expression and human visual perception.
From hand-cranked cameras to digital 8K systems, frame rates have been a critical element in how we capture and experience moving images.
Video Editing and Frame Rates
When it comes to video editing, adjusting frame rates and bitrates can greatly influence the quality, file size and overall viewing experience of your content.
Here’s a closer look at how each factor plays a role:
Frame Rate Considerations
Visual Smoothness
Higher frame rates, such as 60 fps compared to 30 fps, result in smoother motion, which is particularly beneficial for fast-paced scenes found in sports, gaming, and action films.
However, as the frame rate increases, the amount of natural-looking motion blur decreases, giving the shots a somewhat artificial appearance.
So if you are shooting video that does not contain an unusually high level of motion, 30fps will give more realistic footage.
If you are shooting high-action shots, then the higher frame rates will capture that action better and more realistically.
File Size and Performance
Increasing the frame rate leads to more data being processed per second, which can result in larger file sizes and higher demands on processing power during both editing and playback.
Artistic and Practical Aspects
- 24 fps is commonly used for a cinematic film appearance.
- 30 fps is the standard for most broadcast and online videos.
- 60 fps offers ultra-smooth motion for high-action sequences.
- Frame rates of 120 fps or higher allow dramatic slow-motion effects to be applied in post-production with no loss of image quality.
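To make that last point concrete, here's a minimal Python sketch (the function name and the numbers are just for illustration) of how the slow-motion factor falls out of the capture and timeline frame rates:

```python
def slowdown_factor(capture_fps: float, timeline_fps: float) -> float:
    """How many times slower footage plays when every captured frame
    is laid onto the timeline at the project frame rate."""
    return capture_fps / timeline_fps

# 120 fps footage on a 30 fps timeline plays at quarter speed: 4x slow motion.
print(slowdown_factor(120, 30))  # 4.0
# 60 fps footage on a 24 fps timeline gives a 2.5x slowdown.
print(slowdown_factor(60, 24))   # 2.5
```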
Bitrate Considerations
Video Quality
The bitrate has a direct impact on the image quality and compression of the video.
Higher bitrates retain more detail and minimize compression artifacts, while lower bitrates can lead to visible compression, blocky images, and loss of fine details.
File Size
- Higher bitrates result in larger file sizes.
- Lower bitrates yield smaller, more compressed files.
- A typical bitrate range is 5-20 Mbps for standard video and 20-50 Mbps for high-quality content.
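As a rough sanity check on those numbers, here's a back-of-the-envelope calculation in Python (it covers the video stream only; audio adds a little on top):

```python
def video_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Approximate video stream size in megabytes.
    Bitrate is in megabits per second, so divide by 8 for megabytes."""
    return bitrate_mbps * duration_s / 8

# A 10-minute clip (600 seconds) at the ends of the ranges above:
print(video_size_mb(5, 600))   # 375.0 MB  (standard video, low end)
print(video_size_mb(50, 600))  # 3750.0 MB (high-quality content, top end)
```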
Compression Efficiency
Modern codecs like H.264 and H.265 can deliver better quality at lower bitrates than older formats.
It's essential to find the right balance between codec choice and bitrate for optimal results.
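As a sketch of that balancing act, the snippet below shells out to ffmpeg from Python (assuming ffmpeg is installed and on your PATH; the file names and CRF value are placeholders). Constant Rate Factor (CRF) mode asks the encoder to hold a quality level and spend bits where the picture needs them, rather than pinning a fixed bitrate:

```python
import subprocess

# Re-encode with H.264 in CRF mode: lower CRF means higher quality
# (and a higher resulting bitrate). CRF 23 is libx264's default;
# values around 18-28 are the commonly used range.
subprocess.run([
    "ffmpeg",
    "-i", "input.mp4",       # placeholder source file
    "-c:v", "libx264",       # H.264; swap in "libx265" for H.265/HEVC
    "-crf", "23",            # quality target instead of a fixed bitrate
    "-preset", "medium",     # slower presets squeeze more quality per bit
    "-c:a", "copy",          # pass the audio stream through untouched
    "output.mp4",
], check=True)
```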
Practical Recommendations
- For web and streaming, aim for 1080p at 30 fps with a bitrate of 5-10 Mbps.
- For high-quality productions, consider 4K at 60 fps with a bitrate of 20-50 Mbps.
- For slow-motion effects, higher frame rates (60 fps and above) are advisable.
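Collected as data, those recommendations might look something like this (a simple lookup table; the preset labels are my own):

```python
# The recommendations above as a simple lookup table.
DELIVERY_PRESETS = {
    # Web and streaming: 1080p at 30 fps, 5-10 Mbps
    "web_streaming": {"resolution": "1920x1080", "fps": 30, "bitrate_mbps": (5, 10)},
    # High-quality productions: 4K at 60 fps, 20-50 Mbps
    "high_quality":  {"resolution": "3840x2160", "fps": 60, "bitrate_mbps": (20, 50)},
}

# For slow-motion work, capture at 60 fps or above whatever the target.
MIN_SLOW_MOTION_CAPTURE_FPS = 60
```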
Ultimately, the ideal settings will depend on your specific project, the platform you’re targeting and your quality requirements.
Experimenting with various combinations will help you achieve the best balance between quality, file size, and performance.
Key Video Processing Guidelines
- Maintain the same frame rate throughout the entire process from shooting to render.
For instance, if you record at 30 fps, ensure your project settings and final output are also set to 30 fps.
Converting 30 fps footage to 24 fps can often result in an unnatural appearance, particularly in motion.
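One way to catch a mismatch before it bites is to ask the footage itself what frame rate it was recorded at. This sketch queries ffprobe (part of the ffmpeg suite, assumed to be installed); the file name and project rate are placeholders:

```python
import subprocess
from fractions import Fraction

def get_frame_rate(path: str) -> float:
    """Return a clip's frame rate via ffprobe, which reports it as a
    fraction such as "30000/1001" (i.e. 29.97 fps) or "30/1"."""
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-select_streams", "v:0",
         "-show_entries", "stream=r_frame_rate",
         "-of", "csv=p=0",
         path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return float(Fraction(out))

# Warn about mismatched clips before you start cutting.
PROJECT_FPS = 30.0
clip_fps = get_frame_rate("clip.mp4")  # placeholder file name
if abs(clip_fps - PROJECT_FPS) > 0.1:
    print(f"Warning: clip is {clip_fps:.2f} fps but the project is {PROJECT_FPS} fps")
```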
- Capture at the highest quality and maintain that quality until the final render.
Each time you re-render a digital video file, some degradation may occur, affecting both bitrate and resolution.
Therefore, it’s best to preserve the highest quality throughout the editing process.
- You Can’t Create Something from Nothing – Resolution
Most video editing software offers the option to “upscale” your video.
For instance, you can import 1080p footage into a 1080p project and then render it in 4K, and the software won't complain.
However, this process essentially enlarges the images, much like inflating a balloon, which leads to a decrease in image quality.
The software cannot generate detail that was never captured in the first place.
To achieve effective upscaling without compromising image quality, it is essential to use specialized, AI-driven software that can intelligently interpret and enhance the footage.
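For comparison, here's what a conventional (non-AI) upscale looks like, again driving ffmpeg from Python (ffmpeg assumed installed; file names are placeholders). The scaler can only interpolate between existing pixels, which is exactly the balloon effect described above:

```python
import subprocess

# Conventional upscale from 1080p to 4K (3840x2160). Lanczos is one of
# the better traditional resampling filters, but it only interpolates
# between existing pixels; it cannot invent detail that was never captured.
subprocess.run([
    "ffmpeg",
    "-i", "input_1080p.mp4",                 # placeholder source
    "-vf", "scale=3840:2160:flags=lanczos",  # a resize filter, not AI enhancement
    "-c:a", "copy",
    "output_4k.mp4",
], check=True)
```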
- You Can't Create Something from Nothing – Bitrate
Similarly, increasing the bitrate of a project above the rate at which it was captured will do nothing other than make the final file size much bigger.
There will be no improvement in quality in any aspect of your video.
- YouTube and Other Online Platforms
Regardless of the quality at which you upload your videos to these platforms, they will always re-encode your files to their own standards and parameters.
They do this automatically to make sure that every video on the platform conforms to their own requirements.
If you upload a 4K video with the exact parameters they require, using the exact codec they ask for and at the exact bitrate they consider ideal… they will still re-encode it!
The 4K version will be re-encoded to 4K (their version) and at the same time copies will be made at 1080p, 720p, 480p, 360p and 240p resolutions.
There are a number of reasons for this, but the main one is that they need a selection of resolutions available so they can smoothly stream the video depending on the viewer's internet speed and device screen size.
So the rule is to upload at the highest resolution the original captured footage allows, at the full bitrate at which it was captured, and at the original frame rate.
This is the only way to minimize any degradation as a result of their re-encoding.
If you have any questions, feel free to use the Contact button; I am always happy to help.