What Frame Rate Does Most TV Use?

Hi there! Frame rate, or frames per second (fps), is one of the most important factors that determines the viewing experience on your television. In this guide, I'll explain the key frame rates used for TVs and video, how they differ, and help you choose the optimal TV frame rate for your needs.

Why Frame Rate Matters

The frame rate defines how many still images (frames) are displayed per second to create the illusion of smooth, continuous motion. The higher the frame rate, the smoother and sharper the video, especially for scenes with fast motion like sports.

However, higher frame rates require more bandwidth and processing power. So there are trade-offs depending on the use case. Let's look at the most common frame rates and when they're used:

24fps – The Film Standard

  • 24fps has been the standard frame rate for cinema since the 1920s. This was the lowest frame rate that could produce acceptable continuous motion.

  • At slower rates, motion starts to look jerky and unnatural. This is because the gap between frames becomes visible to the human eye.

  • The 24fps film standard creates appealing motion blur during fast movement. This gives footage a cinematic, movie-like look.

  • Almost all filmed content like movies and scripted shows use 24fps. It's ubiquitous in Hollywood and television.

  • 24fps provides a filmic look but isn't always optimal for live TV or gaming where smoother motion is preferred.

30fps – Live TV Standard

  • 30fps (technically 29.97fps, delivered as 60 interlaced fields per second) emerged as the standard for NTSC TV broadcasts in North America and Japan.

  • It was slightly faster than 24fps but still conserved bandwidth over the airwaves.

  • 30fps is still common today for live TV like news, talk shows, game shows, awards programs, etc. The extra 6fps makes motion smoother than 24fps while retaining a cinematic feel.

  • 30fps provides the best balance between quality and bandwidth for streaming too. Most online services like YouTube, Twitch, and Facebook target 30fps.

  • A note on judder: 30fps divides evenly into a 60Hz refresh rate, so it plays back smoothly. It's 24fps content that can display judder when panning on 60Hz TVs, because it must be fit to the display with an uneven 3:2 pulldown cadence.
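To see why pulldown matters, here's a minimal Python sketch (illustrative only, not any TV's actual algorithm) that spreads display refreshes across one second of content frames. For 24fps on a 60Hz screen it produces the classic uneven 3:2 cadence; for 30fps every frame is simply shown twice:

```python
def pulldown_cadence(content_fps, display_hz):
    """Distribute one second of display refreshes across content frames
    as evenly as possible. Returns how many refreshes each frame gets."""
    base = display_hz // content_fps   # minimum refreshes per frame
    extra = display_hz % content_fps   # leftover refreshes to spread around
    cadence = []
    acc = 0
    for _ in range(content_fps):
        acc += extra
        if acc >= content_fps:         # this frame gets one extra refresh
            acc -= content_fps
            cadence.append(base + 1)
        else:
            cadence.append(base)
    return cadence

print(pulldown_cadence(24, 60)[:6])   # [2, 3, 2, 3, 2, 3] -- uneven timing causes judder
print(pulldown_cadence(30, 60)[:6])   # [2, 2, 2, 2, 2, 2] -- perfectly even, no judder
```

The uneven 2-3-2-3 pattern means successive film frames stay on screen for different lengths of time, which the eye perceives as a slight stutter during camera pans.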

60fps – Smoother Motion and Action

  • As bandwidth and display capabilities improved, 60fps became popular for sports and video games where smoothness is key.

  • At 60fps, motion looks extremely fluid and natural to the human eye, with minimal blurring.

  • Slow-motion capabilities also improve hugely. 60fps footage can be replayed at half speed (30fps) while retaining sharpness.

  • Video game consoles and gaming PCs target 60fps or higher today. Higher frame rates shorten the delay between your input and the on-screen response.

  • However, 60fps provides a different look from cinema. Without motion blur, some viewers feel footage looks "too real" or like a videotape rather than a movie.
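Two quick back-of-the-envelope numbers follow directly from the frame rate. A small Python sketch (the function names here are my own, for illustration):

```python
def frame_time_ms(fps):
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

def slowmo_speed(capture_fps, playback_fps=30):
    """Fraction of real-time speed when high-fps footage is played
    back at a standard rate (e.g. 60fps replayed at 30fps = half speed)."""
    return playback_fps / capture_fps

print(round(frame_time_ms(60), 2))   # 16.67 -- each frame lasts ~16.7ms
print(slowmo_speed(60, 30))          # 0.5  -- half-speed replay
print(slowmo_speed(120, 30))         # 0.25 -- quarter-speed replay
```

This is why sports broadcasters shoot replays at 60fps or above: the higher the capture rate, the slower the replay can go before it starts to stutter.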

Beyond 60fps – The Cutting Edge

Higher frame rates beyond 60fps remain less common but provide an extraordinarily smooth experience:

  • 120fps – Supported on high-end gaming monitors and some TVs. Enables sharper slo-mo and hyper-realistic motion for fast gameplay. Input lag becomes nearly imperceptible. Still rare in video outside tech demos.

  • 144fps – Matches the 144Hz refresh rate common on gaming monitors, often 1440p models. Suits high frame rate gameplay from gaming PCs; 1440p at 144Hz requires less bandwidth than 4K/120 while still enabling fluid visuals.

  • 240fps – Found mainly on high-refresh gaming monitors; consumer TVs generally top out at 120Hz natively. Provides the maximum motion clarity but diminishing returns over 120fps.

Now that we've compared the most popular frame rates, let's examine how they are used based on video source and display equipment.

Frame Rates by Video Source

Movies – 24fps
TV shows – 24fps or 30fps
News/live events – 30fps
Soap operas – 30fps or 60fps
Sports – 30fps or 60fps
Video games – 30fps to 240fps
Web video – 24fps, 30fps, or 60fps

Frame rate needs differ based on the video content:

  • Filmed media like movies and shows target 24fps or 30fps to mimic cinema. Higher rates aren't needed for mostly static scenes.

  • Live TV uses 30fps or 60fps. Higher frame rates make unpredictable motion in live events smoother.

  • Sports and gaming opt for 60fps or beyond to minimize blur and input lag.

  • Web video quality depends on the platform and viewing device. YouTube and Twitch often stream at 30fps or 60fps.

Next, let's see how display equipment affects frame rate capabilities:

Frame Rates by Display

Standard HD TVs – 60Hz (supports 30fps video)
1080p HDTVs – 60Hz or 120Hz (supports 60fps)
4K TVs – 60Hz or 120Hz native (supports up to 120fps)
8K TVs – 120Hz (supports up to 120fps)

Note that "240Hz" and similar figures in TV marketing usually describe motion-smoothing effects, not the panel's native refresh rate.

The refresh rate of the display determines which video frame rates it can handle:

  • Older HD TVs were 60Hz, matching the 30fps NTSC standard.

  • 1080p HDTVs introduced 120Hz panels, which handle 60fps content easily and can also display 24fps film evenly (each frame shown five times) without pulldown judder.

  • Newer 4K and 8K TVs support 120Hz modes, enabling ultra-smooth 120fps gaming from current consoles and PCs.

But keep in mind, a high refresh rate TV alone isn't enough. The video source must also output a high frame rate over a modern HDMI or DisplayPort connection (4K at 120fps, for example, requires HDMI 2.1).
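The pairing of content and display boils down to one rule of thumb, sketched here in Python (a hypothetical helper, not a real API): playback is judder-free when the refresh rate is an integer multiple of the content frame rate.

```python
def plays_without_judder(content_fps, display_hz):
    """True if the display can show each content frame for the same
    number of refresh cycles -- i.e. no uneven pulldown is needed."""
    return display_hz % content_fps == 0

for fps in (24, 30, 60):
    for hz in (60, 120):
        verdict = "smooth" if plays_without_judder(fps, hz) else "needs pulldown"
        print(f"{fps}fps on a {hz}Hz display: {verdict}")
```

Running this shows why 120Hz panels are appealing for movie fans: 120 is an even multiple of 24, 30, and 60, so every common content rate plays back cleanly.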

Choosing the Right Frame Rate

When buying a new TV, choose a display frame rate that matches the frame rate of your desired content sources:

  • Movies/streaming – A 60Hz TV is fine here. Nearly all filmed content is 24p or 30p.

  • Sports and gaming – Look for 120Hz or 240Hz modes to enable smooth 60fps+ video. Make sure your sources output the required frame rates.

  • Web video – A 60Hz display is adequate. YouTube and Twitch currently top out at 60fps for the most part.
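The buying advice above can be condensed into a small Python sketch (the recommendation table is my own summary of this guide's rules of thumb, not an industry standard):

```python
# Minimum display refresh rate worth buying for each use case,
# per the guidelines above (hypothetical values for illustration).
RECOMMENDED_HZ = {
    "movies": 60,      # 24p/30p filmed content plays fine at 60Hz
    "web video": 60,   # most streams top out at 60fps
    "sports": 120,     # smooth 60fps+ broadcasts with headroom
    "gaming": 120,     # enables 120fps from modern consoles and PCs
}

def minimum_refresh_rate(uses):
    """Pick the highest requirement across everything you watch."""
    return max(RECOMMENDED_HZ[use] for use in uses)

print(minimum_refresh_rate(["movies", "web video"]))  # 60
print(minimum_refresh_rate(["movies", "gaming"]))     # 120
```

In short: buy for your most demanding content, since a higher-refresh display handles the slower sources too.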

While higher frame rates over 60fps deliver diminishing returns, they do provide a competitive edge for fast esports gaming. For a cinematic viewing experience, the classic 24p or 30p standards still hold up very well.

Hopefully this guide has helped explain the key frame rates used for different types of TV and video content today. Let me know if you have any other questions!
