When shopping for a TV or computer monitor, it’s easy to get overwhelmed by terms like progressive scanning, 4K Ultra HD, frame rates, and refresh rates. While those last two may sound interchangeable, there’s a subtle difference between them. That’s why we’ve put together a guide to the differences between refresh rate and FPS.
Like traditional film, digital video displays images as separate frames. Frame rate refers to the number of frames per second (FPS) that a television can display. These frames are displayed using either the interlaced scan method or the progressive scan method. Frame rates are often listed next to the video resolution. For example, a TV listed as 1080p/60 has a frame rate of 60 FPS.
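To make the numbers concrete, here’s a minimal Python sketch (the frame rates chosen are just common examples) showing how long each frame stays on screen at a given frame rate:

```python
# How long is each frame on screen? duration = 1 / frame rate
COMMON_FRAME_RATES = [24, 30, 60]  # frames per second (illustrative values)

for fps in COMMON_FRAME_RATES:
    frame_duration_ms = 1000 / fps  # milliseconds per frame
    print(f"{fps} FPS -> a new frame every {frame_duration_ms:.1f} ms")
```

At 60 FPS, a new frame arrives roughly every 16.7 milliseconds, which is why higher frame rates look smoother: the gap between successive images shrinks.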
TV manufacturers have introduced a number of features to improve frame rates. For example, some TVs use a technique called frame interpolation, in which the video processor generates new in-between frames from successive real frames for smoother motion. The downside of this effect is that movies shot on film can take on the hyper-smooth look of digital video, a side effect often called the “soap opera effect.”
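Real TV processors interpolate using motion estimation, which is far more sophisticated than anything shown here, but a simple blend of two frames conveys the basic idea of synthesizing an in-between image. The following is a toy sketch, not any manufacturer’s actual method; the function name and the tiny sample frames are invented for illustration:

```python
import numpy as np

def blend_frames(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Synthesize an in-between frame as a weighted blend (t in [0, 1]).

    Actual TVs use motion-compensated interpolation; a plain blend
    just illustrates creating a frame that never existed on film.
    """
    mixed = (1.0 - t) * frame_a.astype(np.float64) + t * frame_b.astype(np.float64)
    return mixed.round().astype(frame_a.dtype)

# Two tiny 2x2 grayscale "frames" standing in for real video frames
a = np.array([[0, 64], [128, 255]], dtype=np.uint8)
b = np.array([[64, 128], [192, 255]], dtype=np.uint8)
print(blend_frames(a, b, 0.5))  # the synthetic halfway frame
```

Inserting synthetic frames between the real ones is what raises the effective frame rate, and it is also what gives film the video-like smoothness described above.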
Because movies are traditionally shot on film at 24 frames per second, those original 24 frames have to be converted, through a process called 3:2 pulldown, to be displayed on a typical 60 Hz television screen. However, with the introduction of Blu-ray Disc and HD DVD players that can output a video signal at 24 frames per second, TV makers have implemented new refresh rates that are even multiples of 24, such as 120 Hz, to display these signals in the correct mathematical relationship.
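The arithmetic behind that relationship is simple: 24 doesn’t divide evenly into 60, so a 60 Hz screen has to repeat frames in an uneven 3-2-3-2 pattern, while a 120 Hz screen can show every film frame exactly five times. Here’s a short Python sketch of both cases (the numbers are the standard ones; the variable names are ours):

```python
FILM_FPS = 24  # standard film frame rate

# How many screen refreshes does each film frame get?
for refresh_hz in (60, 120):
    print(f"{refresh_hz} Hz / {FILM_FPS} fps = {refresh_hz / FILM_FPS} refreshes per frame")
# 60 Hz  -> 2.5 (uneven: frames must alternate 3 and 2 repeats)
# 120 Hz -> 5.0 (even: every frame is shown exactly 5 times)

# The 3:2 pulldown cadence for one second of film on a 60 Hz screen:
cadence = [3 if i % 2 == 0 else 2 for i in range(FILM_FPS)]
assert sum(cadence) == 60  # 12 frames x 3 + 12 frames x 2 = 60 refreshes
```

Because 120 divides evenly by 24, a 120 Hz TV can display 24-frame content without the slight judder that the uneven 3:2 cadence introduces.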