Refresh rate

The refresh rate (most commonly the "vertical refresh rate" or "vertical scan rate" for cathode ray tubes) is the number of times per second that the display hardware updates its buffer. This is distinct from frame rate in that the refresh rate includes the repeated drawing of identical frames, while the frame rate measures how often a video source can feed an entire frame of new data to a display.

For example, most movie projectors advance from one frame to the next 24 times each second, but each frame is illuminated two or three times before the next frame is projected, using a shutter in front of the lamp. As a result, the movie projector runs at 24 frames per second but has a 48 or 72 Hz refresh rate.

On cathode ray tube (CRT) displays, increasing the refresh rate decreases flickering, thereby reducing eye strain. However, if a refresh rate is specified that is beyond what is recommended for the display, damage to the display can occur.[1]

For computer programs or telemetry, the term is also applied to how frequently a datum is updated with a new external value from another source (for example, a shared public spreadsheet or hardware feed).

Cathode ray tubes

Electron beam in the process of scanning an image

In a CRT, the scan rate is controlled by the vertical blanking signal generated by the video controller, ordering the monitor to position the beam at the upper left corner of the raster, ready to paint another frame. It is limited by the monitor's maximum horizontal scan rate and the resolution, since higher resolution means more scan lines.

The refresh rate can be calculated from the horizontal scan rate by dividing the scanning frequency by the number of horizontal lines multiplied by 1.05 (since about 5% of the time it takes to scan the screen is spent moving the electron beam back to the top). For instance, a monitor with a horizontal scanning frequency of 96 kHz at a resolution of 1280 × 1024 results in a refresh rate of 96,000 ÷ (1024 × 1.05) ≈ 89 Hz (rounded down).
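This calculation is simple to express in code. The sketch below assumes the roughly 5% retrace overhead quoted above; the function name and figures are illustrative rather than part of any standard API:

    #include <stdio.h>

    /* Approximate vertical refresh rate of a CRT, given the horizontal scan
       frequency (in Hz) and the number of visible scan lines.  The 1.05 factor
       accounts for the roughly 5% of each frame spent on vertical retrace. */
    static double crt_refresh_rate(double horizontal_scan_hz, int visible_lines)
    {
        return horizontal_scan_hz / (visible_lines * 1.05);
    }

    int main(void)
    {
        /* 96 kHz horizontal scan at 1280 x 1024 gives about 89 Hz. */
        printf("%.1f Hz\n", crt_refresh_rate(96000.0, 1024));
        return 0;
    }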

CRT refresh rates have historically been an important factor in electronic game programming. Traditionally, one of the principles of video/computer game programming is to avoid altering the computer's video buffer except during the vertical retrace. This is necessary to prevent flickering graphics (caused by altering the picture in mid-frame) or screen tearing (caused by altering the graphics faster than the electron beam can render the picture). Some video game consoles such as the Famicom/Nintendo Entertainment System did not allow any graphics changes except during the retrace (the period when the electron guns shut off and return to the upper left corner of the screen).
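As an illustration of this principle, the sketch below busy-waits on a hypothetical vertical-blank status flag before copying a new frame into video memory. The register address and flag bit are invented for the example and do not correspond to any particular machine:

    #include <stdint.h>

    /* Hypothetical status register exposed by the display controller; the
       address and bit are made up for this sketch.  Real hardware exposes
       vertical blanking through a status register or an interrupt. */
    #define DISPLAY_STATUS   (*(volatile uint8_t *)0x4000)
    #define STATUS_IN_VBLANK 0x80

    static void wait_for_vblank(void)
    {
        /* Spin until the electron beam begins its vertical retrace. */
        while (!(DISPLAY_STATUS & STATUS_IN_VBLANK)) {
            /* busy-wait */
        }
    }

    /* Copy a completed frame into the video buffer only during retrace,
       so the beam never paints a half-updated picture. */
    void present_frame(const uint8_t *frame, uint8_t *video_buffer, int size)
    {
        wait_for_vblank();
        for (int i = 0; i < size; i++)
            video_buffer[i] = frame[i];
    }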

Contrary to popular belief, liquid-crystal displays (LCDs) are not free of these problems: although they do not flicker, it is still necessary to avoid modifying graphics data except during the retrace phase to prevent tearing from an image that is rendered faster than the display operates (LCDs typically refresh at 60 Hz).

CRTs have the unique ability to use light guns and pens. These are devices with a photosensor that detects the electron beam and sends a signal to the attached computer. This can be used to determine if a specific graphics object is on the screen. The light gun is a larger variant used in arcade games and some consoles. Unlike light pens, they are held at a distance from the screen.

Light pens and guns cannot be used on fixed-pixel displays because they have no electron beam to detect. Pen tablets and touchscreen LCDs are used as a substitute for them, but the latter require a specially-designed LCD panel and are mostly only found in point-of-service monitors. The Nintendo DS is an example of a video game system that has a touchscreen LCD.

Liquid-crystal displays

The refresh rate, or temporal resolution, of an LCD is the number of times per second that the display draws the data it is given. Since activated LCD pixels do not flash on and off between frames, LCD monitors exhibit no refresh-induced flicker, no matter how low the refresh rate. However, high refresh rates may result in visual artifacts that distort the image in unpleasant ways. High-end LCD televisions now feature refresh rates of up to 600 Hz, which requires advanced digital processing to insert additional interpolated frames between the real images to smooth the image motion. Such high refresh rates may not be supported by pixel response times, resulting in distorted images.
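The simplest conceivable form of such interpolation blends two real frames to synthesize one in between. Actual televisions use far more elaborate motion-compensated processing, so the snippet below is only a schematic sketch of the idea:

    #include <stdint.h>

    /* Synthesize an intermediate frame by blending two real frames pixel by
       pixel.  t is the position between them: 0.0 = previous frame,
       1.0 = next frame.  Real TVs use motion-compensated interpolation,
       not a plain blend like this. */
    void interpolate_frame(const uint8_t *prev, const uint8_t *next,
                           uint8_t *out, int num_pixels, float t)
    {
        for (int i = 0; i < num_pixels; i++)
            out[i] = (uint8_t)((1.0f - t) * prev[i] + t * next[i] + 0.5f);
    }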

For a refresh rate of 600 Hz to be displayed correctly, an LCD would require a response time of approximately 1.67 milliseconds (1 second ÷ 600) GtG (grey-to-grey). In addition to the technical difficulty of achieving such a high refresh rate, there are limits to what the human eye can perceive. However, improving the response time of LCD pixels would improve the image quality at refresh rates that are on the fringe of what the human eye is capable of processing.
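That figure is just the reciprocal of the refresh rate: each frame has 1/600 of a second in which to be drawn and to settle. A small sketch of the arithmetic (names illustrative):

    #include <stdio.h>

    /* Time available for one frame, in milliseconds, at a given refresh rate. */
    static double frame_budget_ms(double refresh_hz)
    {
        return 1000.0 / refresh_hz;
    }

    int main(void)
    {
        printf("600 Hz -> %.3f ms per frame\n", frame_budget_ms(600.0)); /* ~1.667 ms  */
        printf(" 60 Hz -> %.3f ms per frame\n", frame_budget_ms(60.0));  /* ~16.667 ms */
        return 0;
    }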

Computer displays

On smaller CRT monitors (up to about 15 in or 38 cm), few people notice any discomfort between 60 and 72 Hz. On larger CRT monitors (17 in or 43 cm or larger), most people experience mild discomfort unless the refresh is set to 72 Hz or higher. A rate of 100 Hz is comfortable at almost any size. However, this does not apply to LCD monitors. The closest equivalent to a refresh rate on an LCD monitor is its frame rate, which is often locked at 60 fps. But this is rarely a problem, because the only part of an LCD monitor that could produce CRT-like flicker—its backlight (if fluorescent; LEDs have no flicker)—typically operates at around 200 Hz.

Different operating systems set the default refresh rate differently. Microsoft Windows 95 and Windows 98 (First and Second Editions) set the refresh rate to the highest rate that they believe the display supports. Windows NT-based operating systems, such as Windows 2000 and its descendants Windows XP, Windows Vista and Windows 7, set the default refresh rate to a conservative rate, usually 60 Hz. The many variations of Linux usually set a refresh rate chosen by the user during setup of the display manager (although a default option is usually included with XFree86). Some fullscreen applications, including many games, now allow the user to reconfigure the refresh rate before entering fullscreen mode, but most default to a conservative resolution and refresh rate and allow the user to increase the settings afterwards in the options menu.

Old monitors could be damaged if a user set the video card to a refresh rate higher than the highest rate supported by the monitor. Some models of monitors display a notice that the video signal uses an unsupported refresh rate.

Dynamic refresh rate

Some LCDs support adapting their refresh rate to the current frame rate delivered by the graphics card. Two technologies that allow this are FreeSync and G-Sync.

Stereo displays

When LCD shutter glasses are used for stereo 3D displays, the effective refresh rate is halved, because each eye needs a separate picture. For this reason, it is usually recommended to use a display capable of at least 120 Hz, so that each eye still receives 60 Hz. Higher refresh rates result in greater image stability: for example, 144 Hz stereo yields 72 Hz per eye, and 180 Hz stereo yields 90 Hz per eye. Unfortunately, most computer graphics cards and monitors cannot handle these high refresh rates, especially at higher resolutions.
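Since left and right images simply alternate, the per-eye rate is half the display rate; the snippet below just restates that arithmetic for a few display rates (values illustrative):

    #include <stdio.h>

    /* With alternating shutter glasses, each eye sees every other refresh,
       so the effective per-eye rate is half the display's refresh rate. */
    static double per_eye_rate(double display_hz)
    {
        return display_hz / 2.0;
    }

    int main(void)
    {
        const double rates[] = { 120.0, 144.0, 180.0 };
        for (int i = 0; i < 3; i++)
            printf("%3.0f Hz stereo -> %3.0f Hz per eye\n",
                   rates[i], per_eye_rate(rates[i]));
        return 0;
    }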

For LCD monitors, pixel brightness changes are much slower than for CRT or plasma phosphors. Typically, LCD pixel brightness changes faster when voltage is applied than when voltage is removed, resulting in an asymmetric pixel response time. With 3D shutter glasses this can result in a blurry smearing of the display and poor depth perception, because the previous image frame does not fade to black fast enough as the next frame is drawn.

Televisions

The development of televisions in the 1930s was determined by a number of technical limitations. The AC power line frequency was used for the vertical refresh rate for two reasons. The first was that the television's vacuum tube was susceptible to interference from the unit's power supply, including residual ripple, which could cause drifting horizontal bars (hum bars). Using the same frequency reduced this and made any remaining interference static on the screen, and therefore less obtrusive. The second was that television studios used AC lamps, and filming at a different frequency would cause strobing.[2][3][4] Thus producers had little choice but to run sets at 60 Hz in America and 50 Hz in Europe. These rates formed the basis for the sets used today: 60 Hz System M (almost always used with NTSC color coding) and 50 Hz System B/G (almost always used with PAL or SECAM color coding). This accident of chance gave European sets higher resolution in exchange for lower frame rates. Compare System M (704 × 480 at 30i) and System B/G (704 × 576 at 25i). However, the lower refresh rate of 50 Hz introduces more flicker, so sets that use digital technology to double the refresh rate to 100 Hz are now very popular. (See Broadcast television systems.)

Another difference between 50 Hz and 60 Hz standards is the way motion pictures (film sources as opposed to video camera sources) are transferred or presented. 35 mm film is typically shot at 24 frames per second (fps). For 50 Hz PAL, this allows film sources to be transferred easily by accelerating the film by 4%. The resulting picture is therefore smooth; however, there is a small shift in the pitch of the audio. NTSC sets display both 24 fps and 25 fps material without any speed shifting by using a technique called 3:2 pulldown, but at the expense of introducing unsmooth playback in the form of telecine judder.
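The 3:2 pulldown cadence itself is easy to sketch: successive film frames are held for alternately three and two video fields, so four film frames (1/6 of a second at 24 fps) fill ten fields (1/6 of a second at 60 fields per second). The code below only prints that cadence and is not a description of any particular telecine implementation:

    #include <stdio.h>

    /* 3:2 pulldown: film frames A, B, C, D are held for 3, 2, 3, 2 video
       fields respectively, so 4 film frames at 24 fps map onto 10 fields
       at 60 fields per second. */
    int main(void)
    {
        const char film_frames[] = { 'A', 'B', 'C', 'D' };
        const int  fields_held[] = {  3,   2,   3,   2  };

        for (int f = 0; f < 4; f++)
            for (int k = 0; k < fields_held[f]; k++)
                printf("field shows film frame %c\n", film_frames[f]);
        return 0;
    }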

Similar to some computer monitors and some DVDs, analog television systems use interlacing, which decreases the apparent flicker by painting first the odd lines and then the even lines (these are known as fields). This doubles the refresh rate compared to a progressive scan image at the same frame rate. This works perfectly for video cameras, where each field results from a separate exposure: the effective frame rate doubles, with 50 rather than 25 exposures per second. The dynamics of a CRT are ideally suited to this approach: fast scenes benefit from the 50 Hz refresh, since the earlier field has largely decayed away by the time the new field is written, while static images benefit from improved resolution, as both fields are integrated by the eye. Modern CRT-based televisions may be made flicker-free in the form of 100 Hz technology.
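Conceptually, interlacing splits each frame into a field of odd-numbered lines and a field of even-numbered lines, painted on successive refreshes. The sketch below illustrates that split on a toy frame (dimensions and names are arbitrary):

    #include <stdio.h>

    #define LINES 6
    #define WIDTH 4

    /* Paint one field of an interlaced frame: start = 1 selects the
       odd-numbered lines, start = 0 the even-numbered ones.  Each field
       gets its own refresh, doubling the apparent rate for a given
       frame rate. */
    static void paint_field(const int frame[LINES][WIDTH], int start)
    {
        for (int y = start; y < LINES; y += 2) {
            printf("line %d:", y);
            for (int x = 0; x < WIDTH; x++)
                printf(" %d", frame[y][x]);
            printf("\n");
        }
    }

    int main(void)
    {
        int frame[LINES][WIDTH] = { { 0 } };
        paint_field(frame, 1);  /* odd field first, as described above */
        paint_field(frame, 0);  /* then the even field */
        return 0;
    }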

Many high-end LCD televisions now have a 120 or 240 Hz (current and former NTSC countries) or 100 or 200 Hz (PAL/SECAM countries) refresh rate. The rate of 120 Hz was chosen as the least common multiple of 24 fps (cinema) and 30 fps (NTSC TV), and allows for less distortion when movies are viewed due to the elimination of telecine (3:2 pulldown). For PAL at 25 fps, 100 or 200 Hz is used as a compromise, since the least common multiple of 24 and 25 is 600. These higher refresh rates are most effective with a 24p source video output (e.g. Blu-ray Disc) and/or scenes of fast motion.[5]
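These figures are just least common multiples of the source frame rates: a refresh rate that is an integer multiple of every source rate can show each source with a whole number of repeats per frame, and hence without uneven pulldown. A minimal sketch:

    #include <stdio.h>

    static long gcd(long a, long b) { return b ? gcd(b, a % b) : a; }
    static long lcm(long a, long b) { return a / gcd(a, b) * b; }

    int main(void)
    {
        /* A refresh rate that is an integer multiple of every source frame
           rate can display each source without uneven pulldown. */
        printf("lcm(24, 30) = %ld\n", lcm(24, 30));  /* 120 -> 120/240 Hz sets            */
        printf("lcm(24, 25) = %ld\n", lcm(24, 25));  /* 600 -> approximated by 100/200 Hz */
        return 0;
    }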

Displaying movie content on a TV

As movies are usually filmed at a rate of 24 frames per second, while television sets operate at different rates, some conversion is necessary. Different techniques exist to give the viewer an optimal experience.

The combination of content production, playback device, and display device processing may also introduce unwanted artifacts. A display device producing a fixed 60 fps rate cannot display a 24 fps movie at an even, judder-free rate. Usually, a 3:2 pulldown is used, giving slightly uneven movement.

While common multisync CRT computer monitors have been capable of running at even multiples of 24 Hz since the early 1990s, recent "120 Hz" LCDs have been produced for the purpose of having smoother, more fluid motion, depending upon the source material, and any subsequent processing done to the signal. In the case of material shot on video, improvements in smoothness just from having a higher refresh rate may be barely noticeable.[6]

In the case of filmed material, as 120 is an even multiple of 24, it is possible to present a 24 fps sequence without judder on a well-designed 120 Hz display (i.e., so-called 5-5 pulldown). If the 120 Hz rate is produced by frame-doubling a 60 fps 3:2 pulldown signal, the uneven motion could still be visible (i.e., so-called 6-4 pulldown).
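The difference between the two cadences can be made concrete: under 5-5 pulldown every film frame occupies five consecutive 120 Hz refreshes, while a frame-doubled 3:2 signal yields an alternating 6-4 pattern. The comparison below is purely illustrative:

    #include <stdio.h>

    /* Number of consecutive 120 Hz refreshes each 24 fps film frame occupies
       under the two cadences discussed above. */
    static void print_cadence(const char *name, const int *holds, int n)
    {
        printf("%s:", name);
        for (int i = 0; i < n; i++)
            printf("  frame %d x%d", i + 1, holds[i]);
        printf("\n");
    }

    int main(void)
    {
        const int even_cadence[]   = { 5, 5, 5, 5 };  /* judder-free 5-5 pulldown */
        const int uneven_cadence[] = { 6, 4, 6, 4 };  /* frame-doubled 3:2 (6-4)  */

        print_cadence("5-5 pulldown", even_cadence, 4);
        print_cadence("6-4 pulldown", uneven_cadence, 4);
        return 0;
    }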

Additionally, a display with motion interpolation capabilities can present material with synthetically created smoothness, which has an even larger effect on filmed material.

"50 Hz" TV sets (when fed with "50 Hz" content) usually get a movie that is slightly faster than normal, avoiding any problems with uneven pulldown.

References

This article is based on material taken from the Free On-line Dictionary of Computing prior to 1 November 2008 and incorporated under the "relicensing" terms of the GFDL, version 1.3 or later.
