Dots per inch

"dpi" redirects here. For pixel density, see Pixel density. For other uses, see DPI.

Monitors do not have dots, but do have pixels. The closely related concept for monitors and images is pixels per inch or PPI.

Old CRT-type video displays were almost universally rated in dot pitch, which refers to the spacing between the sub-pixel red, green and blue dots which made up the pixels themselves. Monitor manufacturers used the term "dot trio pitch", the measurement of the distance between the centers of adjacent groups of three dots/rectangles/squares on the CRT screen. Monitors commonly used dot pitches of 0.39, 0.33, 0.32, 0.29, 0.27, 0.25 or 0.22 mm (0.22 mm ≈ 0.0087 in).
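Dot pitch and per-inch density are reciprocals, so the pitches above can be converted with a one-line calculation. A minimal sketch (illustrative only, since trio pitch is often measured diagonally and is not a true PPI figure):

```python
# Convert a CRT dot (trio) pitch in millimetres to an approximate
# per-inch dot density. Smaller pitch = finer (denser) display.
MM_PER_INCH = 25.4

def pitch_to_dots_per_inch(pitch_mm: float) -> float:
    return MM_PER_INCH / pitch_mm

for pitch in (0.39, 0.27, 0.22):
    print(f"{pitch} mm -> {pitch_to_dots_per_inch(pitch):.0f} dots/in")
```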

LCD monitors have a trio of sub-pixels per pixel, which are more easily measured.

DPI measurement in printing

DPI is used to describe both the resolution (the number of dots per inch) of a digital print and the printing resolution of a hard-copy print. Printed output is also subject to dot gain, an increase in the size of the halftone dots during printing, caused by the spreading of ink on the surface of the media.

Up to a point, printers with higher DPI produce clearer and more detailed output. A printer does not necessarily have a single DPI measurement; it is dependent on print mode, which is usually influenced by driver settings. The range of DPI supported by a printer is most dependent on the print head technology it uses. A dot matrix printer, for example, applies ink via tiny rods striking an ink ribbon, and has a relatively low resolution, typically in the range of 60 to 90 DPI (420 to 280 µm). An inkjet printer sprays ink through tiny nozzles, and is typically capable of 300–720 DPI.[1] A laser printer applies toner through a controlled electrostatic charge, and may be in the range of 600 to 2,400 DPI.

The DPI measurement of a printer often needs to be considerably higher than the pixels per inch (PPI) measurement of a video display in order to produce similar-quality output. This is due to the limited range of colors available for each dot on a printer. At each dot position, the simplest type of color printer can either print no dot or print a dot consisting of a fixed volume of ink in each of four color channels (typically CMYK: cyan, magenta, yellow and black ink), giving 2^4 = 16 colors on laser, wax and most inkjet printers, of which only 14 or 15 (or as few as 8 or 9) may be actually discernible, depending on the strength of the black component, the strategy used for overlaying and combining it with the other colors, and whether it is in "color" mode.

Higher-end inkjet printers can offer 5, 6 or 7 ink colors giving 32, 64 or 128 possible tones per dot location (and again, it can be that not all combinations will produce a unique result). Contrast this to a standard sRGB monitor where each pixel produces 256 intensities of light in each of three channels (RGB).
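The tone counts above follow from treating each ink channel as a binary on/off choice per dot position; a small sketch of the arithmetic, contrasted with an 8-bit-per-channel sRGB monitor:

```python
# Number of distinct ink combinations per dot for a printer with
# binary (on/off) dots in n ink channels, vs. the number of colors a
# pixel can show on an 8-bit-per-channel RGB monitor.
def printer_tones(num_inks: int) -> int:
    # each ink is either deposited or not at a given dot position
    return 2 ** num_inks

def monitor_colors(bits_per_channel: int = 8, channels: int = 3) -> int:
    return (2 ** bits_per_channel) ** channels

print(printer_tones(4))   # CMYK: 16
print(printer_tones(7))   # 7-ink printer: 128
print(monitor_colors())   # 16777216
```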

While some color printers can produce variable drop volumes at each dot position, and may use additional ink-color channels, the number of colors is still typically less than on a monitor. Most printers must therefore produce additional colors through a halftone or dithering process, and rely on their base resolution being high enough to "fool" the human observer's eye into perceiving a patch of a single smooth color.

The exception to this rule is dye-sublimation printers, which can apply a much more variable amount of dye—close to or exceeding the number of the 256 levels per channel available on a typical monitor—to each "pixel" on the page without dithering, but with other limitations.

These disadvantages mean that, despite their marked superiority in producing good photographic and non-linear diagrammatic output, dye-sublimation printers remain niche products, and devices using higher resolution, lower color depth, and dither patterns remain the norm.

This dithered printing process could require a region of four to six dots (measured across each side) in order to faithfully reproduce the color in a single pixel. An image that is 100 pixels wide may need to be 400 to 600 dots in width in the printed output; if a 100×100-pixel image is to be printed in a one-inch square, the printer must be capable of 400 to 600 dots per inch to reproduce the image. Fittingly, 600 dpi (sometimes 720) is now the typical output resolution of entry-level laser printers and some utility inkjet printers, with 1200/1440 and 2400/2880 being common "high" resolutions. This contrasts with the 300/360 (or 240) dpi of early models, and the approximate 200 dpi of dot-matrix printers and fax machines, which gave faxed and computer-printed documents—especially those that made heavy use of graphics or colored block text—a characteristic "digitized" appearance, because of their coarse, obvious dither patterns, inaccurate colors, loss of clarity in photographs, and jagged ("aliased") edges on some text and line art.
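The dither-cell arithmetic above can be sketched in a few lines:

```python
# If each image pixel needs a dither cell of 4-6 printer dots per side,
# the printer DPI required to reproduce a given image PPI faithfully is
# simply the product of the two.
def required_dpi(image_ppi: int, dots_per_pixel_side: int) -> int:
    return image_ppi * dots_per_pixel_side

print(required_dpi(100, 4))  # 400 dpi for a coarse 4x4 dither cell
print(required_dpi(100, 6))  # 600 dpi for a finer 6x6 dither cell
```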

DPI or PPI in digital image files

In printing, DPI (dots per inch) refers to the output resolution of a printer or imagesetter, and PPI (pixels per inch) refers to the input resolution of a photograph or image. DPI refers to the physical dot density of an image when it is reproduced as a real physical entity, for example printed onto paper. A digitally stored image has no inherent physical dimensions, measured in inches or centimeters. Some digital file formats record a DPI value, or more commonly a PPI (pixels per inch) value, which is to be used when printing the image. This number lets the printer or software know the intended size of the image, or in the case of scanned images, the size of the original scanned object. For example, a bitmap image may measure 1,000 × 1,000 pixels, a resolution of 1 megapixel. If it is labeled as 250 PPI, that is an instruction to the printer to print it at a size of 4 × 4 inches. Changing the PPI to 100 in an image editing program would tell the printer to print it at a size of 10×10 inches. However, changing the PPI value would not change the size of the image in pixels which would still be 1,000 × 1,000. An image may also be resampled to change the number of pixels and therefore the size or resolution of the image, but this is quite different from simply setting a new PPI for the file.
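The stored-PPI example can be checked with a couple of lines; `print_size_inches` is a hypothetical helper name for illustration, not part of any file-format API:

```python
# The PPI value stored in an image file sets the intended print size
# without changing the pixel data: size (inches) = pixels / PPI.
def print_size_inches(pixels: int, ppi: int) -> float:
    return pixels / ppi

width_px = height_px = 1000     # a 1,000 x 1,000 px (1-megapixel) image
print(print_size_inches(width_px, 250))  # 4.0  -> prints 4 x 4 inches
print(print_size_inches(width_px, 100))  # 10.0 -> prints 10 x 10 inches
```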

For vector images, there is no equivalent of resampling an image when it is resized, and there is no PPI in the file because it is resolution independent (prints equally well at all sizes). However, there is still a target printing size. Some image formats, such as Photoshop format, can contain both bitmap and vector data in the same file. Adjusting the PPI in a Photoshop file will change the intended printing size of the bitmap portion of the data and also change the intended printing size of the vector data to match. This way the vector and bitmap data maintain a consistent size relationship when the target printing size is changed. Text stored as outline fonts in bitmap image formats is handled in the same way. Other formats, such as PDF, are primarily vector formats which can contain images, potentially at a mixture of resolutions. In these formats the target PPI of the bitmaps is adjusted to match when the target print size of the file is changed. This is the converse of how it works in a primarily bitmap format like Photoshop, but has exactly the same result of maintaining the relationship between the vector and bitmap portions of the data.

Computer monitor DPI standards

Since the 1980s, the Microsoft Windows operating system has set the default display "DPI" to 96 PPI, while Apple/Macintosh computers have used a default of 72 PPI.[2] These default specifications arose out of the problems rendering standard fonts in the early display systems of the 1980s, including the IBM-based CGA, EGA, VGA and 8514 displays as well as the Macintosh displays featured in the 128K computer and its successors. The choice of 72 PPI by Macintosh for their displays arose from the convenient fact that the official 72 points per inch mirrored the 72 pixels per inch that appeared on their display screens. (Points are a physical unit of measure in typography, dating from the days of printing presses, where 1 point by the modern definition is 1/72 of the international inch (25.4 mm), which makes 1 point approximately 0.0139 in or 352.8 µm.) Thus, the 72 pixels per inch seen on the display had exactly the same physical dimensions as the 72 points per inch later seen on a printout, with 1 pt in printed text equal to 1 px on the display screen. As it is, the Macintosh 128K featured a screen measuring 512 pixels in width by 342 pixels in height, and this corresponded to the width of standard office paper (512 px ÷ 72 px/in ≈ 7.1 in, leaving a 0.7 in margin down each side of 8.5 in × 11 in North American letter paper; the comparable European ISO sizes are A4 at 210 mm × 297 mm and B5 at 176 mm × 250 mm).
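The paper-width correspondence can be verified numerically:

```python
# Checking the Macintosh 128K figures from the text: a 512-pixel-wide
# screen at 72 PPI maps to about 7.1 inches, leaving roughly 0.7 inch
# margins on 8.5-inch-wide letter paper.
screen_width_px = 512
ppi = 72
paper_width_in = 8.5

width_in = screen_width_px / ppi             # ~7.11 in
margin_in = (paper_width_in - width_in) / 2  # ~0.69 in per side
print(f"{width_in:.2f} in wide, {margin_in:.2f} in margins")
```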

A consequence of Apple's decision was that the widely used 10-point fonts from the typewriter era had to be allotted 10 display pixels in em height, and 5 display pixels in x-height. This is technically described as 10 pixels per em (PPEm). This made 10-point fonts render crudely and made them difficult to read on the display screen, particularly the lowercase characters. Furthermore, there was the consideration that computer screens are typically viewed (at a desk) at a distance 1/3 or 33% greater than printed materials, causing a mismatch between the perceived sizes seen on the computer screen and those on the printouts.

Microsoft tried to solve both problems with a hack that has had long-term consequences for the understanding of what DPI and PPI mean.[3] Microsoft began writing its software to treat the screen as though it provided a PPI characteristic that is 4/3 of what the screen actually displayed. Because most screens at the time provided around 72 PPI, Microsoft essentially wrote its software to assume that every screen provides 96 PPI (because 72 × 4/3 = 96). The short-term gain of this trickery was twofold: fonts were drawn with a third more pixels, making them more legible, and their larger on-screen size compensated for the greater typical viewing distance of the screen.

Thus, for example, a 10-point font on a Macintosh (at 72 PPI) was represented with 10 pixels (i.e., 10 PPEm), whereas a 10-point font on a Windows platform (at 96 PPI) at the same zoom level is represented with 13 pixels (i.e., Microsoft rounded 13.3333 to 13 pixels, or 13 PPEm) – and, on a typical consumer-grade monitor, would have physically appeared around 15/72 to 16/72 of an inch high instead of 10/72. Likewise, a 12-point font was represented with 12 pixels on a Macintosh, and 16 pixels (or a physical display height of maybe 19/72 of an inch) on a Windows platform at the same zoom, and so on.[4] The negative consequence of this standard is that with 96 PPI displays, there is no longer a 1-to-1 relationship between the font size in pixels and the printout size in points. This difference is accentuated on more recent displays that feature higher pixel densities. This has been less of a problem with the advent of vector graphics and fonts being used in place of bitmap graphics and fonts. Moreover, many Windows software programs have been written since the 1980s which assume that the screen provides 96 PPI. Accordingly, these programs do not display properly at common alternative resolutions such as 72 PPI or 120 PPI. The solution has been to introduce two concepts:[3] the virtual screen, a notional screen to which software renders, and the logical PPI, the PPI that software assumes that screen provides.
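The point-to-pixel conversions in this passage all follow one formula, pixels = points × assumed PPI ÷ 72:

```python
# Point size to pixel size under an assumed display PPI, with the
# rounding to whole pixels described in the text.
def points_to_pixels(points: float, assumed_ppi: int) -> int:
    return round(points * assumed_ppi / 72)

print(points_to_pixels(10, 72))  # Macintosh: 10 px (10 PPEm)
print(points_to_pixels(10, 96))  # Windows: 13 px (13.333... rounded)
print(points_to_pixels(12, 96))  # Windows: 16 px
```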

Software programs render images to the virtual screen and then the operating system renders the virtual screen onto the physical screen. With a logical PPI of 96 PPI, older programs can still run properly regardless of the actual physical PPI of the display screen, although they may exhibit some visual distortion thanks to the effective 133.3% pixel zoom level (requiring either that every third pixel be doubled in width/height, or heavy-handed smoothing be employed).

How Microsoft Windows handles DPI scaling

Displays with high pixel densities were not common up to the Windows XP era; they became mainstream around the time Windows 8 was released. Display scaling by entering a custom DPI, irrespective of the display resolution, has been a feature of Microsoft Windows since Windows 95.[5] Windows XP introduced the GDI+ library, which allows resolution-independent text scaling.[6]

Windows Vista introduced support for programs to declare to the OS that they are high-DPI aware, via a manifest file or an API.[7][8] For programs that do not declare themselves DPI-aware, Windows Vista supports a compatibility feature called DPI virtualization: system metrics and UI elements are presented to applications as if running at 96 DPI, and the Desktop Window Manager then scales the resulting application window to match the DPI setting. Windows Vista retains the Windows XP-style scaling option, which when enabled turns off DPI virtualization for all applications globally. DPI virtualization is a compatibility option, as application developers are expected to update their apps to support high DPI without relying on it.

Windows Vista also introduced Windows Presentation Foundation (WPF). WPF applications are vector-based rather than pixel-based, and are designed to be resolution-independent. Developers using the old GDI API and Windows Forms on the .NET Framework runtime need to update their apps to handle high DPI and flag them as DPI-aware.

Windows 7 adds the ability to change the DPI with only a log off, not a full reboot, and makes it a per-user setting. Additionally, Windows 7 reads the monitor DPI from the EDID and automatically sets the system DPI value to match the monitor's physical pixel density, unless the effective resolution is less than 1024 × 768.
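A sketch of the kind of calculation involved (the function name and panel dimensions are illustrative, not the actual Windows API; EDID reports the display's physical size in millimetres):

```python
# Physical pixel density from EDID-style data:
# DPI = horizontal pixels / (screen width in inches).
MM_PER_INCH = 25.4

def dpi_from_edid(h_pixels: int, width_mm: float) -> float:
    return h_pixels / (width_mm / MM_PER_INCH)

# e.g. a hypothetical 1920-pixel-wide panel with a 344 mm active width
print(f"{dpi_from_edid(1920, 344):.0f} DPI")  # 142 DPI
```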

In Windows 8, only the DPI scaling percentage is shown in the DPI changing dialog; the display of the raw DPI value has been removed.[9] In Windows 8.1, the global setting to disable DPI virtualization (i.e., use only XP-style scaling) is removed, and a per-application setting is added that lets the user disable DPI virtualization from the Compatibility tab.[9] When the DPI scaling setting is higher than 120 PPI (125%), DPI virtualization is enabled for all applications unless an application opts out by setting the DPI-aware flag to "true" in its manifest. Windows 8.1 also adds the ability for different displays to use independent DPI scaling factors, although it calculates this automatically for each display and turns on DPI virtualization for all monitors at any scaling level.

Windows 10 adds manual control over DPI scaling for individual monitors.

Proposed metrication

There are some ongoing efforts to abandon DPI as an image resolution unit in favor of metric units, giving the resolution in dots per centimetre (px/cm or dpcm), as used in CSS3 media queries,[10] or the inter-dot spacing in micrometres (µm).[11] A resolution of 72 DPI, for example, equals a resolution of about 28 dpcm or an inter-dot spacing of about 350 µm. In BMP images, 2835 pixels per metre correspond to 72 DPI (rounded from 2834.6472).[12]
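The conversions behind these figures, sketched in Python:

```python
# DPI <-> metric conversions: dots per centimetre and inter-dot pitch.
MM_PER_INCH = 25.4

def dpi_to_dpcm(dpi: float) -> float:
    return dpi / (MM_PER_INCH / 10)    # 2.54 cm per inch

def dpi_to_pitch_um(dpi: float) -> float:
    return MM_PER_INCH * 1000 / dpi    # inter-dot spacing in micrometres

for dpi in (72, 96, 300, 2540):
    print(dpi, round(dpi_to_dpcm(dpi)), round(dpi_to_pitch_um(dpi)))
```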

Conversion table

  DPI (dot/in)   dpcm (dot/cm)   Pitch (µm)
  72             28              350
  96             38              265
  150            59              169
  300            118             85
  2540           1000            10
  4000           1575            6

References

  1. Ask OKI—"Inkjet Printers"
  2. Hitchcock, Greg (2005-10-08). "Where does 96 DPI come from in Windows?". Microsoft Developer Network Blog. Microsoft. Retrieved 2009-11-07.
  3. Hitchcock, Greg (2005-09-08). "Where does 96 DPI come from in Windows?". blogs.msdn.com. Retrieved 2010-05-09.
  4. Connare, Vincent (1998-04-06). "Microsoft Typography – Making TrueType bitmap fonts". Microsoft. Retrieved 2009-11-07.
  5. Where does 96 DPI come from in Windows?
  6. Why text appears different when drawn with GDIPlus versus GDI
  7. "Win32 SetProcessDPIAware Function".
  8. "Windows Vista DPI Settings".
  9. High DPI Settings in Windows
  10. "Media Queries".
  11. "Class ResolutionSyntax". Sun Microsystems. Retrieved 2007-10-12.
  12. Kateryna Yuri. "Convert dot/meter [dot/m] <> dot/inch [dpi]". TranslatorsCafé.com. Retrieved 2015-01-27.

This article is issued from Wikipedia (version of 11/10/2016). The text is available under the Creative Commons Attribution/Share-Alike license; additional terms may apply for the media files.