When We Say We Want “Resolution”: DPI and PPI Explained

This is the next post in a series supporting the publication of 36 CFR Part 1236 Subpart E – Digitizing Permanent Records. All of the posts have been collected under the 36 CFR Section 1236 category.

Photo imagery interpreter SGT Ted Johnson identifies a target as SSGT Doug Lucia plots it during the 1988 Worldwide Reconnaissance Air Meet (RAM ’88). National Archives Identifier: 6448156

In digitization, people often assume "resolution" alone equates to image quality. However, the FADGI Glossary defines it more specifically as how well an imaging system reproduces finely spaced detail. With the release of 36 CFR Part 1236 Subpart E – Digitizing Permanent Records, it's important to understand how resolution relates to overall image quality and how the terms associated with it are used in the regulation. This blog post aims to clarify two of those terms: "dpi" and "ppi."

Understanding the Misconception

Many digitization standards focus on certain file attributes, such as resolution, color space, file format, and bit depth, to define image quality. However, these attributes alone don't guarantee high quality; they simply serve as specifications for the digital file. The most common misconception involves resolution, which in digitization is usually measured in pixels per inch (ppi). People often think that a higher ppi number guarantees better quality, but what truly counts is how well the scanner or camera captures detail through its optical and signal-processing capabilities. Image quality can be degraded by many factors, including poor optics, noise, misregistration, and pixelation.

DPI and PPI Defined

Today, people use dpi and ppi interchangeably, though they are different measures. To eliminate confusion about resolution, we should understand the distinction between them. DPI, short for dots per inch, tells us how many dots of ink a printer places within one inch of space. The term became popular in the early years of digital imaging, when imaging staff estimated digital image specifications to match a desired printed output. For example, 300 dpi became a commonly adopted digitization specification because it approximated the human eye's ability to perceive detail in an 8 x 10-inch print held at arm's length.
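The arithmetic behind that 300 dpi rule of thumb is simple: the sampling rate multiplied by the physical dimensions gives the pixel array of the file. The short sketch below is illustrative only, using the 8 x 10-inch, 300 ppi figures from the example above; it says nothing about how much detail a device actually captures.

```python
# Illustrative arithmetic only: the pixel dimensions implied by a given
# sampling rate and original size. These numbers describe the file,
# not the optical quality of the capture.

def pixel_dimensions(width_in, height_in, ppi):
    """Return the pixel array (width, height) implied by a physical size and sampling rate."""
    return round(width_in * ppi), round(height_in * ppi)

if __name__ == "__main__":
    w, h = pixel_dimensions(8, 10, 300)
    print(f"An 8 x 10 inch original sampled at 300 ppi yields {w} x {h} pixels "
          f"({w * h / 1e6:.1f} megapixels).")
```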

Scanner manufacturers have contributed to the confusion by labeling sampling-rate settings "dpi" in their scanner control software. The term "ppi," or pixels per inch, is used by software such as Photoshop to describe the pixel array of a digital file or the pixel density of a computer monitor. It's important to note that simply increasing the sampling rate doesn't necessarily result in better quality if the scanner's optics and signal processing cannot actually resolve that level of detail. Scanners differ widely in their ability to capture detail.
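In other words, ppi is simply the ratio of pixels to physical inches. The sketch below makes that relationship concrete with made-up numbers; the "effective ppi" it computes is plain arithmetic about the file, not a measurement of captured detail.

```python
# A minimal sketch of the relationship ppi = pixels / inches.
# The figures are hypothetical and for illustration only.

def effective_ppi(pixel_width, original_width_in):
    """Nominal pixels per inch relative to the physical width of the original."""
    return pixel_width / original_width_in

# A letter-size page (8.5 inches wide) scanned into a 3400-pixel-wide file:
print(effective_ppi(3400, 8.5))   # 400.0 ppi nominal sampling rate

# Re-tagging the same file as 600 ppi adds no pixels and no detail;
# only a device that can resolve finer detail would improve the image.
```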

36 CFR 1236.46 requires agencies to confirm the performance of their equipment by scanning an ISO-compliant test target and evaluating the results with analytical software such as OpenDICE. This evaluation analyzes the optical and signal processing of the device. True optical resolution is determined not by the scanner settings, but by how well the device resolves the fine detail of a test target.
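Conceptually, that kind of target-based evaluation lets you compare the sampling rate a device claims with the resolution it actually delivers. The sketch below is only a conceptual illustration of that comparison, not OpenDICE's actual analysis; the measured value is hypothetical.

```python
# Conceptual sketch, not OpenDICE: comparing a device's nominal sampling
# rate with the resolution measured from a test target. The 320 ppi
# "measured" value below is hypothetical.

def sampling_efficiency(measured_resolution_ppi, nominal_ppi):
    """Fraction of the nominal sampling rate the device actually resolves."""
    return measured_resolution_ppi / nominal_ppi

# A scanner set to 400 ppi whose target analysis shows detail resolved
# only up to about 320 ppi is delivering roughly 80% of its nominal rate:
print(f"{sampling_efficiency(320, 400):.0%}")   # 80%
```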