For images viewed on computer screens, scan resolution merely determines image size. The bottom line is that dpi or ppi means pixels per inch, which means that if you scan 6 inches at 100 dpi (or 1 inch at 600 dpi), you will create 600 pixels, which will display on any screen as 600 pixels in size.
We think of greater resolution as showing more detail, and while that's generally true (within reasonable limits), it's because it makes the image larger. But we are always greatly limited by our output device, and often cannot take advantage of maximum resolution. The images are huge, and our screens are simply not large enough.
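As a minimal sketch of that arithmetic (the function name here is just illustrative):

```ts
// Pixels created by a scan = inches scanned × scan resolution (dpi).
function scannedPixels(inches: number, dpi: number): number {
  return inches * dpi;
}

console.log(scannedPixels(6, 100)); // 600 pixels
console.log(scannedPixels(1, 600)); // also 600 pixels, shown as 600 pixels on any screen
```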
If you don't know your screen size, then in Windows, the Start - Settings - Control Panel - Display icon - Settings tab will show or change it. On the Macintosh, see the Apple Monitor control panel. Screen size settings for LCD monitors should match the actual hardware native size.
Popular CRT video screen size settings historically have been:
640x480 or 800x600 pixels for 14 inch monitors
800x600 or 1024x768 pixels for 15 inch monitors
1024x768 or 1152x864 pixels for 17 inch monitors
1152x864 or 1280x1024 pixels for 19 inch monitors
Newer LCD monitors are wide screen, and commonly up to 1920x1080 pixels now. Larger pixel heights typically make the text appear smaller, but we can specify larger text size in Windows (which does not affect image sizes).
Times have changed since much of this was written, and the explanations are not so simple now.
Browsers specify a pixel ratio for the browser's page enlargement. So in both Chrome and Firefox, the initial display enlarges the whole page, both text and (today) also images. Zooming in treats the screen as larger than the hardware size, but the part we see is enlarged.
Chrome reports the "screen size" as including the browser zoom factor, but shows the zoomed size as the actual hardware size (that could just be my interpretation of it, of which I am unsure). I can see the argument that it describes that we can only see the hardware-sized part of it.
Firefox shows it the way I interpret the words, reversed from Chrome.
On this page (which sets the viewport to device settings), an iPhone always reports the screen size to be the viewport size, which is fixed, independent of the actual window size. When the pixel ratio is 2, the image pixel size is doubled on the hardware screen (using 4 hardware pixels per image pixel, so to speak), which is what zoom does. However, a web page's "meta viewport" tag can give the mobile browser other starting instructions.
But manually zooming any browser affects the whole page. Browsers zoom the whole page, including the images, when we zoom away from 100% size, but (except for Firefox) NOT when we only set a larger text size in Windows, which applies only to text.
So on video monitors, we don't all see the same size of things. Screens are different sizes, in inches and in pixels (resolution), and any browser zooming does affect the whole page.
Browsers and devices do work differently, but the reported screen size is affected by the current operating system and browser zoom settings. For desktop browsers (which vary), zooming larger (including the Windows text size zoom) reduces the virtual screen size, which is then enlarged to the device screen size; that enlargement is the zoom.
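As a quick sketch, these standard browser properties show the values being described; you can paste this into any browser console (the reported numbers vary with browser, OS scaling, and zoom, exactly as described above):

```ts
// Read what the browser reports; values change with OS scaling and page zoom,
// and Chrome and Firefox interpret "screen size" differently (see above).
const report = {
  screen: `${window.screen.width}x${window.screen.height}`, // reported screen size
  viewport: `${window.innerWidth}x${window.innerHeight}`,   // current window/viewport size
  pixelRatio: window.devicePixelRatio,                      // e.g. 2 on many HiDPI screens
};
console.log(report);
// viewport × pixelRatio approximates the hardware pixels currently in view.
```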
For a size comparison, the red image below is exactly 500 pixels wide (IF your browser window width is at least 500 pixels). The image width is reduced to fit window widths less than 500 pixels, and the screen calculation will show that reduced width. So the white box just above, about the screen size, knows about the size change, but the yellow text within the image does not know about the resize. 😊
Our monitor screens show pixels directly. Images are dimensioned in pixels, and screens are dimensioned in pixels, and video systems show pixels directly, one for one. Bottom line, if we have a 500x200 pixel image, scaled to 100 dpi to print 5x2 inches on paper, on the screen it will still simply show as 500x200 pixels. Zoom certainly is a complication (showing different new resampled pixels then), but 100% Actual size is very obvious. The point is important because that is simply how video works, and it's good to know.
You do need to realize that each of us will see this 500 pixel image width at a different apparent size on our different monitor screens. We don't necessarily see the same thing. On a desktop video screen, we may see the 500 pixel image anywhere between 4 and 8 inches wide, depending on our screen size and settings.
But different browsers can show it differently too.
Most desktop monitors will be in this general ballpark (almost 6 inches), but a 15 inch monitor set to 640x480 pixels may see it at 8 inches wide, or more. A laptop with 1600x1200 pixel display may see it at 4 inches wide, or less. So that's a very wide size range for the same image on different screens, and it shows that there is no concept of exact size in inches on the screen. We don't all see the same size on the screen. Screens show pixels directly as a dot of color, and their size depends on monitor size and resolution. If the screen is set to show 1680x1050 pixels, then it WILL show 1680x1050 pixels.
So it will be quite necessary to forget about inches on the screen, because video only shows pixels, and screens differ in size. The only correct answer possible about video image size is that every screen will always show 500 pixels as exactly 500 pixels (assuming we view images at 100% Actual size; otherwise we are seeing other, different pixels). Images are dimensioned in pixels, and screens are dimensioned in pixels, and these 500 pixels will fill 500 pixels on any screen, but those same 500 pixels will fill a different area in inches on different size screens.
Inches simply have no meaning on the computer screen; we all see something different. Inches are not defined in the video system. There is no concept of dpi in the video system either. The way video works is that when you set your video settings to say 1024x768 pixels, then that 1024x768 pixels of memory on your video board defines your video system. The programs you use will copy your pixels directly into that 1024x768 pixel video memory. One image pixel goes into one video board pixel memory location, one for one. A 500x200 pixel image fills 500x200 of those 1024x768 pixels. Those 1024x768 pixels are output to your screen, regardless of the size of the glass tube attached. Video is only about those 1024x768 pixels (or whatever the current setting is).
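As a conceptual sketch of that one-for-one copy (invented names, not real driver code; note there are no inches or dpi anywhere in it):

```ts
// A 1024x768 screen setting is just 1024x768 pixels of video board memory.
const SCREEN_W = 1024, SCREEN_H = 768;
const framebuffer = new Uint32Array(SCREEN_W * SCREEN_H);

// Copy an image into the framebuffer at (x0, y0), one pixel per memory location.
function blit(image: Uint32Array, imgW: number, imgH: number, x0: number, y0: number): void {
  for (let y = 0; y < imgH; y++) {
    for (let x = 0; x < imgW; x++) {
      framebuffer[(y0 + y) * SCREEN_W + (x0 + x)] = image[y * imgW + x];
    }
  }
}

// A 500x200 pixel image fills exactly 500x200 of the 1024x768 screen pixels.
blit(new Uint32Array(500 * 200), 500, 200, 0, 0);
```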
Unfortunately, we frequently hear how 72 dpi or 96 dpi images are somehow important for the video screen, but that's totally wrong. Video simply doesn't work that way. Video systems have no concept of inches or dpi. No matter what dpi value may be stored in your image file (like 300 dpi for printing), your video system totally ignores it, and always just shows the pixels directly. The truth of this should be clearly apparent if you simply watch what it does.
The screen width in pixels ÷ the screen width in inches is the dpi resolution of the screen.
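As a minimal worked example of that division (the 24-inch 1920x1080 monitor here is just an assumed illustration):

```ts
// Screen dpi = screen width in pixels ÷ screen width in inches.
function screenDpi(widthPixels: number, widthInches: number): number {
  return widthPixels / widthInches;
}

const dpi = screenDpi(1920, 20.9); // a 24-inch 16:9 screen is about 20.9 inches wide: ≈ 92 dpi
console.log(`a 500 pixel image spans about ${(500 / dpi).toFixed(1)} inches there`); // ≈ 5.4
```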
DCI and SMPTE are cinema specifications. ATSC is the North American digital television specification (the digital equivalent of the analog NTSC used for CRT). CS for Cinemascope is a label I made up for this chart.
**Digital TV video screen sizes**

| Standard | Pixel size | Aspect ratio | Notes |
|---|---|---|---|
| ATSC HDTV | 1280x720 pixels | 16:9 | 720p HD |
| ATSC HDTV | 1920x1080 pixels | 16:9 | 1080i Full size HD |
| wide | 1366x768 pixels | 16:9 | Early wide screens |
| DCI 2K | 2048x1080 pixels | 1.90:1 | ~17:9 |
| DCI 2K CS | 2048x858 pixels | 2.39:1 | Cinemascope crop |
| SMPTE UHD | 3840x2160 pixels | 16:9 | Ultra HD * |
| DCI 4K | 4096x2160 pixels | 1.90:1 | 2160p ~17:9 |
| DCI 4K CS | 4096x1716 pixels | 2.39:1 | Cinemascope crop |
| SMPTE 8K | 7680x4320 pixels | 16:9 | |
* UHD (Ultra HD) is commonly called 4K, but they are technically different formats, although fairly close in size (about a 6.7% difference between the 4096 and 3840 pixel widths). The actual binary 4K number may be 4096, but UHD is what is available (in some markets). The DCI 2K and 4K aspect ratio is 1.896:1 (256:135, approximately 17.06:9; if 4096x2304, it would be 16:9). But in the 4K market, UHD is the consumer standard, at 16:9 aspect ratio (1.778:1). UHD dimensions are 2x those of 1920x1080 (4x more total pixels), which maintains the same 16:9 aspect shape. Some streaming and satellite channels and some Blu-ray discs do offer UHD media (often called 4K), but no broadcast or cable TV channels offer 4K or UHD. Wikipedia has a list of resolutions used by USA broadcast and cable channels. All broadcast stations are 1280x720 or 1920x1080 (USA broadcast networks: ABC and Fox are 720p, and CBS, NBC and PBS are 1080i; cable channels vary between these). 4K TVs are UHD, might be called 4K UHD, and are spec'd as 2160p.
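A quick check of the footnote's arithmetic (the numbers come straight from the table above):

```ts
// DCI 4K width vs consumer UHD width: about 6.7% difference.
console.log((((4096 / 3840) - 1) * 100).toFixed(1) + "%"); // "6.7%"
// DCI aspect ratio: 2048/1080 = 4096/2160 ≈ 1.896 (~17:9).
console.log((4096 / 2160).toFixed(3)); // "1.896"
// UHD keeps the 16:9 shape of 1920x1080, just doubled in each dimension.
console.log((3840 / 2160).toFixed(3)); // "1.778"
```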
These computed screen dpi values are merely calculations after the fact, computing the apparent size of the existing image pixels. The same image will be shown on any connected monitor. We can say this calculation is the apparent resolution of the result on one specific screen, but the video system does NOT know about those dpi numbers, and did not use any such dpi or inch numbers to show the image on the screen. The numbers that matter for showing images on our screens are the screen size numbers in the first column. If we have an 800x600 pixel screen, that is what we see, and then we know exactly how a 400x300 pixel image will fit on it.
That screen size in pixels is called the resolution of the video system, and it is selected and named "Screen resolution" in the Windows Control Panel ("Display"). Today's LED monitors are digital and can specify their actual native default, which should be retained and used. This specified-size image is created by the OS in your video card memory. Then this video card image is shown on your monitor, regardless of any monitor details (and if LED, it's good if the two pixel grids match). We needed a size combination producing about 70 dpi for a CRT screen to look decent, and 80 dpi was better. Today many LED screens are likely about 100 dpi. This chart shows why the larger screen sizes like 1600x1200 pixels were called "high resolution". More pixels (if actually existing) in the same screen area are smaller pixels, with more potential for detail on the overall screen, meaning we could then show a larger image, with more pixels and more detail.
When we show a too-big image (larger than our viewing screen or window; everything is dimensioned in pixels), our viewing software normally shows us a temporary, quickly resampled copy instead, small enough to fit on the screen so we can see it, for example perhaps 1/4 actual size (this fraction is normally indicated to us in photo editors, so we know it is not the full-size real data). We still see the pixels of that smaller image presented directly, one for one, on the screen, which is the only way video can show images. When we edit it, we change the corresponding pixels in the original large image data, but we still see a new smaller resampled copy of those changes, and the screen is still showing us that resampled copy at its apparent computed resolution. Video simply shows pixels one for one on the screen.
About what we see:
Your photo editor program normally automatically resamples a too-large image to be smaller so it will fit into the program's window. Then we only see the smaller copy on the screen, new different pixels, not the original pixels. It also provides a View or Zoom menu, so we can create the copy at any size we wish, without affecting the original data. The window title bar will show the size reduction ratio, as a warning that this screen copy is not the real image data. For example, the title bar might indicate we are viewing a copy at 33% size, or it might say a 1:3 ratio of real size (1:3 is also 33% size). We only see the actual original pixels when it says 100% or 1:1 size.
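A minimal sketch of how a viewer might compute that reduction (the function name is hypothetical; actual editors vary in rounding and labels):

```ts
// Largest scale that fits the image into the window, never above 100% (1:1).
function fitScale(imgW: number, imgH: number, winW: number, winH: number): number {
  return Math.min(winW / imgW, winH / imgH, 1);
}

const scale = fitScale(3000, 2000, 1000, 700); // 0.333..., displayed as "33%" or "1:3"
console.log(`${Math.round(scale * 100)}% of Actual size`);
```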
A second situation is page layout programs (like MS Word, Publisher, Acrobat, InDesign, PageMaker, Quark). These handle images differently than a photo editor. The very least a photo editor can create is a one-pixel image; its purpose is to create images. But page layout programs have one purpose: to design and print paper documents. The very least a page layout program can create is a blank page, which at minimum specifies a paper page dimension in inches. This is very different; it is totally about that page of paper (but people get confused about this). We add text and images to that document to fill inches of printed paper. Page layout programs necessarily do show our document on the video screen, but what we see is an image replica of that page of paper. It may have other embedded images filling areas on that paper page, which are resampled very small to fit their allotted space in the image of the full page we see on the screen. Again, we have a Zoom menu to show that image of the page at any size we wish.
In both of these cases, we only see the smaller image copy on the screen (different new pixels), but we print the larger image data using the original pixels. Both cases provide a View or Zoom menu, so we can show the images at any size we wish on the screen without affecting the original data. The point is that these are not exceptions, because the video system shows these new resampled pixels in the only way it can: directly, one for one. However, the new image size in pixels is not normally specified to us. Every dimension number we see still pertains to the original size data (or the size on printed paper), but that is not necessarily about the image pixels we see on the screen (unless we view 100% Actual size).
It also means that when you want to evaluate your image critically, be sure to view the image at 100% Actual size (even if you must scroll around on it), so you are seeing the genuine image pixels that will print, and not a rough resampled temporary copy.
The screen is typically larger than our photographs, so enlargement is often used to show a snapshot photo. We often scan at higher resolutions to fill more of the screen. When we increase scan resolution, we get more pixels, so it increases the image size. But a little goes a long way, and there's no advantage in wrestling with overly huge images just to discard most of the pixels when we display them. So don't scan at 300 dpi or 600 dpi when there's no purpose for it.
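As a rough sketch of that planning (the target numbers are example assumptions, not rules):

```ts
// Scan resolution needed for screen viewing: desired pixels ÷ inches to be scanned.
function scanDpiFor(targetPixels: number, inchesToScan: number): number {
  return Math.ceil(targetPixels / inchesToScan);
}

// To show a 6-inch photo about 900 pixels wide on screen:
console.log(scanDpiFor(900, 6)); // 150 dpi is plenty; 600 dpi pixels would mostly be discarded
```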
How to do that?
Continued