.: Monitors :.

Resolution

Imagine lying down in the grass with your nose pressed deep into the thatch. Your field of vision would not be very large, and all you would see are a few big blades of grass, some grains of dirt, and maybe an ant or two. This is a 14-inch 640 x 480 monitor. Now, get up on your hands and knees, and your field of vision will improve considerably: you'll see a lot more grass. This is a 15-inch 800 x 600 monitor. For a 1280 x 1024 perspective (on a 19-inch monitor), stand up and look at the ground. Some monitors can handle higher resolutions such as 1600 x 1200 or even 1920 x 1440, somewhat akin to a view from up in a tree.

Monitors are measured in inches, diagonally from corner to corner of the screen. However, there can be a big difference between that measurement and the actual viewable area. A 14-inch monitor has only a 13.2-inch viewable diagonal, a 15-inch sees only 13.8 inches, and a 20-inch will give you 18.8 inches (about 86% more viewable area than a 15-inch screen).
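
If you want to check the math yourself: for screens of the same shape, area scales with the square of the diagonal. A quick sketch in Python, using the viewable diagonals quoted above:

    # Compare viewable screen areas from diagonal measurements.
    # For screens with the same aspect ratio, area scales with
    # the square of the diagonal.
    def area_ratio(diag_a, diag_b):
        """Return how much more area diag_b has than diag_a, as a percent."""
        return ((diag_b / diag_a) ** 2 - 1) * 100

    # Viewable diagonals quoted above: 15-inch -> 13.8", 20-inch -> 18.8"
    print(round(area_ratio(13.8, 18.8), 1))  # -> 85.6 (percent more area)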

A computer monitor's image is made up of pixels (short for "picture element"). Monitor resolution is measured in pixels, width by height. A 640 x 480 resolution means that the screen is 640 pixels wide by 480 pixels tall, an aspect ratio of 4:3. With one exception (1280 x 1024 uses a ratio of 5:4), all of the common resolutions share that 4:3 shape.
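
You can verify the aspect ratios with a few lines of Python, reducing each resolution by its greatest common divisor:

    from math import gcd

    def aspect(width, height):
        """Reduce a resolution to its simplest aspect ratio."""
        d = gcd(width, height)
        return f"{width // d}:{height // d}"

    print(aspect(640, 480))    # 4:3
    print(aspect(1024, 768))   # 4:3
    print(aspect(1280, 1024))  # 5:4  (the one exception)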

From The PC Guide, by Charles M. Kozierok:

A pixel is the smallest element of a video image, but not the smallest element of a monitor's screen. Since each pixel must be made up of three separate colors, there are smaller red, green, and blue dots on the screen that make up the image. The term dot is used to refer to these small elements that make up the displayed image on the screen.

In order to use different resolutions on a monitor, the monitor must be able to support automatic changing of resolution modes. Originally, monitors were fixed at a particular resolution, but most monitors today are capable of changing their displayed resolution under software control. This allows for higher or lower resolution depending on the needs of the application.

A higher resolution display shows more on the screen at one time, and the maximum resolution that a monitor can display is limited by the size of the monitor and the characteristics of the CRT (cathode-ray tube). In addition, the monitor must have sufficient input bandwidth to allow for refresh of the screen, which becomes more difficult at higher resolutions because there is so much more information being sent to the monitor.
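
To put that bandwidth point in rough numbers: the pixel rate a monitor must handle grows with resolution times refresh rate. A minimal Python sketch (real CRT timings add blanking overhead on top of these figures, so actual bandwidth requirements run higher):

    # Rough pixel rate a video signal must carry: width x height x refresh.
    # (Real CRT timings add blanking intervals on top of this.)
    def pixel_rate_mhz(width, height, refresh_hz):
        return width * height * refresh_hz / 1e6

    print(pixel_rate_mhz(640, 480, 60))    # ~18.4 million pixels/second
    print(pixel_rate_mhz(1600, 1200, 85))  # ~163.2 million pixels/second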

You can see from the chart below how screen size and resolution are linked. Compare a 15-inch monitor and a 21-inch monitor, both set to 800 x 600 pixels: they show the same number of pixels, but the 15-inch packs them into a smaller area, so everything on it appears smaller and sharper. Displaying a higher resolution requires finer dots, and when a small monitor is set to a high resolution, the images become much too small to read. A 14-inch monitor set to 640 x 480 is very readable, while a 21-inch needs at least 1024 x 768. Here are some recommended resolutions for the different screen sizes (a quick pixels-per-inch calculation after the chart shows the reasoning):

  Resolution    14"         15"         17"         19"         21"
  640 x 480     BEST        GOOD        TOO BIG     HUGE        TERRIBLE
  800 x 600     GOOD        BEST        GOOD        TOO BIG     HUGE
  1024 x 768    TOO SMALL   GOOD        BEST        GOOD        STILL GOOD
  1280 x 1024   TINY        TOO SMALL   GOOD        BEST        GOOD
  1600 x 1200   TERRIBLE    TINY        TOO SMALL   GOOD        BEST
TheScreamOnline is optimized for viewing at 1024 x 768 resolution.
As you can see by the chart above, it should look good on most monitors.

Be aware that there are many versions and interpretations of these settings.
This table is an average of various opinions.
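
The chart's verdicts boil down to pixel density. Here is a rough Python sketch (the ppi helper is invented for illustration, and it assumes the quoted viewable diagonal) that estimates pixels per inch from a resolution and a screen size:

    from math import hypot

    def ppi(width, height, viewable_diag_inches):
        """Pixels per inch along the diagonal of the viewable area."""
        return hypot(width, height) / viewable_diag_inches

    # 14" monitor (about 13.2" viewable) at two settings:
    print(round(ppi(640, 480, 13.2)))    # ~61 ppi: comfortable
    print(round(ppi(1600, 1200, 13.2)))  # ~152 ppi: unreadably small on a CRT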

Adjusting Resolution

On a PC with Windows, do the following:

1. Open the Control Panel (Start > Settings > Control Panel) and double-click the Display icon.
2. Select the "Settings" tab in the Display Properties Dialog Box.
3. Adjust the slider to 800 x 600, then click the Test button. A test bitmap will appear for five seconds, then you will be asked if everything looked OK. Click YES to confirm.

On a Mac, go to Control Panels > Monitors and you will see a list of settings. It couldn't be easier.

Color

From The PC Guide, by Charles M. Kozierok:

There are 4 standard color depths used by monitors: 4-bit (Standard VGA), 8-bit (256-Color Mode), 16-bit (High Color), and 24-bit (True Color). Each pixel of the screen image is displayed using a combination of three different color signals: red, green, and blue. The appearance of each pixel is controlled by the intensity of these three beams of light. When all are set to the highest level the result is white; when all are set to zero the pixel is black. The amount of information that is stored about a pixel determines its color depth, which controls how precisely the pixel's color can be specified. This is also sometimes called the bit depth, because the precision of color depth is specified in bits. The more bits that are used per pixel, the finer the color detail of the image. However, increased color depths also require significantly more memory for storage of the image, and also more data for the video card to process, which reduces the possible maximum refresh rate.
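
To see why color depth eats video memory, here is a rough Python calculation of the framebuffer size for one full screen at various depths:

    # Video memory needed to hold one full screen at a given color depth.
    def framebuffer_kb(width, height, bits_per_pixel):
        return width * height * bits_per_pixel / 8 / 1024

    print(round(framebuffer_kb(800, 600, 8)))    # ~469 KB  (256-color)
    print(round(framebuffer_kb(800, 600, 16)))   # ~938 KB  (high color)
    print(round(framebuffer_kb(1024, 768, 24)))  # ~2304 KB (true color)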

[Computers use a binary language of two numbers, "one" and "zero," signifying "on" and "off." Bit depth is the number of bits in each pixel. Color depth is the maximum number of colors in an image and is based on the bit depth of the image and of the displaying monitor. A black-and-white monitor uses 1-bit color depth (2 to the power of 1): black = light off, and white = light on. Each pixel has a bit depth of one and a color depth of two: one bit produces two possible colors. Color monitors use at least 2-bit color, or 2 to the 2nd power (2x2=4), meaning that 4 colors are available for each pixel. 4-bit color (2x2x2x2=16) gives each pixel a choice of 16 colors; the greater the bit depth, the more colors available. See the chart below for a comparison of bit depth and color resolution. —Ed.]

256-Color Mode: uses only 8 bits (2 bits for blue, 3 for green, 3 for red). Choosing between only 4 or 8 different values for each color would result in poor blocky color, so a different approach is taken instead: the use of a palette. A palette is created containing 256 different colors. Each one is defined using the standard 3-byte color definition that is used in true color: 256 possible intensities for each of red, blue, and green. Each pixel is allowed to choose one of the 256 colors in the palette, which can be considered a "color number" of sorts. So the full range of color can be used in each image, but each image can only use 256 of the available 16 million different colors. When each pixel is displayed, the video card looks up the real RGB values in the palette based on the "color number" the pixel is assigned.

The palette approach is an excellent compromise: it allows only 8 bits to be used to specify each color in an image, but allows the creator of the image to decide what the 256 colors in the image should be. Since virtually no images contain an even distribution of colors, this allows for more precision in an image by using more colors than would be possible by assigning each pixel a 2-bit value for blue and 3-bit values each for green and red. For example, an image of the sky with clouds would have many different shades of blue, white, and gray, and virtually no reds, greens, or yellows.
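
Here is a minimal Python sketch of the palette mechanism just described (the specific colors and the tiny image are invented for illustration):

    # Indexed (256-color) mode: each pixel stores a palette index,
    # and the palette maps that index to a full 24-bit RGB color.
    palette = [(0, 0, 0)] * 256          # 256 entries, 3 bytes each
    palette[0] = (135, 206, 235)         # sky blue
    palette[1] = (255, 255, 255)         # white
    palette[2] = (128, 128, 128)         # gray

    pixels = [0, 0, 1, 2, 0]             # one palette index per pixel
    rgb_out = [palette[i] for i in pixels]  # what the video card displays
    print(rgb_out[2])                    # (255, 255, 255)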

256-color is the standard for much of computing, mainly because the higher-precision color modes require more resources (especially video memory) and aren't supported by many PC's. Despite the ability to "hand pick" the 256 colors, this mode produces noticeably worse image quality than high color, and most people can tell the difference between high color and 256-color mode.

High color: 16-bit color—uses two bytes of information to store the intensity values for the three colors. This is done by breaking the 16 bits into 5 bits for blue, 5 bits for red, and 6 bits for green, giving 32 different intensities for blue, 32 for red, and 64 for green. This reduced color precision results in a slight loss of visible image quality, but it is actually very slight—most people cannot see the differences between true color and high color images unless they are looking for them. For this reason high color is often used instead of true color—it requires 33% (or 50% in some cases) less video memory, and it is also faster for the same reason.
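
A short Python sketch of the 16-bit packing just described (this assumes the common 5-6-5 layout with green in the middle bits; the exact bit ordering varied between video cards):

    # Pack 8-bit-per-channel RGB into a 16-bit "high color" value.
    def pack565(r, g, b):
        return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

    def unpack565(v):
        # Shift back up; the low bits are lost, which is the
        # "slight loss" of precision mentioned above.
        return ((v >> 11) << 3, ((v >> 5) & 0x3F) << 2, (v & 0x1F) << 3)

    print(unpack565(pack565(200, 100, 50)))  # (200, 100, 48): close, not exact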

True color: 24-bit color uses three bytes of information, one for each of the red, blue, and green signals that make up each pixel. Since a byte has 256 different values, each color can have 256 different intensities, yielding over 16 million different color possibilities and allowing for a very realistic representation of the color of images, with no compromises necessary and no restrictions on the number of colors an image can contain. In fact, 16 million colors is more than the human eye can discern; even so, true color is necessary for doing high-quality photo editing, graphic design, etc. [Some video cards have to use 32 bits of memory for each pixel when operating in true color, due to how they use the video memory.]
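
The 16-million figure is easy to verify, and packing works the same way as high color, just with a full byte per channel. A quick Python sketch:

    # True color: one byte per channel, so 256^3 possible colors.
    print(256 ** 3)  # 16,777,216

    # Pack three bytes into one 24-bit value (many cards padded this
    # to 32 bits per pixel for alignment, as noted above).
    def pack24(r, g, b):
        return (r << 16) | (g << 8) | b

    print(hex(pack24(255, 128, 0)))  # 0xff8000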

[TheScreamOnline is best viewed with 24-bit color, or millions of colors, though thousands of colors will suffice. On a Mac, go to Control Panels > Monitors (see Adjusting Resolution above) and set the color depth to thousands or millions of colors if your video card supports it. Lowering the resolution of your screen display may allow you to achieve a greater color depth. On a Windows computer, go to Start Menu > Settings > Control Panel > Display > Settings. —Ed.]

  BIT DEPTH   COLOR RESOLUTION     CALCULATION
  1-bit       2 colors             2^1  (2)
  2-bit       4 colors             2^2  (2x2)
  3-bit       8 colors             2^3  (2x2x2)
  4-bit       16 colors            2^4  (2x2x2x2)
  5-bit       32 colors            2^5  (2x2x2x2x2)
  6-bit       64 colors            2^6  (2x2x2x2x2x2)
  7-bit       128 colors           2^7  (2x2x2x2x2x2x2)
  8-bit       256 colors           2^8  (2x2x2x2x2x2x2x2)
  16-bit      65,536 colors        2^16
  24-bit      16,777,216 colors    2^24
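
The chart itself is one line of arithmetic per row; a small Python loop reproduces it:

    # Reproduce the chart above: colors available at each bit depth.
    for bits in [1, 2, 3, 4, 5, 6, 7, 8, 16, 24]:
        print(f"{bits}-bit: {2 ** bits:,} colors")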


Monitors vs. Browsers

Four variables tend to make the life of a web designer a living hell. Macintosh monitors display text at 72 dpi (dots per inch), while Windows renders that same text at 96 dpi. Translation: a PC monitor enlarges the type, sort of like reading a large-print novel. Differences in the two major browsers (Netscape and Internet Explorer) also add to the problem. Netscape is close to WYSIWYG (What You See Is What You Get): it doesn't significantly change how a webpage is meant to appear. Internet Explorer (or IE), on the other hand, enlarges text one to two point sizes. So, a page of text can appear four different ways, depending on the combination: Netscape on a Mac, IE on a Mac, Netscape on a PC, and IE on a PC. The difference between viewing a page on a Mac using the Netscape browser and that same page on a PC with IE is enormous.
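
The 72-vs-96 dpi difference is simple arithmetic: a point is 1/72 of an inch, so the pixel size of type scales with the display's assumed dpi. A quick Python check:

    # A point is 1/72 inch, so text size in pixels = points * dpi / 72.
    def points_to_pixels(points, dpi):
        return points * dpi / 72

    print(points_to_pixels(12, 72))  # 12.0 px on a classic Mac display
    print(points_to_pixels(12, 96))  # 16.0 px on a Windows display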

A lot of seemingly unnecessary time is spent by web designers trying to "dumb down" a site so that it looks acceptable in all formats. Many designers, however, say "To hell with those with cheap equipment," and create a site for users with high-end gear. If only the two platforms and browsers conformed to a standard, most of these woes wouldn't exist: not only could designers focus more on the quality of the design itself, but the product would be better, and the end-user (you!) would consistently see websites as they were meant to be viewed.

Caveat Emptor

Many of the low-price "deals" that come with PC packages include, in addition to the requisite monitor, CPU, and keyboard, a modem, scanner, Zip drive, printer, and CD-ROM drive. Be careful: you get what you pay for. Much of the time you will end up with a very inexpensive 8-bit 640 x 480 monitor that cannot be adjusted. While a good monitor (large size with a high resolution, bandwidth, and refresh rate, and a small dot pitch) will hold its value for some time, a CPU will be worth only a fraction of its original cost after about a year. If you just use a computer for basic word processing and email, then the cheap route is probably adequate. Beyond that, you will quickly run into the limitations of your purchase. If you are serious about creating, or at least viewing, high-quality images, and about seeing websites as they are meant to be seen, then it would be wise to invest in the appropriate equipment.

Many thanks to Charles M. Kozierok, author of The PC Guide (http://www.PCGuide.com).
