Picture sharpness can be defined as the amount of fine detail visible in an image. Images can be analog, such as a photograph or the picture on a standard TV screen, or digital, made up of pixels (small bits of information), as in HDTV (high-definition TV) or digital photographs. When a picture shows lots of fine detail, it is said to have good contrast. Contrast is an integral part of picture sharpness: it is the differentiating information within an image that helps you see the fine details.
Contrast, or contrast ratio as it is sometimes called, is the difference in brightness, or luminance, between two colors; one of these colors is always black, and the other is usually red, blue, or green. As stated above, contrast plays an enormous part in how the viewer perceives picture quality. When a picture has a high contrast ratio, the viewer can easily distinguish sharp reds, blues, and greens from any black areas.
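In practice, a display's contrast ratio is usually quoted as the luminance of its brightest level divided by that of its darkest, written as N:1. The sketch below illustrates that arithmetic; the luminance figures are made-up assumptions for the example, not measurements of any particular device.

```python
# Illustrative sketch: contrast ratio as peak-white luminance divided
# by black-level luminance. The example values (500 nits, 0.5 nits)
# are assumptions chosen for the demonstration.

def contrast_ratio(white_luminance_nits: float, black_luminance_nits: float) -> float:
    """Return the contrast ratio, conventionally written as N:1."""
    if black_luminance_nits <= 0:
        raise ValueError("black level must be a positive luminance")
    return white_luminance_nits / black_luminance_nits

# A display with a 500-nit peak white and a 0.5-nit black level:
ratio = contrast_ratio(500.0, 0.5)
print(f"{ratio:.0f}:1")  # -> 1000:1
```

Note that the darker the black level a screen can reach, the higher the ratio climbs, which is why deep blacks matter so much to perceived sharpness.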
However, it is important to note that setting the contrast ratio too high, for instance on a computer monitor or TV set, risks misadjustment even though it separates the reds, blues, and greens from the blacks. A misadjusted image is difficult to look at for long periods and will not be perceived by the viewer as high quality.
Misadjustment occurs when an image's contrast is set either too high or too low. When the contrast is too high, the input signal (the image on a TV set) cannot reproduce true black effectively; instead you are left with shades of gray, which ultimately causes a loss of picture sharpness.
When the contrast is set too low, the lighter colors such as blue, red, and green cannot be reproduced effectively and are swallowed up, or crushed. The image is poor quality and looks dark and gloomy.
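Both failure modes can be seen in a few lines of arithmetic. This is my own toy illustration, not anything from the article: it scales 8-bit pixel values (0-255) around mid-gray by a contrast factor and clips the result to the displayable range. Too high a factor pushes distinct bright tones to the same clipped value; too low a factor drags them down toward gray, dimming the picture and shrinking the differences between them.

```python
# Toy illustration (assumed 8-bit pixels, mid-gray pivot at 128):
# contrast scaling followed by clipping to the 0-255 range.

def apply_contrast(pixels, factor):
    out = []
    for p in pixels:
        v = round((p - 128) * factor + 128)
        out.append(max(0, min(255, v)))  # clip to the displayable range
    return out

highlights = [235, 245, 250]  # three distinct bright tones

# Contrast too high: all three tones clip to the same value,
# so the detail that told them apart is gone.
print(apply_contrast(highlights, 2.0))  # -> [255, 255, 255]

# Contrast too low: the bright tones sink toward gray and the
# differences between them shrink, making the image look dim.
print(apply_contrast(highlights, 0.2))  # -> [149, 151, 152]
```

The same crushing happens at the dark end of the scale, which is why a badly misadjusted set loses detail in both shadows and highlights.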
Picture Sharpness and Reproduction
With analog images, picture sharpness can be lost very quickly: for instance, when the picture is shown on a TV set that does not receive a full, clear signal, or when it is recorded onto magnetic media such as tape recorders or VCRs, picture sharpness drops immediately.
Digital images are much hardier. While bits can be lost during recording or storage, many computers and advanced video or audio devices can compensate for lost or corrupted bits of information. With most digital media, such as DVDs, you can copy the original data 100,000 times over and not usually see any diminished picture sharpness: the original data stays intact, or corrupted data is compensated for, so that to the naked eye the picture sharpness of the copy is not noticeably different from the original. Unfortunately, the same cannot be said of analog data. After 100,000 copies, the picture sharpness would most likely look grossly inferior.
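The difference between the two kinds of copying can be sketched in a short simulation. This is a made-up illustration of the principle, not a model of any real recorder: a digital copy duplicates the stored numbers exactly, while each analog copy adds a little noise, so errors pile up generation after generation.

```python
# Toy simulation of generational loss (illustrative assumptions:
# a six-pixel "image" and a small random error per analog copy).
import random

original = [10, 200, 10, 200, 10, 200]  # a tiny high-contrast "image"

def copy_digital(signal):
    return list(signal)  # the bits are reproduced exactly

def copy_analog(signal, rng):
    # each analog generation perturbs every sample slightly
    return [max(0, min(255, s + rng.randint(-3, 3))) for s in signal]

rng = random.Random(0)
digital = original
analog = original
for _ in range(100):  # one hundred copy generations
    digital = copy_digital(digital)
    analog = copy_analog(analog, rng)

print(digital == original)  # -> True: no generational loss
print(analog)               # drifted values: the noise has accumulated
```

After a hundred generations the digital copy is bit-for-bit identical, while the analog values have wandered away from the original, which is the article's point about 100,000 copies in miniature.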