High resolution means that an image is reproduced with a high level of detail. The term usually refers to digital images such as digital photographs, television pictures on either a standard CRT screen or a digital TV set, and computer monitor displays. High resolution also comes up when discussing digital cameras, scanners, photocopiers, and printers.
Resolution in Standard and High Definition TV Sets
One of the first things people think about when talking about high resolution is the picture quality of standard TV sets versus HDTV sets. Standard TVs have the lowest resolution by far. A typical standard (NTSC) TV set has 525 lines, of which about 480 are visible. In addition, it receives images in an interlaced pattern: each refresh draws only half of the lines (every other line), and these half-frames, called fields, are drawn sixty times per second. As a result, only about 30 complete images are built up each second.
High Definition TVs have much higher resolution, usually either 720p or 1080p (720 or 1,080 lines of vertical resolution). The letter "p" indicates that the image is progressive. Unlike an interlaced image, a progressive image is drawn in full with every refresh, so viewers receive a completely new frame each time the screen updates, commonly 30 or 60 times per second.
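The difference between interlaced and progressive scanning comes down to simple arithmetic, sketched below (the refresh rates are illustrative NTSC-style values, not a specification):

```python
def complete_frames_per_second(refreshes_per_second, interlaced):
    """Return how many full images reach the viewer each second.

    An interlaced display draws only half the lines (one 'field')
    per refresh, so two refreshes are needed per complete frame.
    A progressive display draws every line on every refresh.
    """
    return refreshes_per_second // 2 if interlaced else refreshes_per_second

# Standard-definition TV: 60 fields per second, interlaced.
print(complete_frames_per_second(60, interlaced=True))   # -> 30

# Progressive HDTV refreshing 60 times per second.
print(complete_frames_per_second(60, interlaced=False))  # -> 60
```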
Older CRT (Cathode Ray Tube) monitors are analog and offer resolution comparable to standard TV sets. Their resolution is measured in pixels, the dots that make up each row of the image. There are two common older analog monitor standards: VGA and SVGA. VGA has a resolution of 640 pixels wide by 480 pixels high, while an SVGA monitor can produce 800 pixels wide by 600 pixels high. It is important to note that most operating systems let users raise the resolution of a connected monitor; for instance, computer users running Windows XP can increase their resolution to 1024 by 768 or higher.
Digital computer monitors, such as LCD monitors, are a little trickier to state standards for, since they come in many sizes for which no single standard applies. However, for monitors with the standard 4:3 aspect ratio (the shape of a standard TV set), common resolutions are 800 by 600, 1024 by 768, 1152 by 864, and 1600 by 1200 (remember that width comes first, then height). Multiplying the two numbers gives the total number of pixels on screen, and the setting can vary depending on one's personal display settings in the computer's operating system.
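A short sketch of the arithmetic behind these figures: multiplying width by height gives the total pixel count, and dividing both numbers by their greatest common divisor recovers the aspect ratio.

```python
from math import gcd

def describe(width, height):
    """Return the total pixel count and reduced aspect ratio for a resolution."""
    d = gcd(width, height)
    return width * height, f"{width // d}:{height // d}"

print(describe(1024, 768))   # -> (786432, '4:3')
print(describe(1600, 1200))  # -> (1920000, '4:3')
print(describe(1280, 1024))  # -> (1310720, '5:4')
```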
For computer monitors with a 5:4 aspect ratio, slightly taller relative to their width than 4:3, expect an image resolution of either 1280 by 1024 or 1600 by 1280. True widescreen monitors use wider aspect ratios such as 16:10 or 16:9.
Digital Camera Resolutions
Digital camera resolution is usually measured in megapixels. One megapixel (MP) means the image contains one million pixels, one million tiny dots that can each be one of various colors. Today, digital cameras are available that record images of 1 megapixel (1MP) all the way up to 10 megapixels (10MP) or more. The more megapixels in an image, the higher its resolution. It is important to note that most cameras have a setting where the number of megapixels per picture can be chosen. An image with more megapixels requires more storage space, so many people choose a lower megapixel setting that still delivers decent resolution. In addition, many digital cameras can also take pictures in a variety of aspect ratios.
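A megapixel count follows directly from the image dimensions; the dimensions below are typical illustrative values, not tied to any particular camera model.

```python
def megapixels(width, height):
    """Convert image dimensions in pixels to megapixels (millions of pixels)."""
    return width * height / 1_000_000

# A 3648 x 2736 image is roughly a 10 MP photo.
print(round(megapixels(3648, 2736), 1))  # -> 10.0

# A 1280 x 800 image is roughly 1 MP.
print(round(megapixels(1280, 800), 1))   # -> 1.0
```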
Computer Peripherals (Scanners, Photocopiers, and Inkjet Printers)
Computer peripherals such as scanners, photocopiers, and printers usually have resolution measured in DPI (dots per inch); the higher the number, the higher the quality. Scanners commonly offer a resolution of 600 by 600 DPI, and some go up to 3200 by 6400 DPI or above. The same is true of photocopiers and inkjet printers.
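DPI ties a physical size to pixel dimensions: scanning at a given DPI multiplies each dimension in inches by the dots per inch. The page size below (a US Letter sheet) is just an example:

```python
def scan_dimensions(width_in, height_in, dpi):
    """Return the pixel dimensions of a scan at a given DPI."""
    return int(width_in * dpi), int(height_in * dpi)

# An 8.5 x 11 inch page scanned at 600 DPI:
print(scan_dimensions(8.5, 11, 600))  # -> (5100, 6600)
```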