Video Display Standards

The video standards were established in the early years of the PC, primarily by IBM. In recent years, IBM's fall from dominance has left the video industry without any clear leader to set standards. The Video Electronics Standards Association (VESA) was formed to define new standards and has had some success in creating widely-accepted ones.

Monochrome Display Adapter (MDA)

The first video cards used in the earliest machines conformed to the MDA standard, established by IBM as part of the original PC. MDA is a monochrome-only, text-only standard, allowing text display at 80x25 characters. Each character is made up of a matrix 9 dots wide by 14 dots high, yielding an effective resolution of 720x350 at a refresh rate of 50 Hz (of course, since the standard is text-only, these dots are not individually addressable).

Color Graphics Adapter (CGA)

The first mainstream video card to support color graphics on the PC was IBM's Color Graphics Adapter (CGA). The CGA supports several different modes; the highest-quality text mode is 80x25 characters in 16 colors. Graphics modes range from monochrome at 640x200 (which is worse than the Hercules card) to 16 colors at 160x200. The card refreshes at 60 Hz. The maximum resolution of CGA, 640x200, is actually significantly lower than MDA's. These dots are individually addressable in graphics modes, but in text mode each character is formed from an 8x8 matrix instead of the MDA's 9x14, resulting in much poorer text quality. CGA is obsolete, having been replaced by EGA.

Enhanced Graphics Adapter (EGA)

IBM's next standard after CGA was the Enhanced Graphics Adapter, or EGA. This standard offered higher resolutions and more colors than CGA, although the capabilities of EGA are still quite poor compared to modern devices. EGA allows graphical output of up to 16 colors (chosen from a palette of 64) at a screen resolution of 640x350, or 80x25 text in 16 colors, all at a refresh rate of 60 Hz.

Video Graphics Array (VGA)

The replacement for EGA was IBM's last widely-accepted standard: the Video Graphics Array, or VGA. VGA, its supersets, and its extensions today form the basis of virtually every video card used in PCs. Introduced in the IBM PS/2 model line, VGA was eventually cloned and copied by many other manufacturers. When IBM fell from dominance in the market, VGA continued on and was extended and adapted in many different ways.

True VGA supports 16 colors at 640x480 resolution, or 256 colors at 320x200 resolution (and not 256 colors at 640x480, even though many people think it does). VGA colors are chosen from a palette of 262,144 colors (not 16.7 million) because VGA uses 6 bits to specify each color component, instead of the 8 that is the standard today.

VGA (and VGA compatibility) is significant in one other way as well: its output signals are totally different from those used by the older standards. Older cards sent digital signals to the monitor, while VGA and its successors send analog signals. This change was necessary to allow for more color precision. Older monitors that work with EGA and earlier cards use so-called "TTL" (transistor-transistor logic) signaling and will not work with VGA. Some monitors produced in the late 80s actually have a toggle switch to allow the selection of either digital or analog inputs.
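If you are curious where the "effective resolution" figures for the text modes come from, they are just the character grid multiplied by the character cell size. The short Python sketch below works them out, using only the figures quoted above:

```python
# Effective text-mode resolution = (columns x cell width) by (rows x cell height).
standards = {
    # name: (columns, rows, cell_width, cell_height) -- figures quoted above
    "MDA": (80, 25, 9, 14),
    "CGA": (80, 25, 8, 8),
}

for name, (cols, rows, cw, ch) in standards.items():
    print(f"{name}: {cols * cw}x{rows * ch}")

# Prints:
# MDA: 720x350
# CGA: 640x200
```

The same arithmetic also makes CGA's poorer text quality concrete: the same 80x25 grid is drawn with far fewer dots per character.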
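A little arithmetic likewise shows why VGA's palette is 262,144 colors and why 256 colors at 640x480 is out of reach for a true VGA card. The Python sketch below assumes the standard VGA card's 256 KB of video memory, a figure not quoted above:

```python
# Palette size: 2 raised to (bits per color gun x 3 guns).
print(2 ** (6 * 3))   # VGA DAC, 6 bits/gun:  262144 colors
print(2 ** (8 * 3))   # modern, 8 bits/gun: 16777216 ("16.7 million")

# Framebuffer memory needed for a given mode, in bytes.
def frame_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

VGA_RAM = 256 * 1024  # assumed standard VGA memory (256 KB); not stated above

for w, h, bpp in [(320, 200, 8), (640, 480, 4), (640, 480, 8)]:
    need = frame_bytes(w, h, bpp)
    print(f"{w}x{h} at {2**bpp} colors: {need} bytes, fits: {need <= VGA_RAM}")

# 320x200 at 256 colors:  64000 bytes, fits: True
# 640x480 at  16 colors: 153600 bytes, fits: True
# 640x480 at 256 colors: 307200 bytes, fits: False
```

In other words, the 256-color mode that "many people think" VGA supports at 640x480 would need more memory than a standard VGA card has.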
Note that standard VGA does not include any hardware acceleration features: all the work of creating the displayed image is done by the system processor. Acceleration features are all extensions beyond standard VGA.

Super VGA (SVGA) and Other Standards Beyond VGA

After IBM faded from leading the PC world, many companies came into the market and created new cards with higher resolutions and greater color depths than standard VGA (but almost always backwards compatible with VGA). Most video cards (and monitors, for that matter) today advertise themselves as Super VGA (SVGA). What does it really mean when a card says it is SVGA? Unfortunately, it doesn't mean much of anything. SVGA refers collectively to any and all of a host of resolutions, color modes and poorly-accepted pseudo-standards that have been created to expand on the capabilities of VGA. Therefore, knowing that a card supports "Super VGA" really tells you nothing at all. In the current world of multiple video standards, you have to find out specifically what resolutions, color depths and refresh rates each card supports. You must also make sure that the monitor you are using supports the modes your video card produces; here too, "Super VGA compatible" on the monitor doesn't help you.

To make matters more confusing, another term is sometimes used: Ultra VGA, or UVGA. Like SVGA, this term really means nothing either. :^) Some people like to refer to VGA as 640x480 resolution, SVGA as 800x600, and UVGA as 1024x768. This is overly simplistic, however, and is not something you can rely upon.

IBM did create several new video standards after VGA that expanded on its capabilities. Compared to VGA, these have received very limited acceptance in the market, mainly because they were implemented on cards that used IBM's proprietary Micro Channel Architecture (which itself received no acceptance in the market). You may hear these acronyms bandied about from time to time: