VGA – Video Graphics Array

A graphics card (also known as a video card, video board, video display board, display adapter, video adapter, or graphics adapter) is a computer component that converts the logical representation of visual information into a signal a display can use as input. The display is most often a monitor, but LCD TVs, HDTVs, and projectors are increasingly common as the media center PC concept gains ground. The graphics card and the display communicate using one of a variety of display standards. Graphics hardware is both integrated into motherboards and sold as standalone expansion cards.

VGA is an analog interface: PC display adapters output analog signals, and VGA monitors accept them. All CRTs and most flat-panel monitors accept VGA signals, although flat panels may also include a DVI interface for display adapters that output digital signals.
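To make the analog signaling concrete, the classic 640×480 @ 60 Hz refresh rate falls out of the mode's standard timing figures: a 25.175 MHz pixel clock, an 800-pixel horizontal total, and a 525-line vertical total (visible area plus blanking and sync). A minimal sketch of the arithmetic in C:

```c
#include <stdio.h>

int main(void)
{
    /* Widely published timings for the original 640x480 @ 60 Hz VGA mode. */
    const double pixel_clock_hz = 25175000.0; /* 25.175 MHz dot clock        */
    const int    h_total = 800;               /* 640 visible + blanking/sync */
    const int    v_total = 525;               /* 480 visible + blanking/sync */

    double line_rate_hz  = pixel_clock_hz / h_total;  /* ~31.47 kHz */
    double frame_rate_hz = line_rate_hz / v_total;    /* ~59.94 Hz  */

    printf("horizontal scan rate: %.2f kHz\n", line_rate_hz / 1000.0);
    printf("vertical refresh:     %.2f Hz\n", frame_rate_hz);
    return 0;
}
```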

Generic Socket or Specific Resolution
"VGA" may refer generically to the 15-pin analog socket on a PC, in contrast to the digital DVI socket used by flat panels. Alternatively, it may refer only to the original VGA resolution of 640×480 with 16 colors. This base mode is still used when booting a PC in Safe Mode and may also be the maximum resolution of small screens on handheld devices.
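The memory cost of that base mode is easy to work out from the numbers above: 640×480 pixels at 16 colors means 4 bits per pixel. The sketch below runs the arithmetic; the note about four 1-bit-per-pixel planes reflects the VGA hardware's planar memory layout for this mode.

```c
#include <stdio.h>

int main(void)
{
    /* Original VGA graphics mode: 640x480 with 16 colors (4 bits/pixel). */
    const int width = 640, height = 480;
    const int bits_per_pixel = 4;            /* 16 colors = 2^4 */

    long total_bits  = (long)width * height * bits_per_pixel;
    long total_bytes = total_bits / 8;       /* 153,600 bytes = 150 KB */

    /* The VGA stores this as four 1-bpp planes of 38,400 bytes each,
       so every plane fits comfortably in a 64 KB memory plane. */
    long plane_bytes = (long)width * height / 8;

    printf("framebuffer: %ld bytes (%.1f KB)\n",
           total_bytes, total_bytes / 1024.0);
    printf("per plane:   %ld bytes across %d planes\n",
           plane_bytes, bits_per_pixel);
    return 0;
}
```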

VGA History
VGA was introduced with IBM's PS/2 line in 1987 and quickly made the earlier CGA and EGA display interfaces obsolete; those interfaces were actually digital, but offered lower resolution. In short order, non-IBM vendors boosted the base resolution and color depth to so-called "Super VGA," and over the years resolution has been extended well beyond the original specification.
