
Key Points
- DVI and VGA are two different types of cables for connecting a monitor to a video source. Both are now considered classic connection methods that have largely been superseded.
- VGA is an analog-only connection, while DVI is a more flexible interface that can carry both digital and analog signals.
- Computers, servers, and hardware that produce video signals require a method to transmit video information for viewing, editing, or transferring.
Computers, servers, and other hardware that produce video signals require a method to transmit video information for viewing, editing, or transferring. A common consumer example is the desktop computer. The tower needs to be connected to a set of peripherals to be of any use.
It needs to be connected to power and to user interface devices. The most important of these is a monitor; after all, what good is a computer if you can’t see what it’s doing? The cable that connects your monitor to your PC is the video display connection cable.
Both DVI and VGA are types of video display connection cables. Most modern desktops and graphics cards have one or both of these connections available, alongside more modern options like HDMI and DisplayPort. All of these options use a system of pin connections to attach the cable to the monitor or the video source.
DVI, or Digital Visual Interface, is a video interface system that transmits digital or analog video to peripheral equipment. It is commonly used to connect desktop PCs to external displays.
VGA, or Video Graphics Array, is a video controller created by IBM in 1987 to simplify video display connections. VGA uses only analog signals, and while it originally supported only 640 x 480 resolution, it has since been improved to support modern standard resolutions. Many desktop computers, particularly older ones, come with a built-in video port that uses a VGA connection.
DVI vs VGA: VGA Explained

VGA stands for Video Graphics Array. It was the successor to an earlier video connection method called EGA. The connection type was introduced by IBM in 1987 as a built-in component of the IBM PS/2. At the time, there was no true standard for video connections, and manufacturers often shipped their own discrete cards that could be installed into existing towers. This is because, before VGA, computers used graphics “adapters”.
The “array” in the name reflects the fact that VGA consolidated the work of several separate adapter chips into a single chip, at a slightly lower energy cost. This allowed VGA to support all of the graphics modes of the earlier MDA, CGA, and EGA cards, as well as slightly higher-quality modes of its own. Once IBM released the PS/2 Display Adapter, a VGA card that could be added to existing machines, VGA was quickly adopted by the PC community.
Over time, display technology evolved. By 1999, there was a pressing need for a more capable video controller. The original VGA standard only worked with resolutions up to 640 x 480, while higher-end monitors of the late 90s were beginning to reach 1600 x 1200. VGA couldn’t keep up with the new quality, which led the Digital Display Working Group to develop the Digital Visual Interface.
DVI vs VGA: DVI Explained
DVI was built around a new digital signal while still supporting analog signals, which allowed it to bridge older technology into the new era. With a DVI connection, computer users could now run higher resolutions such as 1600 x 1200 and Full HD (1920 x 1080).
Modern versions of both DVI and VGA support Full HD resolutions. DVI is still the better option, with more connection features and support for higher resolutions and frame rates. However, that doesn’t always mean it is necessary. Both VGA and DVI connections are still commonly found wherever computers are in use.
DVI vs VGA: Side by Side Comparison
| | VGA | DVI |
|---|---|---|
| What It Is | Video Display Controller | Video Display Interface |
| Primary Use | Transmit video from a source to a display device | Transmit video from a source to a display device |
| Name | Video Graphics Array | Digital Visual Interface |
| Conceived | 1987 | 1999 |
| Initial Release | 1987 | 1999 |
| Influential Developers | IBM | Digital Display Working Group |
| Technologies Influenced | XGA, DVI, HDMI | HDMI, DisplayPort |
DVI vs VGA: The Key Differences Explained
Modern desktop computers and laptops often have a few different methods for connecting to a monitor. DVI and VGA are what can now be considered classic connection methods. They are easy to tell apart, as the connectors are physically different sizes and color-coded.
The male connector of a VGA cable has fifteen pins and a blue casing. DVI connectors, however, are wider and have either twenty-four or twenty-nine pins, depending on the type of DVI connection or cable. The casing on the male end of a DVI cable is typically grey or white.
VGA is an analog-only connection, while DVI is a flexible technology that offers not only a blend of digital and analog signal capability but also features like dual-link. The interface features of DVI allow it to be used for more complex setups such as multiple displays, shared monitors, and controls. DVI also supports higher resolutions. With an HDMI-to-DVI cable and supporting hardware, a DVI port can even pass audio along with video, whereas VGA can only send a video signal.
Overall, DVI is the better video connection option. It was designed to be more capable than VGA, and it supports higher refresh rates, higher resolutions, and more devices.
DVI vs VGA: Important Facts

There are a number of important facts that you should know about DVI and VGA. We’ll go into them in more detail below.
1. VGA Has Been Improved
While the legacy version of VGA could only manage 480p (640 x 480) resolution, modern VGA is much more capable. Over the years, businesses and individual users have shown computer developers and manufacturers that they are slow to adopt new technology.
This meant that many computer users held on to older connections to keep using monitors and peripherals made for their older hardware, often because of the rarity and cost of that equipment. Sometimes it’s simply because they never wanted to purchase a new monitor.
Whatever the case, the easier solution was to upgrade the VGA system itself. Modern VGA can support resolutions up to 2048 x 1536, and in some cases software can leverage a VGA connection to support even higher resolutions.
2. DVI Ports Have Three Variants
- DVI-D: The “D” stands for digital format connector. These come in both single-link and dual-link. DVI-D is the most common connector for LCD displays. Large screens with higher resolutions make better use of the stronger signal from a dual-link DVI-D.
- DVI-A: “A” is for analog format connector. This type was meant to keep computers compatible with CRT displays while technology evolved. It is still used for backward compatibility.
- DVI-I: DVI-I is the integrated format connector. It can work with either digital or analog equipment, making it the most compatible cable for staying modern while still connecting to legacy CRTs. While it can’t push a full digital output into an analog device, it can still display on it. DVI-I is also available in both single-link and dual-link.
- Single-link: Single-link is the first iteration of DVI. It is fine for most 1080p connections and standard office use. However, it struggles with higher resolutions and large monitors.
- Dual-link: Dual-link cables carry a second data link, giving them twice the bandwidth. This makes them far more capable with resolutions beyond 1080p, such as 2560 x 1600, and with cable splitting; a rough bandwidth comparison follows this list.
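To put rough numbers on the single-link vs. dual-link difference, here is a minimal Python sketch. The 165 MHz single-link pixel-clock ceiling comes from the DVI specification; the 20% blanking overhead and the sample display modes are assumptions chosen purely for illustration, so treat the results as ballpark estimates rather than exact timings.

```python
# Rough sketch: which DVI link type a given display mode needs.
# The 165 MHz single-link pixel-clock limit is from the DVI spec;
# the 20% blanking overhead is an assumption used here for illustration.
SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ
BLANKING_OVERHEAD = 1.2  # extra time spent outside the visible picture area

def required_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock (MHz) needed to drive a mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1_000_000

for width, height, refresh_hz in [(1920, 1080, 60), (2560, 1600, 60), (3840, 2160, 60)]:
    clock = required_pixel_clock_mhz(width, height, refresh_hz)
    if clock <= SINGLE_LINK_MHZ:
        link = "fits single-link DVI"
    elif clock <= DUAL_LINK_MHZ:
        link = "needs dual-link DVI"
    else:
        link = "exceeds even dual-link DVI"
    print(f"{width}x{height} @ {refresh_hz} Hz ~ {clock:.0f} MHz -> {link}")
```

Running it, 1920 x 1080 at 60 Hz fits comfortably in a single link, 2560 x 1600 at 60 Hz needs a dual link, and 3840 x 2160 at 60 Hz exceeds even dual-link DVI, which is why 4K displays generally rely on HDMI or DisplayPort instead.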
3. DVI Refresh Rates
Gamers, video producers, and editors are likely to seek the highest-fidelity video they can get, which means they’ll want the highest refresh rate possible. Resolution is important, but it only describes image clarity in terms of pixel count, and a higher resolution generally means a sharper picture. The refresh rate, however, is just as important.
In some cases, gamers or video producers may prefer a higher refresh rate, like 120 to 144 Hz at a standard HD resolution (1920 x 1080), over a 4K resolution (3840 x 2160) at a 60 Hz refresh rate. Refresh rate denotes how quickly the screen ‘redraws’ the image. It’s easier to imagine as fps, or frames per second, even though the two are slightly different.
Running a monitor at a 60 Hz refresh rate means the image is redrawn sixty times per second. Running at a 120 Hz refresh rate doubles how many times the screen is drawn, which allows for more fluid animation and on-screen changes.
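As a quick illustration of that arithmetic, this small Python snippet converts a refresh rate into the time between redraws (the rates listed are just example values):

```python
# Time between screen redraws for a few common refresh rates:
# 1000 ms divided by the refresh rate gives the per-frame time budget.
for refresh_hz in (60, 120, 144):
    frame_time_ms = 1000 / refresh_hz
    print(f"{refresh_hz} Hz -> one redraw every {frame_time_ms:.2f} ms")
```

At 60 Hz each frame stays on screen for roughly 16.7 ms; at 120 Hz that drops to about 8.3 ms, which is where the smoother motion comes from.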
Gamers get an advantage from this by being able to see changes in multiplayer games before their competitors. In layman’s terms, a player will see an opponent come around a corner or start to move slightly sooner than competitors with lower refresh rates will. Modern gaming desktops will also have HDMI 2.0 and DisplayPort connections; digital connections like DisplayPort were designed to offer even higher resolutions at higher refresh rates.
For video producers, higher refresh rates allow greater control over video production by making frame rate choices more apparent. The digital age of recording and editing has allowed producers to make more varied decisions about film production, such as straying from the standard 30 fps used for television, without piling up massive costs.