Analog vs Digital


In the earliest days of computers, the monitors that displayed the images we all became fascinated with were driven by analog signals. Today, virtually all monitors are digital. But what is the difference between the two? Many desktop computers have both outputs, so which is better to use?

The technical definitions of analog and digital are just that – technical. If I spouted them out, you’d either not understand them or just decide you don’t care. The bottom line is this: a digital picture on a monitor is generally much sharper than an analog one, and the pixels that make up the image are “painted” onto the screen more rapidly.

A non-computer example of analog vs. digital is a clock: the clock with hands that sweep around the face is analog, and the clock with a simple numerical readout is digital.

analog and digital clock

For analog, computers use a VGA (Video Graphics Array) port and cable. Here’s what that looks like:

VGA connector and cable

Whenever you use the VGA connector, you’ll be running your monitor in analog mode.

For digital, a computer can have a DVI (Digital Visual Interface), HDMI (High-Definition Multimedia Interface) or DisplayPort connection. (The differences between these three will be explained in a future blog post.)

picture of three cables - DVI, HDMI and DisplayPort

If you’re buying a new computer, you can pretty much guarantee that there will be a digital output – HDMI, DisplayPort or both. If you’re using an old monitor, your only choice might be analog – VGA. Most new computers still have a VGA output, but not all of them do; if yours doesn’t, you might need a digital-to-analog adapter to connect an old VGA-only monitor.

Needless to say, the screens used on smartphones, smartwatches, tablets and notebook (laptop) computers are all digital.