Finding a use case for DVI over VGA the hard way
I finally found a good reason for DVI, as opposed to a plain old VGA connector. I don't mean the technoweenie explanation with arbitrary numbers and other gunk flying around. I'm talking about the kind of explanation which can reach normal people -- the ones who haven't been forced to learn all of this geek stuff. It's a difference you can see if you just pay attention.
I've been using a pair of flat panel monitors for the past five years. They've been hooked to my Linux box at home for dev work. Earlier this week, one of them decided to die. Rather than trying to balance across two different small screens with different visual characteristics, I decided to buy a larger single replacement.
I found a nice looking monitor which had the usual HDMI and DVI inputs, but it also had a VGA connector. I was happy, since my dev box isn't fancy and just has a pair of VGA outputs. Now I wouldn't need another video card! Well, that was the plan, at least. After hooking it up, reality played out a different way. It had these weird periodic spots where a chunk would be focused perfectly, and then you'd move horizontally and it would start smearing, like this huge macro shot of the word "curl" seen here.
No amount of fiddling with this monitor's "clock" adjustment would fix it completely. I managed to get it to not seem inconsistent from one side to the other, but that didn't really work, either. A few hours later I noticed things seemed odd, and indeed, upon closer inspection, the whole thing was now slightly blurry. I had eliminated the focus inconsistencies by smearing it all out with the VGA clock adjustment. Brilliant.
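For the curious, the periodic sharp-then-blurry bands have a simple explanation: the monitor has to sample the analog VGA signal once per pixel, and if its sample clock is even slightly off from the card's pixel clock, the sampling point drifts across each scanline — landing on pixel centers in some regions (sharp) and between pixels in others (smeared). Here's a toy sketch of that drift; the 1920-pixel line and the 0.3% clock error are made-up numbers, not real VGA timings:

```python
# Toy model of why a VGA "clock" mismatch smears pixels.
# The monitor samples the analog signal once per pixel; if its sample
# clock differs from the source's pixel clock, the sampling point
# drifts along the line instead of staying centered on each pixel.

PIXELS_PER_LINE = 1920   # assumed pixels per scanline (hypothetical mode)
SAMPLE_ERROR = 1.003     # monitor clock 0.3% too fast (hypothetical)

def sample_offsets(n_samples, error):
    """Distance from each sample point to the nearest pixel center,
    in fractions of a pixel (0.0 = dead on, 0.5 = worst case)."""
    offsets = []
    for i in range(n_samples):
        position = i * error        # where the monitor actually samples
        nearest = round(position)   # the pixel center it should hit
        offsets.append(abs(position - nearest))
    return offsets

offsets = sample_offsets(PIXELS_PER_LINE, SAMPLE_ERROR)

# The error starts at zero, grows to half a pixel, then wraps and
# realigns -- producing repeating sharp/blurry bands across the screen.
print(f"start of line: {offsets[0]:.2f}, worst spot: {max(offsets):.2f}")
```

With a 0.3% error the drift wraps around every few hundred pixels, which is exactly the kind of repeating focused/smeared pattern described above. The monitor's "clock" adjustment tries to match the two rates; the phase control tries to center the sample point. DVI sidesteps the whole problem by sending the pixel values digitally, so there's nothing to sample.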
I had to dive into the mess that is computer hardware. First, what do I have now? Next, do they make a DVI version of it? Oh, there's something called dual-link DVI. But that's not the same as having two ports, like my old card had. It's just more bandwidth in the same connector. Okay. Maybe I need that. Nope, actually, I don't. Only mega monitors like the ones Apple makes call for that, and I don't have one of those.
I went around and around with this stuff. After far too much forced learning it started making sense: the machine had this kind of AGP slot and would handle these voltages, and I needed a card with at least a clock speed of X, preferably one that advertised the exact resolution I needed. So many variables to solve for simultaneously. What a hassle.
Fortunately, this story does seem to have a happy ending. After dropping in a suitable replacement card, things got better. This second picture shows how it can render those skinny lines without slopping over into any other pixels. Any blurring is probably from my hand-held camera work in macro mode, since the screen itself looks beautiful now.
At some point, you just have to let that 15-pin connector go.