Okay, I got a new computer, and an AOC 2752Vh monitor that has VGA, DVI, and HDMI inputs.
I connected the new monitor via HDMI, and used a DVI -> VGA adapter to run the old 20" monitor.
Everything worked just swell... UNTIL...
I downloaded the software for a "game" I play, called "Second Life". For some reason, it causes the 27" monitor to display "Input Not Support" until the game software is killed; then everything returns to normal.
Test B: I disconnected both monitors and connected the 27" via a DVI cable. The game software now works fine. Odd, but apparently something in the software misbehaves over HDMI.
Okay, great. Now I have no way to plug in the 20" monitor... EXCEPT...
Besides the AMD Radeon R9 255 card that came pre-installed, there are also AMD Radeon R7 graphics on the motherboard. The motherboard has VGA and HDMI outputs. So I plugged the 20" monitor into the motherboard's VGA output, and it seems to work fine that way.
I'm waiting for a response from the software people as to why Second Life doesn't work over HDMI. In the meantime, I'm concerned about driving the two monitors from two different graphics adapters (the discrete R9 255 card and the integrated R7 graphics). I seem to recall that it's a bad idea, but I can't remember why I think that or where I heard it. On top of that, Lenovo installed a rubber cap on the motherboard's VGA output with the international symbol for "NO" on it. I took it off anyway. Does this make me a bad person?