Monitors Losing Signal
Posted 27 May 2017 - 09:09 AM
Any idea why my monitors randomly lose signal? My computer doesn't shut off or anything; it's just that both of my monitors will randomly cut out and lose signal, and the only way to get them back is to restart my computer. I'm guessing it has something to do with my GPU. It's an old 560 Ti, so it might just be slowly dying, which could be causing the issues. Anyone else have other input?
Do they both shut off at the same time? How are they connected, DVI?
Posted 30 May 2017 - 06:02 PM
Yeah, they both shut off at the same time. One is connected with an HDMI-to-DVI cable, and the other is older; it uses a VGA cable, but I connect it with a DVI adapter.
Posted 30 May 2017 - 06:10 PM
Make sure your video card is properly seated, and if you have spare power connectors you can try, swap them out.
Posted 30 May 2017 - 08:48 PM
Well, I narrowed it down to my GPU overheating and shutting off. I realized one of the fans wasn't working, so its temp was way too high.
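For anyone chasing similar symptoms, one way to confirm overheating is to log the GPU temperature while the machine runs. This is just a rough sketch assuming an NVIDIA card with `nvidia-smi` on the PATH; the `parse_temps` helper is something I'm making up here, not part of any official tool:

```python
# Rough sketch: read GPU temperature via nvidia-smi
# (assumes an NVIDIA card and that nvidia-smi is installed).
import subprocess

def parse_temps(csv_output):
    """Parse the output of:
    nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader
    which prints one integer (degrees C) per line, one line per GPU."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def read_temps():
    """Query the driver for current GPU temperatures (needs nvidia-smi)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temps(out)

# Example with canned output, like a single-GPU system might print:
sample = "87\n"
print(parse_temps(sample))
```

A card sitting near 90 C under light load is a red flag that cooling has failed, like the dead fan above.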
Now I have a new issue, though. I got an EVGA GeForce GTX 1050 Ti, and as far as my research shows it's compatible with my mobo, which is an ASRock B85M Pro4/ASM.
When I try to install the drivers, though, I get an error saying "Could not continue, could not find compatible hardware."
But the GPU is in and running, fans are spinning, etc. There are no power cables or anything to plug in; you simply drop it into the PCIe slot.
Anyone have an idea?
Posted 31 May 2017 - 08:01 PM
Well, I fixed that problem, but now I have ANOTHER PROBLEM! I CANNOT WIN!
On the 1050 Ti there is a DVI-D port, an HDMI port, and a DisplayPort. My second monitor is an older VGA-only monitor, and the adapter I was using on the old GPU was a DVI-A one or something, which doesn't fit the DVI-D port. The other kicker is that it supposedly has to be an active converter (not 100% sure).
How do I know what's an active converter and what's not? Like, is a standard $20 VGA to DVI-D adapter going to work?
EDIT: Never mind, figured that one out too.