Bitcoin Forum

Bitcoin => Mining support => Topic started by: nemo on July 02, 2011, 06:45:17 AM



Title: Why do I get an out of range error when I use DVI but VGA is fine?
Post by: nemo on July 02, 2011, 06:45:17 AM
HD 5970 with dual DVI and one mini DVI port. If I use a cheap DVI-to-VGA adaptor, every mode displays fine. If I take the adaptor off and go straight through DVI, I can't see anything other than an "Out of range" red box in the center of a black screen.

I've tried multiple operating systems and drivers; nothing seems to work. Has anybody ever seen anything like this before? My monitor accepts both DVI and VGA. It's a 22-inch Westinghouse L2610NW.


Title: Re: Why do I get an out of range error when I use DVI but VGA is fine?
Post by: 3phase on July 02, 2011, 07:23:05 AM
The only thing I can think of is a defective DVI cable, where maybe one of the pins doesn't make good contact.

Maybe try another cable?


Title: Re: Why do I get an out of range error when I use DVI but VGA is fine?
Post by: Oldminer on July 02, 2011, 07:28:04 AM
Most monitors will auto-detect, but just make sure the input isn't manually set to analog instead of digital.


Title: Re: Why do I get an out of range error when I use DVI but VGA is fine?
Post by: nemo on July 02, 2011, 08:04:40 AM
Got it: my refresh rate has to be 60 Hz, and Windows keeps trying to set it to 62 Hz. Hopefully this thread helps anyone else who runs into the same issue.
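
For anyone hitting the same thing on the Linux side (the OP mentioned trying multiple operating systems), here is a minimal sketch of pinning the DVI output to 60 Hz with xrandr. The output name "DVI-0" and the 1680x1050 mode are assumptions for illustration only; run `xrandr -q` first and substitute whatever your own card and monitor report.

Code:
# Minimal sketch: force a DVI output to a 60 Hz mode via xrandr on Linux.
# "DVI-0" and "1680x1050" are assumed values -- check `xrandr -q` for yours.
import subprocess

OUTPUT = "DVI-0"     # assumed output name reported by xrandr
MODE = "1680x1050"   # assumed native resolution of the panel
RATE = "60"          # the refresh rate the monitor actually accepts

subprocess.run(
    ["xrandr", "--output", OUTPUT, "--mode", MODE, "--rate", RATE],
    check=True,  # raise an error if xrandr rejects the mode/rate combination
)

On Windows the equivalent fix is just picking 60 Hz under the monitor's advanced display settings so it stops defaulting to 62 Hz.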