Bitcoin Forum
Author Topic: Why do I get an out of range error when I use DVI but VGA is fine?  (Read 2470 times)
nemo (OP)
Sr. Member
July 02, 2011, 06:45:17 AM  #1
I have an HD 5970 with dual DVI and one mini DVI port. If I use a cheap DVI-to-VGA adapter, every resolution displays fine. If I take off the adapter and go straight through DVI, I can't see anything other than an "Out of range" red box in the center of a black screen.

I've tried multiple operating systems and drivers; nothing seems to work. Has anybody ever seen anything like this before? My monitor accepts both DVI and VGA; it's a 22-inch Westinghouse L2610NW.
3phase
Sr. Member
July 02, 2011, 07:23:05 AM  #2

The only thing I can think of is a defective DVI cable where one of the pins doesn't make good contact.

Maybe try another cable?

Oldminer
Legendary
July 02, 2011, 07:28:04 AM  #3

Most monitors will auto-detect, but just make sure the input isn't set manually to analog instead of digital.

nemo (OP)
Sr. Member
July 02, 2011, 08:04:40 AM  #4

Got it: my refresh rate has to be 60 Hz, and Windows keeps trying to set it to 62 Hz. Hopefully this thread helps anyone else who runs into this issue.
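As a rough back-of-the-envelope illustration of why a couple of extra hertz can push a DVI signal out of a monitor's supported range: the required pixel clock scales linearly with refresh rate, and a mode just inside the monitor's limit at 60 Hz can land just outside it at 62 Hz. The resolution (1680x1050, a common 22-inch panel resolution of that era; the L2610NW's actual native mode may differ) and the ~25% blanking overhead are assumptions for illustration, not an exact CVT timing calculation.

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    """Rough pixel-clock estimate in MHz.

    Blanking intervals add pixels beyond the visible area; ~25% extra is a
    ballpark figure, not a real CVT/GTF timing computation.
    """
    return width * height * refresh_hz * blanking_overhead / 1e6

# Assumed resolution for illustration only.
clock_60 = approx_pixel_clock_mhz(1680, 1050, 60)
clock_62 = approx_pixel_clock_mhz(1680, 1050, 62)
print(f"60 Hz needs ~{clock_60:.0f} MHz, 62 Hz needs ~{clock_62:.0f} MHz")
```

The point is only the direction of the effect: the 62 Hz mode always demands a higher pixel clock (and higher horizontal frequency) than the 60 Hz one, so a monitor that accepts the signal at 60 Hz can legitimately reject it at 62 Hz.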