Bitcoin Forum
Author Topic: WHy do people only buy ATIcards when NVIDA is somuch better?  (Read 11237 times)
astana (OP)
Member
**
Offline Offline

Activity: 98
Merit: 10


View Profile
October 10, 2011, 02:36:35 AM
 #1

I don't get this, is it price? I thought people would be investing in NVIDIA 580GTX's rather than ATI.

What gives?
pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 10, 2011, 02:43:02 AM
 #2

I don't get this, is it price? I thought people would be investing in NVIDIA 580GTX's rather than ATI.

What gives?

ATI is better for price/performance. Especially the lower end cards.

ATI = Bitcoin mining & gaming.
Nvidia = F@H & Gaming.

Speaking for myself, if I had the money, it would be hands down nvidia if I were buying a gfx card strictly for gaming & maybe some F@H when not gaming.
astana (OP)
Member
**
Offline Offline

Activity: 98
Merit: 10


View Profile
October 10, 2011, 02:46:39 AM
 #3

But wouldn't buying 2 3GB 580 GTX's pay off sooner than a ATI?
bulanula
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 10, 2011, 02:48:38 AM
 #4

But wouldn't buying 2 3GB 580 GTX's pay off sooner than a ATI?

Nope. Nvidia sucks balls for mining son. Around 4 times slower than ATI shit.
nmat
Hero Member
*****
Offline Offline

Activity: 602
Merit: 501


View Profile
October 10, 2011, 02:55:30 AM
 #5

But wouldn't buying 2 3GB 580 GTX's pay off sooner than a ATI?

Just compare the values yourself: https://en.bitcoin.it/wiki/Mining_hardware_comparison

Being better depends on what you are doing. If we were using bcrypt instead of SHA the CPU would be better than any GPU.
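(For reference, mining boils down to double SHA-256 over an 80-byte block header, which is why raw 32-bit integer throughput decides the race. A minimal Python sketch, using made-up header bytes and an artificially easy target rather than real chain data:)

Code:
import hashlib, os

def sha256d(data: bytes) -> bytes:
    # Bitcoin's proof-of-work hash: SHA-256 applied twice.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def check_nonce(header76: bytes, nonce: int, target: int) -> bool:
    # Append a 4-byte little-endian nonce to the 76-byte header prefix and
    # test whether the double hash, read as a 256-bit integer, is <= target.
    header = header76 + nonce.to_bytes(4, "little")
    return int.from_bytes(sha256d(header), "little") <= target

# Made-up example values, just to exercise the functions.
fake_header76 = os.urandom(76)   # hypothetical header prefix (version..time+bits)
easy_target = 1 << 252           # absurdly easy target so the demo finishes fast
for nonce in range(1_000_000):
    if check_nonce(fake_header76, nonce, easy_target):
        print("share found at nonce", nonce)
        break

GPU miners run millions of these nonce checks in parallel, which is why integer-heavy shader throughput matters far more here than anything a gaming benchmark measures.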
NetTecture
Full Member
***
Offline Offline

Activity: 140
Merit: 100


View Profile
October 10, 2011, 04:19:08 AM
 #6

I don't get this, is it price? I thought people would be investing in NVIDIA 580GTX's rather than ATI.

What gives?

The fact that NVidia is NOT "so much better" but "totally crap" for mining operations. So much worse that buying an Nvidia for mining is stupid. Check the numbers.
hongus
Full Member
***
Offline Offline

Activity: 736
Merit: 100


Adoption Blockchain e-Commerce to World


View Profile
October 10, 2011, 05:46:45 AM
 #7

ATI also has eyefinity. Some cards can hook up to like 5 monitors

Tmoney
Newbie
*
Offline Offline

Activity: 40
Merit: 0


View Profile
October 10, 2011, 06:08:50 AM
 #8

ATI also has eyefinity. Some cards can hook up to like 5 monitors

SIX SON!!!!!!
Tim the Magician
Member
**
Offline Offline

Activity: 96
Merit: 10


View Profile
October 10, 2011, 07:20:57 AM
 #9

One of my 5970s has 6x Eyefinity

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150500

GTX 580s are going for around $500 and get about 140 MHash/s. The last 10-minute average on a 5970, which I bought new for $400, was 815 MHash/s.

I always was an Nvidia fan though.  Until I started mining I never even owned an ATI card.
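(A quick dollars-per-megahash comparison using the prices and hashrates quoted above, purely as an illustration:)

Code:
# Price per MH/s using the figures quoted in the post above.
cards = {
    "GTX 580": {"price_usd": 500, "mhash_s": 140},
    "HD 5970": {"price_usd": 400, "mhash_s": 815},
}
for name, c in cards.items():
    print(f"{name}: ${c['price_usd'] / c['mhash_s']:.2f} per MH/s")
# GTX 580: $3.57 per MH/s
# HD 5970: $0.49 per MH/s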
Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 10, 2011, 08:25:20 AM
 #10

Quote
WHy do people only buy ATIcards when NVIDA is somuch better?
Because nvidia is not "somuch better" but is CRAP for mining.

Quote
But wouldn't buying 2 3GB 580 GTX's pay off sooner than a ATI?
No. Nvidia will never pay off. Their mining is so FAIL that you spend more in electricity than you gain from mining.


Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 10, 2011, 08:26:06 AM
 #11

And btw, i had an ATI even before discovering bitcoin, so uh, nvidia is not better even outside of bitcoin

TyGrr
Member
**
Offline Offline

Activity: 84
Merit: 10



View Profile
October 10, 2011, 08:37:42 AM
 #12

https://en.bitcoin.it/wiki/Mining_hardware_comparison
bulanula
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 10, 2011, 08:42:53 AM
 #13

And btw, i had an ATI even before discovering bitcoin, so uh, nvidia is not better even outside of bitcoin

It blatantly is. Nvidia is best for driver support ( ATI has shittiest drivers ever - 100% CPU bug STILL present now ), gaming performance and general stability and usage as normal GPU. Only get ATI if you want to compute. Avoid otherwise.
Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 10, 2011, 10:09:17 AM
 #14

And btw, i had an ATI even before discovering bitcoin, so uh, nvidia is not better even outside of bitcoin

It blatantly is. Nvidia is best for driver support ( ATI has shittiest drivers ever - 100% CPU bug STILL present now ), gaming performance and general stability and usage as normal GPU. Only get ATI if you want to compute. Avoid otherwise.
Fanboy detected. Worst driver are nvidia: remember the one with the fan bug that made the card MELT?

I have no 100% cpu bug and perfect gaming performance and stability.

Maybe i am a god?

bulanula
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 10, 2011, 10:39:21 AM
 #15

And btw, i had an ATI even before discovering bitcoin, so uh, nvidia is not better even outside of bitcoin

It blatantly is. Nvidia is best for driver support ( ATI has shittiest drivers ever - 100% CPU bug STILL present now ), gaming performance and general stability and usage as normal GPU. Only get ATI if you want to compute. Avoid otherwise.
Fanboy detected. Worst driver are nvidia: remember the one with the fan bug that made the card MELT?

I have no 100% cpu bug and perfect gaming performance and stability.

Maybe i am a god?


You're probably right about the fanboy thing. He doesn't even know about Catalyst apparently, much less using different versions and the effect it has.

CPU bug still present in all versions up to latest 11.9. Dumb lazy AMD developers did not bother to fix it. Don't even try and say that ATI drivers > Nvidia drivers ! ATI cards always lose on benchmarks because of shit drivers while Nvidia always has got optimized drivers etc. so better performance.
astana (OP)
Member
**
Offline Offline

Activity: 98
Merit: 10


View Profile
October 10, 2011, 11:18:21 AM
 #16

Well mining & cheaper prices is about the only good thing ATI/AMD is good for....I may be a noob to the bitcoin world but I'm not to the PC world, Nvidia kills ATI in performance/overclocking ability and yes it does come at a price but it's worth it.

Everyone want's the little guy to succeed including me, but the hard truth is Intel could buy AMD 10 times over if it was allowed to by the government, that's how far behind technology wise AMD is now... They are a mess, their new Bulldozer chips bench 100mhz faster than 2 year old sandy bridge chips  Roll Eyes

And you can get 580GTX's new for $350/$400 if you look hard enough/right contacts... Well I can at least Wink  So If you want one, I'll find you one for a commission Cheesy
Convery
Sr. Member
****
Offline Offline

Activity: 966
Merit: 254



View Profile
October 10, 2011, 12:17:23 PM
Last edit: October 10, 2011, 12:35:17 PM by Convery
 #17

CPU bug still present in all versions up to latest 11.9. Dumb lazy AMD developers did not bother to fix it. Don't even try and say that ATI drivers > Nvidia drivers ! ATI cards always lose on benchmarks because of shit drivers while Nvidia always has got optimized drivers etc. so better performance.

A new driver every 10 days that may burn your card (three driver versions did that) or performs sub-optimally in some games vs a new tested driver every month + some 'hotfixes' to get optimizations for some games before it's added to the final release.

Seriously, I bet you are one of those that say that a Nvidia GTX 460 is faster than a Radeon HD 6990 because 'hurr, the AMD card doesn't have any drivers'..

But ye, as you said;
Bug in AMD drivers = The AMD devs are too stupid or too lazy so they can't get it fixed.
Bug in NVIDIA drivers = The devs are working around the clock to find the best fix for it. They patched it within 10 sec of finding it of course, but they are just looking for the optimal way of doing it. Because they are the best.

*Fanboy detected*


Everyone want's the little guy to succeed including me, but the hard truth is Intel could buy AMD 10 times over if it was allowed to by the government, that's how far behind technology wise AMD is now... They are a mess, their new Bulldozer chips bench 100mhz faster than 2 year old sandy bridge chips  Roll Eyes

Bulldozer will find its way in between the i5 2500 and i7 2600 and thus be placed in the upper tier of gaming-oriented CPUs, just where AMD aims their products (no, you don't need an i7 hexacore to play games). While it's indeed a bit late, they made a processor on a new architecture that will lead the way for upcoming generations; not to mention that a new architecture takes time to develop and delays are unavoidable. Also, the FX series overclock like crazy at a pretty low price.

Not sure how that found its way into a NVIDIA vs. AMD thread, but ye, Intel will probably always be better though they won't beat AMD at price/performance :3


DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
October 10, 2011, 12:47:14 PM
 #18

And you can get 580GTX's new for $350/$400 if you look hard enough/right contacts... Well I can at least Wink  So If you want one, I'll find you one for a commission Cheesy

Why would anyone want one even at that price?  You can get a 5970 for $400 and get 800MH/s.  For a 580GTX to be comparable it would need to be <$100.  Let me know when you can get a 580GTX for $100.  Even then it wouldn't be worth it given the 580GTX burns electricity like it is going out of style.
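(Working that out with the round numbers above, plus the electricity angle; the board power figures and the $0.10/kWh rate below are approximations assumed for the sake of the sketch:)

Code:
# Price a GTX 580 would need to match the 5970 on $/MH/s,
# using the round numbers from the post above.
hd5970_price, hd5970_mhs = 400, 800
gtx580_mhs = 140
parity_price = gtx580_mhs * (hd5970_price / hd5970_mhs)
print(f"GTX 580 parity price: ${parity_price:.0f}")        # ~$70

# Rough electricity comparison (approximate board power, $0.10/kWh assumed).
watts = {"HD 5970": 294, "GTX 580": 244}
mhs   = {"HD 5970": 800, "GTX 580": 140}
for card in watts:
    joules_per_mh = watts[card] / mhs[card]     # energy per megahash
    kwh_per_day   = watts[card] * 24 / 1000     # at full load
    print(f"{card}: {joules_per_mh:.2f} J/MH, "
          f"~${kwh_per_day * 0.10:.2f}/day at $0.10/kWh")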
Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 10, 2011, 01:06:30 PM
 #19

And btw, i had an ATI even before discovering bitcoin, so uh, nvidia is not better even outside of bitcoin

It blatantly is. Nvidia is best for driver support ( ATI has shittiest drivers ever - 100% CPU bug STILL present now ), gaming performance and general stability and usage as normal GPU. Only get ATI if you want to compute. Avoid otherwise.
Fanboy detected. Worst driver are nvidia: remember the one with the fan bug that made the card MELT?

I have no 100% cpu bug and perfect gaming performance and stability.

Maybe i am a god?


You're probably right about the fanboy thing. He doesn't even know about Catalyst apparently, much less using different versions and the effect it has.

CPU bug still present in all versions up to latest 11.9. Dumb lazy AMD developers did not bother to fix it. Don't even try and say that ATI drivers > Nvidia drivers ! ATI cards always lose on benchmarks because of shit drivers while Nvidia always has got optimized drivers etc. so better performance.
Lololol, fanboy detected.

Why i have no cpu bug?

Why my ATI doesn't lose on benchmarks and in gaming?

Why i have optimized drivers for Battlefield3 and Rage?

Oh and ati driver better than nvidia, it's a fact. Unless you are saying driver that make the card MELT are better  Undecided

DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
October 10, 2011, 01:39:20 PM
 #20

Gabi you aren't helping.

If you have 2+ GPU the 100% CPU bug is still present in 11.9 and 11.10 drivers.
If you have 1 AMD GPU then there is no 100% CPU bug in 11.9 (the bug only existed for single-GPU setups from 11.6 to 11.8).
This has been confirmed by AMD Developers on the AMD OpenCL forum.

ALSO DON'T FEED THE TROLLS.  The "Amd fanboy vs Nvidia fanboy" war has waged for almost two decades without any sign of capitulation by either side.  It has been waged across countless forums and usenets.  Hell the first salvo was likely fired on a long since forgotten bulletin board using a 2400 baud modem. 

Nothing you could possibly say will end this war here, so why drag Bitcointalk into the mud?
Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 10, 2011, 01:58:06 PM
 #21

You are right. Sorry.

But you know, war... war never changes  Cheesy

DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
October 10, 2011, 02:05:36 PM
 #22

Ok you made up for it by managing to pull that classic computer game quote into the thread.

dark_silverstar
Member
**
Offline Offline

Activity: 76
Merit: 10



View Profile
October 10, 2011, 02:08:19 PM
 #23

Well, because the prices are cheap and good for gaming. I'm selling my nvidia card because it's so overpriced with only average performance in gaming; even the mid-end cards are overpriced like hell  Angry
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
October 10, 2011, 02:36:06 PM
 #24

So a natural follow up to this discussion would be:

What sort of hash algorithm could be relatively platform neutral?  (This would be kind of ideal, honestly, so the hardware platform is not as much of a concern and free-market pricing and competition work more smoothly.)

OR - What sort of hash algorithm could be pro-Nvidia architecture?

It would be hard, I suspect, to get Bitcoin to amend its hash algorithm with something like this, BUT if it were to ever happen it might be good to have an idea of some options...

My understanding is that Nvidia underperforms AMD in all current cryptographic hashing algorithms.  Hackers and password crackers all over the world use AMD GPUs exclusively.  Whitepixel, for example, is an open source MD5 cracker which has vastly superior performance on AMD GPUs (roughly 400% higher throughput when normalized for price).

There are a couple reasons for this
1) Prior to the Fermi GPUs, Nvidia chips lacked 32-bit integer support internally.  This vastly slows down computations on 32-bit numbers.  Given that 32-bit int is an industry standard for CPU architectures, I don't know of any cryptographic hash which doesn't use 32- or 64-bit numbers internally.

2) Nvidia GPUs lack certain instructions that allow hashing to be completed in fewer steps.  There is no reason Nvidia couldn't add these fast operators in the future, but until now cryptographic performance (and integer performance in general) wasn't an important metric.

3) Nvidia architecture is based on the concept of fewer but more powerful shaders, where AMD is based on the concept of more but simpler shaders.  AMD's architecture simply fits better with the concept of hashing, where multiple simple operations are performed on an operand.

I don't think AMD designed their GPUs to be good at cryptography.  They simply happen to be more efficient at cryptographic functions than NVidia's GPUs are.  My guess is that now that the market has seen what performance can be gained by using GPUs for integer operations, and as GPGPU becomes more common, the market will demand Nvidia deliver better integer performance and better cryptography efficiency.  Some future Nvidia GPU will likely overcome the shortcomings of the current architecture.
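(To make point 2 concrete: SHA-256 spends most of its time on 32-bit rotates and bit-select operations. A small Python sketch of two of those primitives follows; the notes about single-instruction support on AMD (BIT_ALIGN_INT / BFI_INT) reflect the explanation commonly given for the mining gap and should be read as an assumption, not vendor documentation.)

Code:
MASK32 = 0xFFFFFFFF

def rotr(x, n):
    # 32-bit rotate right; SHA-256 performs dozens of these per round.
    # AMD's VLIW GPUs can do a rotate in one instruction (BIT_ALIGN_INT),
    # while Nvidia GPUs of this era emulate it with two shifts and an OR
    # (assumption, per the usual mining-kernel explanation).
    return ((x >> n) | (x << (32 - n))) & MASK32

def ch(e, f, g):
    # SHA-256 "choose": for each bit, take f where e is 1, g where e is 0.
    # Maps onto a single bit-select style instruction (BFI_INT) on AMD
    # hardware, again per the usual explanation.
    return ((e & f) ^ (~e & g)) & MASK32

def big_sigma1(e):
    # One of the per-round mixing functions, built from three rotates.
    return rotr(e, 6) ^ rotr(e, 11) ^ rotr(e, 25)

print(hex(rotr(0x12345678, 8)))     # 0x78123456
print(hex(big_sigma1(0x510E527F)))  # 0x510E527F is SHA-256's initial 'e' value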
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 10, 2011, 04:32:59 PM
Last edit: October 10, 2011, 04:52:42 PM by JackRabiit
 #25

And btw, i had an ATI even before discovering bitcoin, so uh, nvidia is not better even outside of bitcoin

It blatantly is. Nvidia is best for driver support ( ATI has shittiest drivers ever - 100% CPU bug STILL present now ), gaming performance and general stability and usage as normal GPU. Only get ATI if you want to compute. Avoid otherwise.

Is that a fucking troll?
Lets see you use OpenCL---OH WAIT YOU FUCKING CANT.
Okay how about eye-- SHIT YOU CANT DO THAT EITHER
What about 3D gami---- OH YOUR GAMES NEED TO BE PROGRAMMED FOR IT?
Okay... Let's stop bashing you. Can your $500 GTX580 get >200 MH/sec? NOU? IT CAN'T? That's funny, my $120 5830 gets 306 MH/sec.
Sooo let's see here... Nvidia is better how?
A Shader clock? AKA how Nvidia is rigging the market by putting all new graphics tech on the "shader" clock. Cool.
Can you do Tessellation? NOU? COOLSTORYBRO.

Nvidia is not, IMO, in ANY WAY, SHAPE, OR FORM, better than AMD cards.
Heres what Nvidia is doing.
SSAO=Shader clock
AO=Shader clock
Shadows=Shader clock
Bumpmapping=Shaderclock
Lighting effects=Shaderclock
Reflections=Shaderclock
AA=Coreclock
Textures=Memclock
Engine=Coreclock
Physics=Core+Mem+Shader


Their prices are 2x what they should be. And all this bullshit about drivers? Are you FUCKING kidding me. The last time I had a driver issue with ATI was when I went crossfire, and I was fucking 14, and what did I do to solve it? Uninstall, reinstall.
OH THAT'S SUCH A BUG?!!!! LET'S FLAME ATI!.

If you lived nearby me I would fucking smack you for not knowing what you're talking about.

They are programming all modern games to use the shader clock, A clock that NO amd card has.

Think about it. If AMD WAS ALLOWED (WHICH NVIDIA HAS PUT A BLANKET LAWSUIT POLICY OVER DOING THIS) to put a shader clock into their cards, GOODBYE nvidia.
Nvidia has labeled shader clocks "Technological Property of Nvidia", effectively banning AMD from using a shader clock.

And with Nvidia writing the code for all the new games to use the SHADER CLOCK LIKE A FUCKING WHORE, It makes ATI cards Seem to perform worse.

They will perform terribly bad on an Nvidia flagship game. No doubt, no denial.
(old) Dual 5770 xfire = TWENTY fps on NFS Shift, NO MATTER THE GRAPHICS SETTINGS. Maxed and nuked gave the same frame rates.
But dual 5770's on Dirt 2 = max graphics (less the crowd set to low) = frame rate NEVER dropped below !50!

Now you tell me which of those video games graphically looks better.

It's All About The Method Of Testing Performance
And Heres A Fuckton of methods
http://alienbabeltech.com/main/introducing-the-worlds-fastest-graphics-card-amds-flagship-hd-6990/all/1

NOT TO MENTION THAT THE RADEON 6990 IS THE WORLD'S FASTEST GPU. ONLY NVIDIA FANBOYS WOULD SAY OTHERWISE.

And before you drabble at me about Nvidia's claims of their new card being faster, that's if you've got TWO of them VS one 6990.

Meatball
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250



View Profile
October 10, 2011, 06:07:04 PM
 #26

I don't get this, is it price? I thought people would be investing in NVIDIA 580GTX's rather than ATI.

What gives?

I smell a troll.  I don't see why anyone would even post something like this when even a few minutes of reading on the boards/research would clearly show ATI outperforms NVidia in mining.  Seriously people, spend a few minutes looking around before you post.
wndrbr3d
Hero Member
*****
Offline Offline

Activity: 914
Merit: 500


View Profile
October 10, 2011, 06:17:02 PM
 #27

I don't get this, is it price? I thought people would be investing in NVIDIA 580GTX's rather than ATI.

What gives?

I smell a troll.  I don't see why anyone would even post something like this when even a few minutes of reading on the boards/research would clearly show ATI outperforms NVidia in mining.  Seriously people, spend a few minutes looking around before you post.

+1
pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 10, 2011, 06:36:47 PM
 #28

Shit man, not a fan of either here, but I have only 4 or 5 PC games, and one of my favorites I like to play I cannot even play in CrossFireX: CoD World at War freezes up on map load; when the map loads, the game freezes. CoD4 did the same. I found a fix for it: you've got to have 16xAA enabled, which is BS; it drops my frame rates down significantly, as if I was using just one 5830, because I've got to have AA enabled @ 16 to play. What's the point of CrossFireX if I've got to enable this? Btw, this little fix doesn't work for WAW. I don't really care about the looks of the game, I want the highest FPS I can get, 333 or more, which takes advantage of the game; my two 5830's cannot even stay above 110 in most of the games I've got, or can't play them at all.

I pick ATI couple years ago because of price/performance, then a 5830 ended up in my hands, led to mining, led to having two 5830's.

I had a nvidia 6600 couples years back, and that 6600 still performed better on one of the oldest games I've got, cod1 than even using 2 5830's.

It's a choice of preference in what card you want or have expenses you have to get what you want.

Ionno, there is just something about ATI/AMD cards, I don't like using for gaming, somehow frame rates get locked. Nvidia can/could release extra frame rates for the games I play.

This is just my opinion, and please don't bash me for my opinion Smiley.

I'll stick with ATI/AMD for bitcoin mining, once I am able to get a hold of a high end nvidia card, I'm going for it, strictly for PC Gaming.
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 10, 2011, 06:57:52 PM
 #29

Shit man, not any fan here but, I have only 4 or 5 pc games, and one of my favorites I like to play, I cannot even play in crossfirex, Cod world at war, freezes up on map load, not loading map, when the map loads, the game freezes, cod4 did the same, found a fix for it, gotta have x16AA enabled which is bs, drops my frame rates down significantly as if i was using just one 5830 because I've gotta have AA enabled @ 16 to play, whats the point of crossfirex if I gotta enable this, btw this little fix don't work for WAW. I don't really care about looks of the game, I want the highest FPS I can get, 333 or more, which takes advantage of the game, my two 5830's cannot even stay above 110 in most of the games I've got or cannot play at all.

I pick ATI couple years ago because of price/performance, then a 5830 ended up in my hands, led to mining, led to having two 5830's.

I had a nvidia 6600 couples years back, and that 6600 still performed better on one of the oldest games I've got, cod1 than even using 2 5830's.

It's a choice of preference in what card you want or have expenses you have to get what you want.

Ionno, there is just something about ATI/AMD cards, I don't like using for gaming, somehow frame rates get locked. Nvidia can/could release extra frame rates for the games I play.

This is just my opinion, and please don't bash me for my opinion Smiley.

I'll stick with ATI/AMD for bitcoin mining, once I am able to get a hold of a high end nvidia card, I'm going for it, strictly for PC Gaming.

Just a noob question here:
Have you gone into the CCC and:
Set all to application controlled.
Turn OFF AMD optimised surface format for textures and tessellation
(vSyncers use this) Turn on OpenGL triple buffering

pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 10, 2011, 07:15:08 PM
 #30

Just a noob question here:
Have you gone into the CCC and:
Set all to application controlled.
Turn OFF AMD optimised surface format for textures and tessalation
(vSyncers use this) Turn on OpenGL triple buffering

I've tried every setting that possibly could be done.

This is where I found the fix for the cod4 freezing post #7

http://forums.steampowered.com/forums/showthread.php?s=62de95079276636814aa65273ec2a093&p=21474117#post21474117

Post #7 was the fix for cod4; nothing can be done for WAW. There are a ton of threads going around about 5830's in crossfirex and games freezing, and AMD, from what I see, hasn't done a thing about it.
http://forums.steampowered.com/forums/showpost.php?s=62de95079276636814aa65273ec2a093&p=21474117&postcount=7
Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 10, 2011, 07:39:43 PM
 #31

I had an ATI 1900, no problems.

Then had an ATI 3870, no problems.

Now i have an ATI 6950, once more no problems at all.

Everything work fine. All games, everything.

saethan
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
October 10, 2011, 08:59:07 PM
 #32

I don't really care about looks of the game, I want the highest FPS I can get, 333 or more, which takes advantage of the game, my two 5830's cannot even stay above 110 in most of the games I've got or cannot play at all.

So, what monitor are you using where fps that high actually matters?

And don't get me started on what the human brain is actually able to process...
Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 10, 2011, 09:02:53 PM
 #33

It seems on some games (like Quake 3 and maybe some CoD), if you have like 333 fps you can do more things, jump higher or something like that.....

DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
October 10, 2011, 09:05:15 PM
 #34

Imaginary land.

Funny bashing a card for "only" getting 110fps when virtually all LCDs only display 60 frames per second anyway.  Very few are 72 or 75Hz, and even then 100fps is more than enough to drive them to their limit (at 60Hz a new frame can only be shown every ~16.7ms).  Remember an LCD is a physical device; it does take some time to physically pivot the liquid crystals (via electrical impulse) and alter how much light is emitted.  It doesn't matter how many fps a videocard can create in memory, it still takes time for the actual crystals to align.
Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 10, 2011, 09:15:44 PM
 #35

Yes but it's not about the monitor, it's about game physics glitches and fps.

saethan
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
October 10, 2011, 09:19:02 PM
 #36

If for some reason a game has a difference in how it works between 110fps and 333fps, there is some sort of crazy programming going on in the background, unless the difference is being caused by the CPU and not the GPU, or it's PhysX compatibility problems among the ATI/AMD cards.

[edit] One last thing, did crossfire even -exist- when Q3 was released?  That's also the same graphics engine used in CoD1.  The reason I ask is the person I originally replied to was comparing a 6600 nvidia to crossfire 5830s in CoD1.

Though a single 5830 should still smoke a 6600 nvidia unless the Q3 engine was designed with Nvidia specs in mind (which they did for Quake 2 and 3dfx - my voodoo 3 outperformed much faster GeForces in Quake 2 for years - so I wouldn't be surprised if they did the same for Quake 3 and Nvidia).
pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 10, 2011, 10:26:58 PM
Last edit: October 10, 2011, 10:47:11 PM by pekv2
 #37

I don't really care about looks of the game, I want the highest FPS I can get, 333 or more, which takes advantage of the game, my two 5830's cannot even stay above 110 in most of the games I've got or cannot play at all.

So, what monitor are you using where fps that high actually matters?

And don't get me started on what the human brain is actually able to process...

I'm using a 17" monitor. This is not about what I can see. I clearly said in my post that it's about taking advantage of the games I play.

It seems on some game (like quake 3 and maybe some cod) if you have like 333 fps you can do more things, jump more or something like that.....

Gabi has it correct: with 125 or 333 FPS, you can jump higher, get onto ledges and boxes, and take advantage of the games.

This isn't console, this is PC Gaming at the best.

Imaginary land.

Funny bashing a card for "only" getting 110fps when virtually all LCD only display 60 frames per second anyways.  Very few are 72 or 75Hz even then 100fps is more than enough to drive them to their limit.  Remember an LCD is a physical device it does take some time to physically pivot the liquid crystal (via electrical impulse) and alter how much light that is emitted.  Doesn't matter how many fps a videocard can create in memory it still takes time for the actual crystal to align.

Well, that will be your little secret.

Yes but it's not about the monitor, it's about game physics glitches and fps.

Gabi is correct once again. Smiley

If for some reason a game has a difference in how it works between 110fps and 333fps, there is some sort of crazy programming going on in the background, unless the difference is being caused by the CPU and not the GPU, or it's PhysX compatibility problems among the ATI/AMD cards.

[edit] One last thing, did crossfire even -exist- when Q3 was released?  That's also the same graphics engine used in CoD1.  The reason I ask is the person I originally replied to was comparing a 6600 nvidia to crossfire 5830s in CoD1.

Though a single 5830 should still smoke a 6600 nvidia unless the Q3 engine was designed with Nvidia specs in mind (which they did for Quake 2 and 3dfx - my voodoo 3 outperformed much faster GeForces in Quake 2 for years - so I wouldn't be surprised if they did the same for Quake 3 and Nvidia).

The game might be designed like that, but I've always felt ATI/AMD locked something down to keep high FPS from burning up the card. This was also speculated about ATI/AMD while using the Furmark benchmark program.

But all in all, these little fps tricks work for COD1 & United offensive, COD2, COD4, CODWAW, BO, and most likely what ever other game that used the same gaming engine.

Edit:
On another note:
GTX 480/490 GTX 580/580 x2/590 would throw cod games over 1000 FPS, and if PB is enabled, you will get kicked for it, but if PB is disabled, there will be no problem. AMD 6990 would probably get 250 fps in cod games.
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 10, 2011, 11:06:51 PM
 #38

Who said anything about having Vsync actually enabled? Anyone with 61+ fps usually has vsync off.

Fuck lol I'm hitting 380fps average on CS:S maxxxed out

pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 10, 2011, 11:25:35 PM
 #39

Who said anything about having Vsync actually enabled?, Anyone with 61+fps usually has vsnc off

Fuck lol im hitting 380fps avrg of CS:S Maxxxed out

For all the time I've heard about Counter-Strike, I've actually never played it. Is it really good?
saethan
Newbie
*
Offline Offline

Activity: 42
Merit: 0


View Profile
October 10, 2011, 11:57:56 PM
 #40

http://en.wikipedia.org/wiki/IW_engine

Not exactly popular outside the CoD line.  Smiley

I'm going to bet that taking advantage of something weird in particular game engines, where there's a real difference between 110 and 333 fps (I take this comparison from the 110 and 333 mentioned earlier), is a very small niche in the overall gaming graphics market.  But hey, if you'll throw the cash at it, more to ya.

Jack: vsync or not, you're only seeing however many of those frames your monitor is capable of displaying.  If 380fps means you can do funky tricks people with 90fps can't, then there's something screwy with the engine imho(or it's just old enough that, when designed, they didn't even bother considering multiple-hundreds of fps), but when it comes to what you're actually seeing, you're only seeing what vsync would show you.
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 11, 2011, 12:33:42 AM
 #41

http://en.wikipedia.org/wiki/IW_engine

Not exactly popular outside the CoD line.  Smiley

I'm going to bet taking advantage of something weird in particular game engines that means a real difference between 110 and 333 fps(I take this comparison from the 110 333 mentioned earlier) is a very small niche in the overall gaming graphics market.  But hey, if you'll throw the cash at it, more to ya.

Jack: vsync or not, you're only seeing however many of those frames your monitor is capable of displaying.  If 380fps means you can do funky tricks people with 90fps can't, then there's something screwy with the engine imho(or it's just old enough that, when designed, they didn't even bother considering multiple-hundreds of fps), but when it comes to what you're actually seeing, you're only seeing what vsync would show you.

Point taken. I was about to start going "Dude, there's such a thing as V-tears" and went "oh lol, right, they're there because of that".

astana (OP)
Member
**
Offline Offline

Activity: 98
Merit: 10


View Profile
October 13, 2011, 10:32:25 AM
 #42

Hey guys no troll intended I just wondered why people use AMD/ATI and now I know. Wink


Everyone want's the little guy to succeed including me, but the hard truth is Intel could buy AMD 10 times over if it was allowed to by the government, that's how far behind technology wise AMD is now... They are a mess, their new Bulldozer chips bench 100mhz faster than 2 year old sandy bridge chips  Roll Eyes

Bulldozer will find it's way in between the i5 2500 and i7 2600 and thus be placed in the upper tier of gaming oriented CPUs so just where AMD aims their products (No you don't need a i7 hexacore to play games). While it's indeed a bit late they made a processor on a new architecture that will lead the way for upcoming generations, not to mention that a new architecture takes time to develop and delays are unavoidable. Also, the FX series overclock like crazy with a pretty low price.

Not sure how that found it's way in in a NVIDIA vs. AMD thread but ye, Intel will probably always be better though they wont beat AMD at price/performance :3

Well, ATI is owned by AMD. Intel would buy Nvidia, but antitrust lawsuits would come as soon as the merger happened.

Problem is Intel is going to drop the 2500K to $150/$125 when Sandy Bridge-E X79 comes out, and at that price point it smokes Bulldozer pound for pound. I love AMD, I wish they would be like they were back in the X2 64 days when they pushed Intel hard to compete with them, but the way Bulldozer turned out, people are going to be losing their jobs  Embarrassed. It's not as bad as some thought, but really they should have done better.

I would never buy first-release/gen technology; you always lose out because it takes them 6-12 months to iron out the bugs/BIOS updates, and it costs you a premium.
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 13, 2011, 05:23:17 PM
Last edit: October 14, 2011, 03:06:11 PM by JackRabiit
 #43

Hey guys no troll intended I just wondered why people use AMD/ATI and now I know. Wink


Everyone want's the little guy to succeed including me, but the hard truth is Intel could buy AMD 10 times over if it was allowed to by the government, that's how far behind technology wise AMD is now... They are a mess, their new Bulldozer chips bench 100mhz faster than 2 year old sandy bridge chips  Roll Eyes

Bulldozer will find it's way in between the i5 2500 and i7 2600 and thus be placed in the upper tier of gaming oriented CPUs so just where AMD aims their products (No you don't need a i7 hexacore to play games). While it's indeed a bit late they made a processor on a new architecture that will lead the way for upcoming generations, not to mention that a new architecture takes time to develop and delays are unavoidable. Also, the FX series overclock like crazy with a pretty low price.

Not sure how that found it's way in in a NVIDIA vs. AMD thread but ye, Intel will probably always be better though they wont beat AMD at price/performance :3


Y'know... I was about to flame the fuck outta you for saying that the Bulldozers were going to "settle in" between the i5-i7 high enders.
But then I kept reading and noticed that you know what the fuck you're talking about lol.
If game/program coding stays the same, then yes, I could totally see the Bulldozer Zambezi chips being a failure. However, there are two words that always seem to go "haha":
Moore's Law.
As it stands... I do not believe that Bulldozer will "settle in" where you estimate it to. I think if they get it right... and they use all 8 cores properly, then no way is it merely "as weak as or as strong as" the Intel i7 EX.
I'll admit I'm an AMD fanboy. (Loved the Athlon II x64's vs Intel Core 2.)

Wait, where am I going with this.....
Oh. If AMD doesn't do some good coding for their CPUs, then yeah, it's gonna perform like an i7.
But considering the 16-core Istanbul... I think they'll get the coding done just fine.... after a year LOL.

Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 13, 2011, 05:40:48 PM
 #44

I don't get this, is it price? I thought people would be investing in NVIDIA 580GTX's rather than ATI.

What gives?

Oh. Question for you: what caused you to think of any reason why people would be picking GTX580's.... Dual GTX580's in SLI = 5-11 more fps than one 6990.

Nvidia argues that the 6990 is not one video card, and that it is nothing more than two 6970's together.
So nvidia feels they have the right to run two of their cards against the 6990.

UHHMMM DERP? The 6990 fills one slot, and can go Quadfire without issue.
That's 8 GPU's and four cards.

Nvidia would only be able to do QuadSLI and get 4 GPUs.

Lol = half the strength.

Nvidia believes that the 6990 is not """A""" video card because it has two GPU's; this is why they claim that their single-GPU GTX580 is better. HOWEVER, YOU NEED TWO GTX580'S!!!! to match ONE 6990.

Or in layman's terms:
6970+6970=THE 6990.
GTX580=GTX580
GTX580+GTX580=Two video cards

They constantly claim that the 6990 is not one video card, but that it is in fact two video cards.

Note: one GTX580 beats the 6970. But we're not talking about ""A"" 6970, we're talking about """A""" 6990.

So here's what Nvidia's claims are actually saying.

Nvidia: "Our GTX580 is better than your HD6970x2 because you need two of them to make one 6990."
AMD: "Dude, it fits in one slot, it can Quadfire with no issues, and there is no such tech to go beyond quadfire, so you're claiming that we're placing eight cards into a "crossfire" alignment, which is physically impossible to do."
Nvidia: "It's two 6970's in crossfire inside a box plugged into one slot."
AMD: "And what's the problem? What's "WRONG" about two cards being compressed into one card?"
Nvidia: "It's not one GPU, it's two GPU's. So that's two Graphics Processing Units."
AMD: "Two GPU's combined into one card, that fits into one slot, that Crossfires and Quadfires without issue, proving that it's one card, because there's no such thing as 8-way crossfire."
Nvidia: "But our one GPU beats your one GPU."
AMD: "That's not the point. This is about the faster graphics card. If I get shit done faster, better, and fit into the same slot, I don't see why I'm void of being called ""A"" GPU, because I don't think you can stick two different GPU's into the same PCI-E slot at the same time."

Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 13, 2011, 08:11:26 PM
 #45

Lol, nvidia is so funny


DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
October 13, 2011, 08:19:32 PM
 #46

If I were Nvidia I would do the same thing.  It is called marketing.

AMD claims to have the fastest graphics card.
Nvidia claims to have the fastest GPU.

For 99.99999999999999999999999999999999999999999999999% of consumers it means the same thing.

This gives Nvidia an even fight in marketing department.  If they outmarket AMD then a majority of consumers will think Nvidia has the fastest card.

Nvidia isn't lying just relying on the fact that most consumers are "mentally lazy".

Honestly, if I was an Nvidia shareholder and they WEREN'T doing this I would be pissed, because it is their job to maximize shareholder wealth, and creating value from the technicality that they have the fastest chip is one way to do that.

Reality check.  Most consumers don't care about TDP, slot designs, reference coolers, shader counts, ALU efficiency, etc.  They want a "fast" (can't even really define that) card so they can go "pew pew" and shoot their friends online.  If they think Nvidia is "fast" then they will buy an Nvidia even if all their budget allows them to get is a 550 GTX Ti.  550 is close to 580 right?   Grin
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 13, 2011, 11:21:36 PM
 #47

If I were Nvidia I would do the same thing.  It is called marketing.

AMD claims to have the fastest graphics card.
Nvidia claims to have the fastest GPU.

For 99.99999999999999999999999999999999999999999999999% of consumers it means the same thing.

This gives Nvidia an even fight in marketing department.  If they outmarket AMD then a majority of consumers will think Nvidia has the fastest card.

Nvidia isn't lying just relying on the fact that most consumers are "mentally lazy".

Honestly if I was an Nvidia shareholder and they WEREN'T doing this I would be pissed because it is their job to mazimize shareholder wealth and creating value from the technicallity that they have the fastest chip is one way to do that.

Reality check.  most consumers don't care about TDP, and slot designs, reference coolers, shader couts, ALU efficiency, etc.  They want a "fast" (can't even really define that) card so they can go "pew pew" and shoot their friends online.  If they think Nvidia is "fast" then they will buy an Nvidia even if their budget only allows them to get is a 550 GTX Ti.  550 is close to 580 right?   Grin
+1
Story of human trading

pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 14, 2011, 12:04:35 AM
 #48

Info on cod fps.

http://wiki.modsrepository.com/index.php/Call_of_Duty_:_A_Study_on_FPS
RyNinDaCleM
Legendary
*
Offline Offline

Activity: 2408
Merit: 1009


Legen -wait for it- dary


View Profile
October 14, 2011, 03:53:07 AM
 #49

My 6950 new, cost $300. Unlocked to a 6970 and performs 10% less than a 580, for <¾ the price.

GPU's are priced according to their performance. The 580 is more expensive than a 6970 for this reason. nVidia DOES have the 590, which IS better than a 6990, though only slightly and at *much* higher power consumption. (To me, this is nVidia's biggest drawback!)

Both teams have driver issues. They both suck in that aspect.

And, 2x 6990=Quad fire!

Gabi
Legendary
*
Offline Offline

Activity: 1148
Merit: 1008


If you want to walk on water, get out of the boat


View Profile
October 14, 2011, 11:44:36 AM
 #50

Who cares about cod? We have BATTLEFIELD 3

pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 14, 2011, 12:00:08 PM
 #51

Who cares about cod? We have BATTLEFIELD 3

I was leaning the link towards saethan.
https://bitcointalk.org/index.php?topic=47507.msg566804#msg566804
But next time I will quote Smiley.

Second, BF3 is shit. I won't even go into the reasons for this. Have fun being raped by EA/DICE.
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 14, 2011, 03:10:10 PM
 #52

Who cares about cod? We have BATTLEFIELD 3

I was leaning the link towards saethan.
https://bitcointalk.org/index.php?topic=47507.msg566804#msg566804
But next time I will quote Smiley.

Second, Bf3 is shit. I won't even go there for the reasons of this. Have fun being raped by EA/Dice.
Ten moar points to victory: EA games either smash the game outta the park and it's awesome,
or it's a FUCKING PIECE OF SHIT.

What's this about the GTX590 being faster than dual 6970's (the 6990)?

Also, just did some studying... and yeah... no reports in either direction of whether or not some dude has hooked up four 6990's.

P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 14, 2011, 03:20:30 PM
 #53

Just for the record, nVidia also has the 100% CPU bug. At least it has on my old Ubuntu rig that's temporarily running an 8800GT. 100% load.

The LT
Full Member
***
Offline Offline

Activity: 186
Merit: 100



View Profile WWW
October 14, 2011, 03:36:51 PM
 #54

Wow, is this really worth a discussion?
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 15, 2011, 01:24:12 AM
 #55

Wow, is this really worth a discussion?
not anymore

bronan
Hero Member
*****
Offline Offline

Activity: 774
Merit: 500


Lazy Lurker Reads Alot


View Profile
October 15, 2011, 11:04:13 AM
Last edit: October 15, 2011, 01:53:28 PM by bronan
 #56

lol
my 2 cents on this is easy

NVIDIA suxx big time: I have had the most dying cards from nvidia, the worst drivers, and they are the biggest scammers with their endless rebranding of the same product.
Now even today, after 4 months of use, another card died, and again it was nvidia crap. So the score this far: NVIDIA, 5 out of 8 died, dead, kaput, gone.
ATI: only 1 really died out of 27; true, 1 other has been replaced, but it was still working even at 110 C temps.
And yes, when you overclock these cards they will slow down in time, but then again you wanted to overclock, and in most cases they will not die completely.
So far all the cards except the dead one are still working, though not overclocked, with family and friends, and all are happy with my old cards.
Yes, ati needs to put some more money into driver design, which in my view would pay off big time, but I do favor any ati above all nvidia; only on the low-budget cards I'd say it does not matter which you buy.

Sure, nvidia works better on a few games with their product, but YOU PEOPLE must understand those games are totally made for these cards, and the makers make sure ati will never run better than the big-time-paying scammer nvidia.
Yes, nvidia pays them a lot of money to keep their product fastest; in products where no cards are favored by the secret donations (or whatever you wanna call the payments made by nvidia) you see a totally different score.
Now yes, some games will benefit from one or the other, but to call ATI crap is way too stupid; the parts ati uses are way better quality than what nvidia is using, hence the nice cheap capacitors that blew up. ATI has been using the best Japanese ones as far as I know. And again, a way lower dying rate.

So to end this discussion: NVIDIA sells crap, period.
shakaru
Sr. Member
****
Offline Offline

Activity: 406
Merit: 250


QUIFAS EXCHANGE


View Profile
October 16, 2011, 12:04:16 AM
 #57



Sure nvidia works on a few games better on their product but YOU PEOPLE must understand those games are totally made for these cards and the makers make sure ati will never run better then the paying big time scammer nvidia.
Yes nvidia pays them a lot of money for keep their product fastest, in products where no cards are favored by the secret donations ( or whatever you wanna call the payements made by nvidia )  you see a totally different score.
Now yes some games will benefit from one or the other but to call ATI crap is way too stupid the parts ati uses are way better quality as nvidia is doing, hence the nice cheap capacitors who blew up. ATI has been using the best japanese ones as far as i know. And again has a way lower dying rate

So to end this discussion NVIDIA sells crap period.


Um....what?! You win the award for #IDontHaveTheFactToBackThisUp

You need to do some research into what DirectX and OpenGL are, what a function call is, how a driver affects hardware, and what rebranding is (i.e. buy an eMachine and slap AndyComp on it: that is rebranding. Buying a design, building it on PCB in fabrication, and selling it is not rebranding).

Nesetalis
Sr. Member
****
Offline Offline

Activity: 420
Merit: 250



View Profile
October 16, 2011, 12:48:09 AM
 #58

the cpu bug thing? yeah, I had that.... on my Nvidia GT 240... 100% CPU usage on 1 core while mining.
now with my Radeon 6790 I have 0-1% cpu usage...

Transisto
Donator
Legendary
*
Offline Offline

Activity: 1731
Merit: 1008



View Profile WWW
October 16, 2011, 01:34:17 AM
 #59

Please, Just Stop.
bronan
Hero Member
*****
Offline Offline

Activity: 774
Merit: 500


Lazy Lurker Reads Alot


View Profile
October 16, 2011, 08:25:09 AM
 #60

lol, not enough people bite the bait, no fun Cheesy
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 16, 2011, 09:23:55 PM
 #61

Please, Just Stop.
+1
Sick of seeing this come back up
Just stop.
Last.

Sargasm
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
October 17, 2011, 03:03:59 AM
 #62

I think this thread is pretty funny.

Nvidia cards tend to be a lot prettier for gaming.  I have had both tri fire 5970+5870 and now quadfire 5970s and I'd definitely give the pure smooth sexiness award to nvidia.  Fucking tearing, ATI, wtf. Well... Tearing, stutters and screen flickers really.

If I weren't making money with my cards, they'd be kinda dumb.

ALTHOUGH as a caveat... The 69xx series by ATI is by far the smoother renderer for games.  Competitive with nvidia even.
pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 17, 2011, 04:10:27 AM
 #63

Tearing, stutters and screen flickers really.

Yup, same here, sick of the BS. Don't know how it works for Nvidia, but ATI's PowerPlay is fucking stupid and it's what causes the screen flickering. The only way to fix/disable that shit is to use MSI Afterburner or hack into the ATI driver and set something up to disable it. But, gawd lordy, I hope Nvidia doesn't have that, or at least gives the end user an easy way to disable its version of PowerPlay.

There are a whole lot of people who agree on how stupid PowerPlay is on ATI/AMD cards.

Btw, the Sapphire Trixx programmer doesn't seem to care about adding an option to disable PowerPlay like MSI Afterburner has.
Sargasm
Member
**
Offline Offline

Activity: 112
Merit: 10


View Profile
October 17, 2011, 04:16:27 AM
 #64

The idea is novel, but implementation is horse shit.

ATI is working from behind though (and doing well on the whole). I got a 5790 for 310 off eBay that's still WELL worth the price. Nvidia's OCD-style attention to detail got a shitload of my money for years. AMD is doing a half-decent job of catching up, but Nvidia (much like Intel recently) has done a spectacular job of keeping its on-screen rendering limited to that which is smooth rather than solely that which is fast.
Nesetalis
Sr. Member
****
Offline Offline

Activity: 420
Merit: 250



View Profile
October 17, 2011, 07:51:47 AM
 #65

Since ATI was eaten by AMD, AMD has released and open-sourced much of the drivers, and it looks like they are moving toward releasing them all. This will make AMD's drivers FAR better than Nvidia's in the long run.
With open specs, open drivers, and hundreds of thousands of eyes looking at the code, they will get fixed and working much quicker... well, if you're on Linux Tongue but it will roll over to Windows too.

ZOMG Moo!
P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 17, 2011, 08:08:43 AM
 #66

since ATI was eaten by AMD.. AMD has released and opensourced much of the drivers..

No they haven't. They have released partial specs for older cards, so the community has been able to build usable drivers. Not great drivers, but usable. Well, if you don't game, that is.

Quote
and look like they are moving toward releasing them all. This will make AMD's drivers FAR better than nvidias in the long run.
with open specs, open drivers, and hundreds of thousands of eyes looking at the code, they will get fixed and working much quicker.... well, if you're on linux Tongue but it will roll over to windows too.

I've heard nothing of AMD (or Nvidia) planning to open up their proprietary drivers. Even so, much as I am an OSS fan, creating good 3D video drivers is no easy task and requires in-depth knowledge of the underlying hardware. I wouldn't expect miracles from open source here. Just look at Intel GPU drivers: they are open source, and have been for ages, but they still utterly and completely suck. Let's not mention VIA Chrome drivers. Love 'em or hate 'em, nVidia is head and shoulders above the competition when it comes to Linux drivers.



Nesetalis
Sr. Member
****
Offline Offline

Activity: 420
Merit: 250



View Profile
October 17, 2011, 01:15:18 PM
 #67

I suppose, but then again, Intel itself was completely terrible in the GPU department until Sandy Bridge, still doesn't match up with dedicated cards, and BARELY breaks even against AMD's APU shit.

ZOMG Moo!
P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 17, 2011, 01:35:49 PM
 #68

Kind of my point. If Intel, with all their might and even with the help of the OSS community, can't make even half-baked Linux drivers for their relatively simple hardware (not sure why you exclude Sandy Bridge btw, as that's a complete trainwreck on Linux), I wouldn't hold my breath for the OSS community to out-engineer nVidia here, particularly not without full, unrestricted access to all the specs, and without having those specs years before the hardware's release like the internal driver teams at AMD and Nvidia do.

Now, I do agree that over the past years AMD has made remarkable progress, particularly with its Windows gaming drivers, but the gap with Nvidia is still huge on Linux (and with Nvidia's new focus on Tegra and Linux-based Android, I don't expect AMD to close that gap anytime soon).

Anyway, for me it's incredibly simple: for Bitcoin mining there is obviously only one choice. For Windows gaming either is good, with AMD generally having a price/performance advantage. For Linux and most professional apps, nVidia is the obvious choice.

Nesetalis
Sr. Member
****
Offline Offline

Activity: 420
Merit: 250



View Profile
October 17, 2011, 02:00:48 PM
 #69

I wasn't talking about the drivers, I was talking about the hardware. The GMA 3000 is their best yet, but it barely compares to the AMD APUs.

ZOMG Moo!
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 17, 2011, 02:17:07 PM
 #70

Tearing, stutters and screen flickers really.

Yup, same here, sick of the bs. Don't know how it works for nvidia, but ati's powerplay is fucking stupid, that causes screen flickering. Only way to fix/disable that shit, use MSIAfterburner or hack into the ati driver and set something up to disable it. But, gawd lordy, I hope nvidia don't have that shit or nvidia gives end user the choice to easily disable PowerPlay.

There are a whole lot of people that agree on how stupid powerplay is with ati/amd cards.

Btw, sapphire trixx programmer seems not to care to implement to disable powerplay like msiafterburner has.

Just, right off the bat (AMD fanboy here): what the fuck is PowerPlay? And don't tell me to fucking google it, I want YOU to tell me what it does, because I've never heard of it.
Tearing? I'll just assume you're not talking about vsync tears. And well, can't argue about that. Some games just fuck up on certain ATI drivers and it's annoying as hell.
Stutters? That's the easy one to fix. Go into the CCC and turn off AMD Optimized Tessellation as well as Surface Format Optimization; those options are for crappy cards and cause stuttering on high-end ones (I'm running crossfired XFX 6870 Black Edition dual-fans and I was stuttering like a whore on crack before I turned this off). Then set the rest to "application controlled".

Screen flickers... that was a Crossfire bug. I had it in Crysis 2 for a lil' while, but with a driver update it vanished (DX11 hi-res advanced).
And I could stop the screen flickers by turning on Vsync.

d.james
Sr. Member
****
Offline Offline

Activity: 280
Merit: 250

Firstbits: 12pqwk


View Profile
October 17, 2011, 03:48:54 PM
 #71

Before I found out about Bitcoin I bought an Nvidia GTX 570, a sweet, solid GPU with 3D Vision support.

After Bitcoin, I traded that 570 for an XFX 5850 + 5870; it was a sweet trade at the time Smiley

You can not roll a BitCoin, but you can rollback some. Cheesy
Roll me back: 1NxMkvbYn8o7kKCWPsnWR4FDvH7L9TJqGG
TurboK
Full Member
***
Offline Offline

Activity: 136
Merit: 100



View Profile
October 18, 2011, 12:14:25 AM
 #72

Just. Right off the bat. (AMD fanboy here) What the Fuck is Powerplay?, And dont tell me to fucking google it, I want YOU to tell me what it does. Because i've never heard of it

Media buzzword for a function where the card switches GPU speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150 MHz, plays videos at 400 MHz, runs games at 800 MHz, crap like that.

Problem is that it switches automatically according to load, and some apps may only trigger the switch to the high-speed modes once they've already stuttered for a while... and when running at full speed, the odd game may hit some low-complexity scene that takes less power to render, so the card switches back to idle mode mid-game and only clocks back up after some more stuttering.
This behavior may be optimized per-game though; I've only seen it happen in some emulators, which don't exactly get support from the driver team, and in windowed mode too.
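To make that switching behaviour concrete, here is a tiny illustrative model (not AMD's actual algorithm; the clocks, thresholds and frame costs below are invented numbers): a governor that only reacts to the previous frame's load will downclock during a cheap scene and then stutter on the next heavy frame, exactly the pattern described above.

Code:
# Toy model of a load-based clock governor. All numbers are made up for
# illustration; this is not how the real PowerPlay driver is implemented.
STATES = [150, 400, 800]                     # MHz: idle, video, 3D

def governor(load_pct):
    """Naive governor: choose a state from the load seen on the *previous* frame."""
    if load_pct < 20:
        return STATES[0]
    if load_pct < 60:
        return STATES[1]
    return STATES[2]

# Frame "cost" in MHz-milliseconds: heavy scenes, a brief cheap scene, heavy again.
frame_costs = [12000, 12000, 1500, 1500, 12000, 12000]

clock, prev_load = STATES[2], 100.0
for i, cost in enumerate(frame_costs):
    clock = governor(prev_load)              # reacts one frame late
    frame_ms = cost / clock                  # lower clock -> longer frame time
    prev_load = min(100.0, frame_ms / 16.7 * 100)   # load vs a 60 fps frame budget
    tag = "STUTTER" if frame_ms > 25 else "ok"
    print(f"frame {i}: clock={clock:>3} MHz  frame_time={frame_ms:5.1f} ms  {tag}")

Running it shows the clock dropping to the idle state during the cheap frames and a 30 ms hitch on the first heavy frame afterwards, which is the mid-game stutter being complained about.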

However, it's still better than how Nvidia cards idle at 60°C and burn the fuck down if the cooling fan doesn't run for a moment. And no, Radeons don't burn if the fan gets stopped either. When I was testing my 5850 with a passive Accelero heatsink (no fan), the card hit 130°C and then instantly halved its own speed so temps could drop and the thing didn't melt itself on the spot.

Now, I've owned ATI cards for a long time, and I agree that the drivers have several retarded issues. But to say that Nvidia has better drivers, that's just Nvidia-paid fanboy ranting nowadays. And a lot of the issues come from the fact that the typical gamer has an average of 96 processes running on his PC at the same time.

12zJNWtM2HknS2EPLkT9QPSuSq1576aKx7

Tradehill viral bullshit code: TH-R114411
bluefirecorp
Legendary
*
Offline Offline

Activity: 882
Merit: 1000


View Profile
October 18, 2011, 12:29:26 AM
 #73

lol@this thread.

Okay, so either someone honestly didn't know or they are trolling, but why say "how come, because Nvidia is SO MUCH BETTER"? Poppycock.

I have used NVIDIA GeForce 240s here in Korea to mine, and they get up to about 30 MH/s with practically no additional power. That is their plus. If you have a botnet, GeForce is totally practical and cost-effective.

If you're consolidating and want a single rig, Nvidia is retarded.

Wait a second, a botnet? Are you sure you are using that term correctly?

P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 18, 2011, 06:23:12 AM
 #74


Media buzzword for a function where the card switches gpu speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150mhz, plays videos at 400mhz, runs games at 800mhz, crap like that.

Problem is that it automatically switches according to load, and some apps may only trigger the switch to high speed modes once they already stuttered for a while...

That's not the worst part. Attach a second monitor and see what happens!

seljo
Legendary
*
Offline Offline

Activity: 1178
Merit: 1014


Hodling since 2011.®


View Profile
October 18, 2011, 06:56:42 AM
 #75

Go NVIDIA make that opencl fly I dare you! Smiley

Hodling since 2011.®
pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 18, 2011, 08:43:58 AM
 #76


Media buzzword for a function where the card switches gpu speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150mhz, plays videos at 400mhz, runs games at 800mhz, crap like that.

Problem is that it automatically switches according to load, and some apps may only trigger the switch to high speed modes once they already stuttered for a while...

Thats not the worst part. Attach a second monitor and see what happens!

I lol'd when I read this. If anyone has experienced this, they'll lol too. So true... so annoying... Nothing worse than watching your 2nd monitor jump like a fucking rabbit with tear lines across the screen. Common sense, ATI/AMD: PowerPlay is broke as a bitch. I guess it never came to ATI/AMD's minds to do testing before releasing technology. *Insert Facepalm Here*
P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 18, 2011, 08:56:25 AM
 #77

nVidia disables their variant of PowerPlay when you attach a second monitor. People bitch about the high idle temps with two monitors, I guess rightly so, but it sure beats the unbearable screen tearing you get on AMD and the incredible hoops you have to jump through to try and disable PowerPlay. In the end I gave up and just used MSI Afterburner to fix the clocks and make dual monitors usable. Kinda ironic how AMD markets their cards for 6-way Eyefinity but can't seem to make 2 monitors work.

shakaru
Sr. Member
****
Offline Offline

Activity: 406
Merit: 250


QUIFAS EXCHANGE


View Profile
October 18, 2011, 09:15:36 AM
 #78


Media buzzword for a function where the card switches gpu speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150mhz, plays videos at 400mhz, runs games at 800mhz, crap like that.

Problem is that it automatically switches according to load, and some apps may only trigger the switch to high speed modes once they already stuttered for a while...

Thats not the worst part. Attach a second monitor and see what happens!

I lold when I read this. If anyone has experienced this, they'll lol too. So true... So annoying... Nothing worse than watch your 2nd monitor jump like a fucking rabbit & with tear lines in the screen. Common sense, ATI/AMD, powerplay is broke as a bitch, I guess it never came to the minds of ati/amd to do testing before releasing technology. *Insert Facepalm Here*

I actually found a way to deal with this thanks to mining. I had this issue on some of the lower-end cards, from the 5450 up to the 5830. I found that if I ran cgminer with the cards disabled for mining, but set the clocks beforehand, I could keep it from switching. This seemed to stop after ver 2.0.3.

bronan
Hero Member
*****
Offline Offline

Activity: 774
Merit: 500


Lazy Lurker Reads Alot


View Profile
October 18, 2011, 11:21:40 AM
 #79

Well, lol, I can't resist answering again. I found when I was gaming that all the games that open with a "made for NVIDIA" splash have a problem with PowerPlay; I wonder whether any of the games I haven't played do it too.
So far all of them had that crap green logo, and of course I have not played every game and never will. And yes, PowerPlay can be dealt with by using the BIOS editor to turn it completely off.
Now, to be honest, I don't think you'd like that unless the card is in a dedicated miner.
For all those who, like me, do more things than mining on their PC, switching to lower power consumption does lower the huge bill,
and we like that, even though it can be a pain in the ass XD
Yes, the solutions JackRabitt showed worked wonders for me too; I actually still use some of them when needed.
I would like to see these companies release the drivers as open source, because I know there are a lot of wizards out there who are much better than the ones working at those companies.
Remember the Omega drivers? If not, then you're really not from this world; those were awesome.
Many of those guys made failing drivers from either company work like they should.
Sadly they all stopped, mostly because they lost their jobs or simply disappeared; that is the issue with open source, but I am certain people would come back if they could get some donations from the people using their work.
So for now you are stuck with the programmers from ATI and Nvidia, who need a lot of time to fix some issues, like the Crossfire problem that took ages xD. Now I don't dare say they suck, but lol, sometimes a fix made in one version comes back broken in the next, and yes, on both brands.
I still say AMD has to invest more in driver programmers, because it will pay off >.<

n4l3hp
Full Member
***
Offline Offline

Activity: 173
Merit: 100


View Profile
October 18, 2011, 02:53:03 PM
 #80

lol
my 2 cents on this is easy

NVIDIA suxx big time. I have had the most cards die on me from Nvidia, the worst drivers, and they are the biggest scammers, endlessly rebranding the same product.
Even today, after 4 months of use, another card died, and again it's an Nvidia. So the score so far for NVIDIA: 5 out of 8 died, dead, kaput, gone.
For ATI, only 1 out of 27 really died; true, 1 other has been replaced, but it was still working even at 110°C.
And yes, when you overclock these cards they will slow down over time, but then again you wanted to overclock, and in most cases they will not die completely.
So far all the cards except the dead one are still working, no longer overclocked, for family and friends, and all are happy with my old cards.
Yes, ATI needs to put some more money into driver development, which in my view will pay off big time, but I favor any ATI over any Nvidia; only for the low-budget cards would I say it does not matter which you buy.

Sure, a few games run better on Nvidia's product, but YOU PEOPLE must understand those games are made entirely for those cards, and the makers make sure ATI will never run them better than the big-paying scammer Nvidia.
Yes, Nvidia pays them a lot of money to keep their product fastest; in titles where no card is favored by the secret donations (or whatever you want to call the payments made by Nvidia) you see a totally different score.
Now, yes, some games will benefit from one or the other, but to call ATI crap is way too stupid; the parts ATI uses are of way better quality than what Nvidia uses, hence the nice cheap capacitors that blew up. ATI has been using the best Japanese ones as far as I know, and again has a way lower failure rate.

So, to end this discussion: NVIDIA sells crap, period.


+1

Before I got into BOINC and then Bitcoin, I didn't care what card I bought as long as it was readily available at my local computer store and I could afford it. Over the years, I can't count how many I've bought and sold. My two sons' computers used to have NVIDIA cards for gaming, while I used ATI/AMD cards on my personal rigs that were running BOINC and are now mining BTC.

Guess what: all the NVIDIA cards died (only used for gaming at stock settings), while my 3850s and 4850s are still alive and crunching BOINC (all OC'ed and running 24/7 for a few years) and the 6870s are mining BTC without hiccups.

Same for the motherboards: all the ones with Nvidia chipsets died, usually a few months after the warranty expired (I used to run an internet cafe business until last year), except for my old trusty Epox nForce 4 Ultra (with a dual-core Socket 939 Athlon 64), which found a new home inside my wife's computer with a 5670 attached to it.
bronan
Hero Member
*****
Offline Offline

Activity: 774
Merit: 500


Lazy Lurker Reads Alot


View Profile
October 18, 2011, 04:10:38 PM
 #81

Exactly, n4l3hp.
I have been running BOINC and BOINC-like projects for more than a decade and some ATI cards still run without any issue.
Funny enough, I also have a DFI nForce 4 SLI Expert that is still alive with an Opteron 175 on it; it was only used to run MilkyWay@home before I'd had enough of BOINC for now. I think I've sponsored them a pretty house by now, so time to move on.

I must say the Intel boards and Compaqs are pretty sturdy as well; a few are still running with a P2 or P3 on them in some dark corners of a factory hall, doing their work despite the tons of dust piled up inside.
I have cleaned them a couple of times, but lol, those beasts keep running without a hitch Cheesy
That makes me wonder if the old ATI cards will do the same Cheesy; the oldest is an ATI Radeon VE with 16 MB, which is still being used by a game junkie to run older games that no longer run on these super fast new cards.

 
   
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 18, 2011, 06:01:00 PM
 #82

Okay seriously, it's not an "incredible hoop" to do this to fix "powerplay":
Run regedit
Ctrl+F -> ULPS
Set every EnableUlps value that is 1 to 0
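For anyone who'd rather script that than click through regedit, here is a minimal Python sketch of the same idea (Windows only, run as administrator, back up your registry first). It assumes the value is the commonly reported EnableUlps DWORD under the display-adapter class key, so treat it as illustrative rather than official.

Code:
# Minimal sketch of the regedit steps above: walk the display-adapter class
# subkeys and set every EnableUlps DWORD to 0. Windows-only, needs admin
# rights; the key and value names are the commonly reported ones (assumption).
import winreg

DISPLAY_CLASS = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

def disable_ulps():
    root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS)
    index = 0
    while True:
        try:
            sub_name = winreg.EnumKey(root, index)   # per-adapter subkeys: "0000", "0001", ...
        except OSError:                              # ran out of subkeys
            break
        index += 1
        try:
            sub = winreg.OpenKey(root, sub_name, 0,
                                 winreg.KEY_READ | winreg.KEY_SET_VALUE)
            current, _ = winreg.QueryValueEx(sub, "EnableUlps")
            if current != 0:
                winreg.SetValueEx(sub, "EnableUlps", 0, winreg.REG_DWORD, 0)
                print(f"{sub_name}: EnableUlps {current} -> 0")
        except FileNotFoundError:
            pass                                     # this subkey has no EnableUlps value
        except PermissionError:
            print(f"{sub_name}: access denied (run as administrator)")

if __name__ == "__main__":
    disable_ulps()

A reboot, or at least a driver restart, is typically needed before the change takes effect.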

DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
October 18, 2011, 06:04:37 PM
 #83

Okay seriously, It's not an "incredible hoop" to do this to fix "powerplay":
Run
Regedit
Ctrl+F->ULPS
Set all Enable_ulps(1)'s to 0's

However, that merely disables the low-voltage mode. Which is fine for 24/7 mining rigs, but the entire point of ULPS was to reduce heat, noise, and power consumed when the card is at idle. If the solution is to turn it off, that doesn't reflect well on AMD.
Transisto
Donator
Legendary
*
Offline Offline

Activity: 1731
Merit: 1008



View Profile WWW
October 18, 2011, 06:10:58 PM
 #84

Please OP, change the title to

"NVIDIA Vs. ATI fanboy arena."
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 18, 2011, 06:24:44 PM
 #85

Okay seriously, It's not an "incredible hoop" to do this to fix "powerplay":
Run
Regedit
Ctrl+F->ULPS
Set all Enable_ulps(1)'s to 0's

However that merely disables low voltage mode.  Which is fine for 24/7 mining rigs but the entire point of ULPS was to reduce heat, noise, and power consumed when card is at idle.  If the solution is to turn it off that doesn't reflect well on AMD.

No? My cards never change clocks after I've changed this (unless of course I tell them to), and I never play with my voltages anyway.

pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 19, 2011, 02:03:26 AM
 #86

Okay seriously, It's not an "incredible hoop" to do this to fix "powerplay":
Run
Regedit
Ctrl+F->ULPS
Set all Enable_ulps(1)'s to 0's

I think ULPS and powerplay are two different things.

This is a post of mine from a while ago:
http://www.techpowerup.com/forums/showpost.php?s=b44704028293c434a2409d528e7af1db&p=2416625&postcount=58

Hard route to disable PP.
http://www.techpowerup.com/forums/showthread.php?s=b44704028293c434a2409d528e7af1db&t=117633
TurboK
Full Member
***
Offline Offline

Activity: 136
Merit: 100



View Profile
October 19, 2011, 05:09:25 AM
 #87


Media buzzword for a function where the card switches gpu speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150mhz, plays videos at 400mhz, runs games at 800mhz, crap like that.

Problem is that it automatically switches according to load, and some apps may only trigger the switch to high speed modes once they already stuttered for a while...

Thats not the worst part. Attach a second monitor and see what happens!

I often have two monitors, well, a monitor and an HDTV, connected at the same time. Last month I had my dad checking car ads for 1-2 hours while I was playing Gradius V on PCSX2 on the other screen. Absolutely no problems experienced whatsoever, on a Radeon 6950.

nVidia disables their variant of powerplay when you attach a second monitor. People bitch about high idle temps with two monitors. I guess rightly so, but it sure beats the unbearable screen tearing you have on AMD and the incredible hoops you have to jump through to try and disable powerplay. In the end I gave up and just used MSI afterburner to fix the clocks and have dual monitor be useful. Kinda ironic how AMD markets their cards for 6 way eyefinty but cant seem to make 2 monitors work.

AMD cards run at higher clocks in dual monitor mode as well.

Also, on my end the tearing is caused by the screens having different refresh rates and the card only syncing to one of them. I get a lot of tearing on one monitor when I have both screens on, too. I honestly don't give a crap, because I rarely use two monitors at the same time; there's barely any point in it. I pretty much only do it when others are using my computer at the same time.

P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 19, 2011, 06:35:43 AM
 #88

. Absolutely no problems experienced whatsoever, on a radeon 6950.

....

Also, on my end the tearing is caused by the screens having different refresh rates and the card only syncing one of them. I get a lot of tearing too on one monitor when I have both screens on.

Right. "No problems whatsoever" expect for horrific tearing. Which btw, isnt caused by the refresh rates. Fix your clocks and the tearing will go away.

Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 20, 2011, 02:23:20 PM
 #89

. Absolutely no problems experienced whatsoever, on a radeon 6950.

....

Also, on my end the tearing is caused by the screens having different refresh rates and the card only syncing one of them. I get a lot of tearing too on one monitor when I have both screens on.

Right. "No problems whatsoever" expect for horrific tearing. Which btw, isnt caused by the refresh rates. Fix your clocks and the tearing will go away.

OWNED +0

Someone wanna show me what these "tears" are? I know we're not talking about vsync tears here...
Are we talking about skewed/stretching textures? Photos/video please.


http://bitcoin-otc.com/viewratingdetail.php?nick=DingoRabiit&sign=ANY&type=RECV <-My Ratings
https://bitcointalk.org/index.php?topic=857670.0 GAWminers and associated things are not to be trusted, Especially the "mineral" exchange
pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 20, 2011, 02:29:48 PM
 #90

Someone wanna show me what these "tears" are, I Know were not talking about Vsync tears here...
Are we talking about Skewed/Streching textures? Photos/Video please.

My uber pwner MSPaint skillz at work, :p. LOLz.
When the crap happens, a band forms with lines where the orange box is and then goes away, and the entire screen jumps madly. Of course on the actual screen it ain't orange. It's about the same spot where it hits me on my 2nd monitor.
TurboK
Full Member
***
Offline Offline

Activity: 136
Merit: 100



View Profile
October 20, 2011, 05:44:22 PM
 #91

Well, I never got tears like that. Only semi-broken vsync.

P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 20, 2011, 06:59:02 PM
 #92

Someone wanna show me what these "tears" are, I Know were not talking about Vsync tears here...
Are we talking about Skewed/Streching textures? Photos/Video please.

Here is an example:
http://www.youtube.com/watch?v=uYExKJB3K84

pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 20, 2011, 09:24:36 PM
 #93

Someone wanna show me what these "tears" are, I Know were not talking about Vsync tears here...
Are we talking about Skewed/Streching textures? Photos/Video please.

Here is an example:
http://www.youtube.com/watch?v=uYExKJB3K84

Yup, exactly that...
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 20, 2011, 09:49:03 PM
 #94

Someone wanna show me what these "tears" are, I Know were not talking about Vsync tears here...
Are we talking about Skewed/Streching textures? Photos/Video please.

Here is an example:
http://www.youtube.com/watch?v=uYExKJB3K84

Yup, exactly that...
EWWWW, I've never seen that before in my life, and I went and plugged two monitors into my dual 6870s to see if it would happen...
Annnnnnnnnnd.
No, it did not. Qua?
So I went to Eyefinity to see if it would show up there, annnnnnnd, no, it did not.
Perhaps it's because I'm using two video cards in Crossfire?

P4man
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
October 20, 2011, 09:52:42 PM
 #95

Could be because of Crossfire. Do your cards clock all the way down to 150/300 MHz or thereabouts? The problem occurs because the cards clock up and down from their lowest speed to something barely faster, which they do when scrolling and such. The solution is fixing the clocks at a higher rate so PowerPlay doesn't adjust frequencies and voltages every time you scroll a webpage, but CF may have done that for you.

pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 20, 2011, 09:59:24 PM
 #96

The solution is fixing the clocks at a higher rate so powerplay doesnt adjust frequencies and voltages every time you scroll in a webpage

That's the problem for me. 1040 core & 300 mem on card one, and 1015 core & 600 mem on card two. Trixx cannot disable PowerPlay; MSI Afterburner can, whether the clocks are set high or not. I'm using Trixx because MSI Afterburner cannot control voltages on my brand of cards. I've recently swapped to Trixx since it's cooler out now, so I'm upping the voltage for better stability at 1040 core.

Edit:
Trying to keep the memory clock at its lowest for less power consumption.
RobertRibbeck
Full Member
***
Offline Offline

Activity: 221
Merit: 100


View Profile
October 20, 2011, 11:04:29 PM
 #97

Ya, right.

Anyone care to explain the crash of this GOOD/better Nvidia when closing GPU crunching?

They might be better for gamers, but for mining they SUCK.


Please "Clear your browser cookies" then use http://bitcoinpyramid.com/r/3360 to Join BitCoin Pyramid
  use my referral & I'll refund a % of your first deposit back to your account
  Deposit .5 BTC or more and I'll give back 50% of what I receive

First Deposit of 1 BTC will get 75% of what I get back
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 21, 2011, 03:16:55 AM
 #98

Ya right

anyone care to explain the crash of this GOOD/beter nvida when closing gpu crunching

they might be better for  gamers but for miming they SUCK



Go do your homework.
https://en.bitcoin.it/wiki/Mining_hardware_comparison
Here's a hint: SP stands for Stream Processing Units.
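To actually do that homework with numbers, here is a rough Python sketch of the comparison the wiki enables. Both cards and every dollar figure below are placeholders made up for illustration; substitute the real hash rates, prices and wattages from the wiki and your own electricity cost.

Code:
# Back-of-the-envelope price/performance comparison. The card entries and the
# revenue/electricity figures are placeholder assumptions, not wiki data.
CARDS = {
    "example ATI card":    {"mhash": 300.0, "price_usd": 300.0, "watts": 200.0},
    "example Nvidia card": {"mhash": 130.0, "price_usd": 450.0, "watts": 240.0},
}

ELECTRICITY_USD_PER_KWH = 0.10   # assumed electricity price
USD_PER_MHASH_PER_DAY = 0.05     # assumed daily revenue per MH/s (varies with difficulty and BTC price)

for name, c in CARDS.items():
    revenue_per_day = c["mhash"] * USD_PER_MHASH_PER_DAY
    power_cost_per_day = c["watts"] / 1000.0 * 24 * ELECTRICITY_USD_PER_KWH
    profit_per_day = revenue_per_day - power_cost_per_day
    payback_days = c["price_usd"] / profit_per_day if profit_per_day > 0 else float("inf")
    print(f"{name}: {c['mhash'] / c['price_usd']:.2f} MH/s per $, "
          f"{c['mhash'] / c['watts']:.2f} MH/s per watt, "
          f"payback ~ {payback_days:.0f} days")

Whatever real numbers you plug in, MH/s per dollar and MH/s per watt are what decide the payback time, which is why this thread keeps pointing at the wiki rather than at game benchmarks.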

Ashkelon
Full Member
***
Offline Offline

Activity: 188
Merit: 100


View Profile
October 22, 2011, 05:39:51 PM
 #99


Out of all the time I've heard of counter strike, I've actually never played it. Is it really good?

Heretic! Burn him!

CS is great, although at this stage if you try playing you'll be going up against guys who have reflexes slightly faster than their own CPUs and you'll mostly be dead.


EDIT Going to play CSS now

dominus mysteria
Fiyasko
Legendary
*
Offline Offline

Activity: 1428
Merit: 1001


Okey Dokey Lokey


View Profile
October 22, 2011, 05:53:00 PM
 #100


Out of all the time I've heard of counter strike, I've actually never played it. Is it really good?

Heretic! Burn him!

CS is great, although at this stage if you try playing you'll be going up against guys who have reflexes slightly faster than their own CPUs and you'll mostly be dead.


EDIT Going to play CSS now
+1
If you didn't play when it was the #1 online FPS, then you're going to get killed, a lot, and you will be calling everyone a hacker.
Seriously.
Unless of course you don't pay for the game and all you fight are noobs who don't know what a "scrim" is when you say "come scrim me, you pansy".

I score 46s on average (http://mindbluff.com/reaction.htm) and I've got one thousand one hundred fifty-seven hours of gameplay on CS:S.
If you can't score 35+ as your average, don't even bother playing CS:S.

Also, I score 230s on average on http://www.humanbenchmark.com/tests/reactiontime/index.php
We kick anyone with a latency above 180 ms.

pekv2
Hero Member
*****
Offline Offline

Activity: 770
Merit: 502



View Profile
October 22, 2011, 06:10:34 PM
 #101

Found this CS video, looks radical.

http://www.youtube.com/watch?v=BLCWYRFQim0

For real though, I looked at gameplay of CS and it reminds me of RTCW. I might look into it; hopefully there's a standalone disc being sold somewhere.

My reaction time is top notch from gaming, I'll be fine with CS.

The time I've got documented for gaming is 3018 hours, and that's not including the undocumented time.

Ashkelon
Full Member
***
Offline Offline

Activity: 188
Merit: 100


View Profile
October 22, 2011, 06:46:49 PM
 #102


yup, that's CS alright

dominus mysteria
shakaru
Sr. Member
****
Offline Offline

Activity: 406
Merit: 250


QUIFAS EXCHANGE


View Profile
October 23, 2011, 08:21:25 AM
 #103


Out of all the time I've heard of counter strike, I've actually never played it. Is it really good?

Heretic! Burn him!

CS is great, although at this stage if you try playing you'll be going up against guys who have reflexes slightly faster than their own CPUs and you'll mostly be dead.


EDIT Going to play CSS now
+1
If you didnt play when it was the #1onlineFPS, Then your going to get killed, Alot, And you will be calling everyone a hacker.
Seriously.
Unless ofcourse you dont pay for the game and all you fight is noobs who dont know what a "scrim" is when you "come scrim me you pansy"

I score 46's on average,http://mindbluff.com/reaction.htm and i've got One Thousand One Hundred Fifty Seven Hours, Of gameplay on cs:s
If you cant score 35+ as your average, Dont even Bother playing cs:s

Also, I score 230's avrg on http://www.humanbenchmark.com/tests/reactiontime/index.php
We kick anyone with a latency above 180ms

268 first time
