Bitcoin Forum
December 05, 2016, 10:41:45 AM *
News: Latest stable version of Bitcoin Core: 0.13.1  [Torrent].
 
Pages: « 1 2 3 [4] 5 6 »  All
Author Topic: Why do people only buy ATI cards when NVIDIA is so much better?  (Read 10109 times)
Fiyasko
Legendary
*
Offline Offline

Activity: 1428


Okey Dokey Lokey


View Profile
October 16, 2011, 09:23:55 PM
 #61

Please, Just Stop.
+1
Sick of seeing this come back up
Just stop.
Last.

http://bitcoin-otc.com/viewratingdetail.php?nick=DingoRabiit&sign=ANY&type=RECV <-My Ratings
https://bitcointalk.org/index.php?topic=857670.0 GAWminers and associated things are not to be trusted, Especially the "mineral" exchange
Sargasm
Member
**
Offline Offline

Activity: 112


View Profile
October 17, 2011, 03:03:59 AM
 #62

I think this thread is pretty funny.

Nvidia cards tend to be a lot prettier for gaming. I have had both trifire 5970+5870 and now quadfire 5970s, and I'd definitely give the pure smooth sexiness award to nvidia. Fucking tearing, ATI, wtf. Well... tearing, stutters and screen flickers, really.

If I weren't making money with my cards, they'd be kinda dumb.

ALTHOUGH as a caveat... The 69xx series by ATI is by far the smoother renderer for games. Competitive with nvidia even.
pekv2
Hero Member
*****
Offline Offline

Activity: 770



View Profile
October 17, 2011, 04:10:27 AM
 #63

Tearing, stutters and screen flickers really.

Yup, same here, sick of the bs. Don't know how it works for nvidia, but ati's powerplay is fucking stupid; it causes screen flickering. The only way to fix/disable that shit is to use MSI Afterburner or hack into the ati driver and set something up to disable it. But, gawd lordy, I hope nvidia doesn't have that shit, or that nvidia gives the end user the choice to easily disable PowerPlay.

There are a whole lot of people that agree on how stupid powerplay is with ati/amd cards.

Btw, the Sapphire Trixx programmer seems not to care to implement a powerplay disable like MSI Afterburner has.
Sargasm
Member
**
Offline Offline

Activity: 112


View Profile
October 17, 2011, 04:16:27 AM
 #64

The idea is novel, but the implementation is horse shit.

ATI is working from behind though (and doing well on the whole). I got a 5970 for 310 off ebay that's still WELL worth the price. Nvidia's OCD-style attention to detail got a shitload of my money for years. AMD is doing a half-decent job of catching up, but Nvidia (much like Intel recently) has done a spectacular job of keeping their on-screen rendering limited to that which is smooth rather than solely that which is fast.
Nesetalis
Sr. Member
****
Offline Offline

Activity: 420



View Profile
October 17, 2011, 07:51:47 AM
 #65

Since ATI was eaten by AMD, AMD has released and open-sourced much of the drivers, and it looks like they are moving toward releasing them all. This will make AMD's drivers FAR better than nvidia's in the long run.
With open specs, open drivers, and hundreds of thousands of eyes looking at the code, they will get fixed and working much quicker.... well, if you're on linux :P but it will roll over to windows too.

ZOMG Moo!
P4man
Hero Member
*****
Offline Offline

Activity: 504



View Profile
October 17, 2011, 08:08:43 AM
 #66

since ATI was eaten by AMD.. AMD has released and opensourced much of the drivers..

No they haven't. They have released partial specs for older cards, so the community has been able to build usable drivers. Not great drivers, but usable. Well, if you don't game, that is.

Quote
and look like they are moving toward releasing them all. This will make AMD's drivers FAR better than nvidias in the long run.
with open specs, open drivers, and hundreds of thousands of eyes looking at the code, they will get fixed and working much quicker.... well, if you're on linux Tongue but it will roll over to windows too.

I've heard nothing of AMD (or nvidia) planning to open up their proprietary drivers. Even so, much as I am an OSS fan, creating good 3D video drivers is no easy task and requires in-depth knowledge of the underlying hardware. I wouldn't expect miracles from open source here. Just look at Intel GPU drivers: they are open source, and have been for ages, but they still utterly and completely suck. Let's not mention VIA Chrome drivers. Love 'em or hate 'em, nVidia is head and shoulders above the competition when it comes to Linux drivers.



Nesetalis
Sr. Member
****
Offline Offline

Activity: 420



View Profile
October 17, 2011, 01:15:18 PM
 #67

I suppose, but then again, Intel itself, in the GPU department, was completely terrible until Sandy Bridge, still doesn't match up with dedicated components, and BARELY breaks even against AMD's APU shit.

ZOMG Moo!
P4man
Hero Member
*****
Offline Offline

Activity: 504



View Profile
October 17, 2011, 01:35:49 PM
 #68

Kind of my point. If Intel, with all their might and even with the help of the OSS community, can't make half-baked linux drivers (not sure why you exclude Sandy Bridge btw, as that's a complete trainwreck on linux) for their relatively simple hardware, I wouldn't hold my breath for the OSS community to out-engineer nVidia in this regard, particularly not without full unrestricted access to all the specs, and without having those specs years before the hardware's release like the internal driver teams at AMD and nvidia have.

Now I do agree that over the past years AMD has made remarkable progress, particularly with windows gaming drivers, but the gap with nvidia is still huge on linux (and with nvidia's new focus on tegra and linux-based android, I don't expect AMD to close that gap anytime soon).

Anyway, for me it's incredibly simple: for bitcoin mining there is obviously only one choice. For windows gaming, either is good, with AMD generally having a performance-per-dollar advantage. For Linux and most professional apps, nVidia is the obvious choice.

Nesetalis
Sr. Member
****
Offline Offline

Activity: 420



View Profile
October 17, 2011, 02:00:48 PM
 #69

I wasn't talking about the driver, I was talking about the hardware.. the GMA 3000 is their best yet, but it barely compares to the AMD APUs.

ZOMG Moo!
Fiyasko
Legendary
*
Offline Offline

Activity: 1428


Okey Dokey Lokey


View Profile
October 17, 2011, 02:17:07 PM
 #70

Tearing, stutters and screen flickers really.

Yup, same here, sick of the bs. Don't know how it works for nvidia, but ati's powerplay is fucking stupid, that causes screen flickering. Only way to fix/disable that shit, use MSIAfterburner or hack into the ati driver and set something up to disable it. But, gawd lordy, I hope nvidia don't have that shit or nvidia gives end user the choice to easily disable PowerPlay.

There are a whole lot of people that agree on how stupid powerplay is with ati/amd cards.

Btw, sapphire trixx programmer seems not to care to implement to disable powerplay like msiafterburner has.

Just, right off the bat (AMD fanboy here): what the fuck is Powerplay? And don't tell me to fucking google it, I want YOU to tell me what it does, because I've never heard of it.
Tearing? I'll just assume that you're NOT talking about screen V-tears. And well, can't argue about that. Some games just fuck up on certain ATi drivers and it's annoying as hell.
Stutters? That's the easy one to fix: go into the CCC and turn off AMD Optimised Tessellation as well as Surface Format Optimisation. These options are for crappy cards and cause stuttering on high-end ones (I'm running crossfired XFX 6870 Black Edition dual-fans and I was stuttering like a whore on crack before I turned this shit off). Then set the rest to "application controlled".

Screen flickers.... that was a Crossfire bug. I had it on Crysis 2 for a little while, but then with a driver update it vanished (dx11 HiRes Advanced).
I could also stop the screen flickers by turning on Vsync.

d.james
Sr. Member
****
Offline Offline

Activity: 280

Firstbits: 12pqwk


View Profile
October 17, 2011, 03:48:54 PM
 #71

Before I found out about bitcoin I bought an Nvidia GTX 570, a sweet, solid GPU with 3D Vision support.

After bitcoin, I traded that 570 for an XFX 5850 + 5870; it was a sweet trade at the time :)

You can not roll a BitCoin, but you can rollback some. :D
Roll me back: 1NxMkvbYn8o7kKCWPsnWR4FDvH7L9TJqGG
TurboK
Full Member
***
Offline Offline

Activity: 137



View Profile
October 18, 2011, 12:14:25 AM
 #72

Just. Right off the bat. (AMD fanboy here) What the Fuck is Powerplay?, And dont tell me to fucking google it, I want YOU to tell me what it does. Because i've never heard of it

A media buzzword for a function where the card switches gpu speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150MHz, plays videos at 400MHz, runs games at 800MHz, crap like that.

The problem is that it switches automatically according to load, and some apps may only trigger the switch to high-speed modes once they've already stuttered for a while... and when running at full speed, one or two games may have some low-complexity scene that takes less power to render, so the card switches back to idle mode mid-game, and only switches back up after some stuttering again.
This behavior may be optimized per-game though; I've only seen it happen in some emulators, which don't exactly get support from the driver team. And in windowed mode too.

However, it's still better than how nvidia cards idle at 60°C and burn the fuck down if the cooling fan doesn't run for a moment. And no, Radeons don't burn if the fan gets stopped either. When I was testing my 5850 with a passive Accelero heatsink (no fan), the card hit 130°C and then instantly halved its own speed so temps could drop and the thing didn't melt itself on the spot.

Now, I've been owning Ati cards for a long time, and I agree that the drivers have several retarded issues. But to say that nvidia has better drivers, that's just nvidia-paid fanboy ranting nowadays. And a lot of the issues come out of the fact that the typical gamer has an average of 96 processes going on his PC at the same time.
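
The load-driven clock switching described above can be sketched as a toy governor. This is an illustration of the failure mode, not AMD's actual PowerPlay algorithm: the clock tiers, the hysteresis value, and the one-step reaction delay are all invented for the example.

```python
# Toy model of a PowerPlay-style governor: pick a clock tier from GPU load.
# Tiers, thresholds and hysteresis are illustrative, not AMD's real numbers.
TIERS = [150, 400, 800]  # MHz: idle, video playback, 3D

def pick_tier(load, current, hysteresis=0.10):
    """Return the lowest tier whose clock covers the observed load.

    The current tier gets a small hysteresis band so tiny load wobbles
    don't cause constant reclocking.
    """
    for clk in TIERS:
        band = hysteresis if clk == current else 0.0
        if load <= clk * (1 + band):
            return clk
    return TIERS[-1]

def simulate(loads):
    """Run the governor over a load trace.

    The governor only reacts to load it has already seen, so a sudden
    demand spike runs one step at the old (too-low) clock -- the stutter
    the thread complains about -- and a brief low-complexity scene drops
    the card back to idle mid-game.
    """
    clk = TIERS[0]
    trace = []
    for load in loads:
        stutter = load > clk           # demanded more than the clock gave
        trace.append((clk, stutter))
        clk = pick_tier(load, clk)     # governor reacts *after* the frame
    return trace

# A game session with a brief low-complexity scene in the middle:
loads = [100, 700, 700, 120, 700]
for clk, stutter in simulate(loads):
    print(clk, "STUTTER" if stutter else "ok")
# prints: 150 ok / 150 STUTTER / 800 ok / 800 ok / 150 STUTTER
```

The second STUTTER is exactly the mid-game downclock described above: the quiet scene at step 4 convinces the governor to drop to idle, and the next demanding frame pays for it.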

12zJNWtM2HknS2EPLkT9QPSuSq1576aKx7

Tradehill viral bullshit code: TH-R114411
bluefirecorp
Hero Member
*****
Offline Offline

Activity: 686


View Profile
October 18, 2011, 12:29:26 AM
 #73

lol@this thread.

Okay, so either someone honestly didn't know or they are trolling, but why say "How come, because nvidia is SO MUCH BETTER"? Poppycock.

I have used NVIDIA GeForce 240s here in Korea to mine and they get up to about 30Mh/s with practically no additional power. That is their plus. If you have a botnet, GeForce is totally practical and cost effective.

If you're condensing and want a single rig, Nvidia is retarded.

Wait a second, a botnet? Are you sure you are using that term correctly?

P4man
Hero Member
*****
Offline Offline

Activity: 504



View Profile
October 18, 2011, 06:23:12 AM
 #74


Media buzzword for a function where the card switches gpu speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150mhz, plays videos at 400mhz, runs games at 800mhz, crap like that.

Problem is that it automatically switches according to load, and some apps may only trigger the switch to high speed modes once they already stuttered for a while...

That's not the worst part. Attach a second monitor and see what happens!

seljo
Legendary
*
Offline Offline

Activity: 1077


Hodling.


View Profile
October 18, 2011, 06:56:42 AM
 #75

Go NVIDIA, make that opencl fly, I dare you! :)

Hodling since 2011.®
pekv2
Hero Member
*****
Offline Offline

Activity: 770



View Profile
October 18, 2011, 08:43:58 AM
 #76


Media buzzword for a function where the card switches gpu speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150mhz, plays videos at 400mhz, runs games at 800mhz, crap like that.

Problem is that it automatically switches according to load, and some apps may only trigger the switch to high speed modes once they already stuttered for a while...

Thats not the worst part. Attach a second monitor and see what happens!

I lol'd when I read this. If anyone has experienced this, they'll lol too. So true... so annoying... Nothing worse than watching your 2nd monitor jump like a fucking rabbit, with tear lines in the screen. Common sense, ATI/AMD: powerplay is broke as a bitch. I guess it never came to the minds of ati/amd to do testing before releasing technology. *Insert Facepalm Here*
P4man
Hero Member
*****
Offline Offline

Activity: 504



View Profile
October 18, 2011, 08:56:25 AM
 #77

nVidia disables their variant of powerplay when you attach a second monitor. People bitch about high idle temps with two monitors, and I guess rightly so, but it sure beats the unbearable screen tearing you get on AMD and the incredible hoops you have to jump through to try and disable powerplay. In the end I gave up and just used MSI Afterburner to fix the clocks and make dual monitors usable. Kinda ironic how AMD markets their cards for 6-way Eyefinity but can't seem to make 2 monitors work.
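
(The "just pin the clocks" escape hatch had a rough Linux equivalent in this era: the open-source radeon KMS driver exposed sysfs knobs that swap dynamic reclocking for a static profile. A sketch only; `card0` and root access are assumptions about your setup, and the knobs apply to the old radeon driver, not fglrx.)

```shell
# Pin a Radeon to a fixed power profile so dynamic reclocking stops.
# Assumes the open-source radeon KMS driver; card0 may differ per machine.
cd /sys/class/drm/card0/device

# switch from dynamic power management to static profiles
echo profile > power_method

# "low" = idle clocks, "high" = max clocks; "high" never downclocks,
# which sidesteps the dual-monitor flicker at the cost of idle power
echo high > power_profile

# confirm what the driver settled on
cat power_method power_profile
```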

shakaru
Sr. Member
****
Offline Offline

Activity: 364


View Profile WWW
October 18, 2011, 09:15:36 AM
 #78


Media buzzword for a function where the card switches gpu speeds on the fly for idle mode, video-only mode, etc. So it idles at like 150mhz, plays videos at 400mhz, runs games at 800mhz, crap like that.

Problem is that it automatically switches according to load, and some apps may only trigger the switch to high speed modes once they already stuttered for a while...

Thats not the worst part. Attach a second monitor and see what happens!

I lold when I read this. If anyone has experienced this, they'll lol too. So true... So annoying... Nothing worse than watch your 2nd monitor jump like a fucking rabbit & with tear lines in the screen. Common sense, ATI/AMD, powerplay is broke as a bitch, I guess it never came to the minds of ati/amd to do testing before releasing technology. *Insert Facepalm Here*

I actually found a way to deal with this due to mining. I had this issue on some of the lower-end cards, 5450 through 5830. I found that if I ran cgminer with the cards disabled for mining, but set the clocks beforehand, I could keep it from switching. This seemed to stop working after ver 2.0.3.

bronan
Hero Member
*****
Offline Offline

Activity: 765


Lazy Lurker Reads Alot


View Profile WWW
October 18, 2011, 11:21:40 AM
 #79

Well lol, I can't resist answering again. I found when I was gaming that all the games that open with the "made for NVIDIA" logo have a problem with powerplay; I wonder if any game I haven't played does it too. So far all of them had that crap green logo, and of course I haven't played all games and never will. And yes, powerplay can be addressed by using the BIOS editor and turning it completely off.
Now to be honest, I don't think you'd like that unless it's a dedicated miner. For all those who, like me, do more than mining on their PC, switching to lower power consumption does lower the huge bill, and we like that even though it can be a pain in the ass XD
Yes, the solutions JackRabitt showed worked wonders for me too; I actually still use some of them if needed.
I would like to see those companies release the drivers to open source, because I know there are a lot of wizards who are much better than the ones working at those companies.
Remember the Omega drivers? No? Then you're really not from this world; those were awesome.
Many of these guys made failing drivers from either company work like they should.
Sadly they all stopped, mostly because they lost their jobs or completely disappeared; that is the issue with open source, but I am certain people would come back if they could get some donations from the people using it.
So for now you are stuck with the programmers from ati and nvidia, who need a lot of time to fix some issues, like the crossfire problem which took ages xD. Now I don't dare say they suck, but lol, sometimes a fix done in the previous version comes back broken in the next, and yes, on both brands.
I still say AMD has to invest more in driver programmers because it will pay off >.<

n4l3hp
Full Member
***
Offline Offline

Activity: 173


View Profile
October 18, 2011, 02:53:03 PM
 #80

lol
my 2 cents on this is easy

NVIDIA suxx big time. I have had the most dying cards from nvidia, the worst drivers, and they are the biggest scammers with their endless rebranding of the same product.
Now even today, after 4 months of use, another card died, and again it's nvidia crap. So the score this far: NVIDIA, 5 out of 8 died, dead, kaput, gone.
ATI: only 1 really died out of 27; true, 1 other has been replaced, but it was still working even at 110°C.
And yes, when you overclock these cards they will slow down in time, but then again, you wanted to overclock, and in most cases they will not die completely.
This far all the cards except the dead one are still working (not overclocked) for family and friends, and all are happy with my old cards.
Yes, ATI needs to put more money into driver design, which in my view will pay off big time, but I favor any ATI above all nvidia; only on the low-budget cards do I say it doesn't matter which you buy.

Sure, nvidia works better in a few games, but YOU PEOPLE must understand those games are totally made for these cards, and the makers make sure ATI will never run better than the big-time paying scammer nvidia.
Yes, nvidia pays them a lot of money to keep their product fastest; in products where no cards are favored by the secret donations (or whatever you wanna call the payments made by nvidia) you see a totally different score.
Now yes, some games will benefit from one or the other, but to call ATI crap is way too stupid; the parts ATI uses are way better quality than what nvidia is doing, hence the nice cheap capacitors that blew up. ATI has been using the best Japanese ones as far as I know. And again, a way lower dying rate.

So to end this discussion: NVIDIA sells crap, period.


+1

Before I got into BOINC and then Bitcoin, I didn't care what card I bought as long as it was readily available at my local computer store and I could afford it. Over the years, I can't count how many I've bought and sold. Both of my two sons' computers used to have NVIDIA cards for gaming, while I used ATI/AMD cards on my personal rigs that were running BOINC and are now mining BTC.

Guess what: all the NVIDIA cards died (only used for gaming on stock settings), while my 3850s and 4850s are still alive and crunching BOINC (all OC'ed and running 24/7 for a few years) and the 6870s are mining BTC without hiccups.

Same for the motherboards: all that had nvidia chipsets died, usually a few months after the warranty expired (I used to operate an internet cafe business until last year), except for my old trusty Epox nForce 4 Ultra (with a dual-core socket 939 Athlon 64) that found a new home inside my wife's computer with a 5670 attached to it.