Gabi
Legendary
Offline
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
October 10, 2011, 01:58:06 PM
You are right. Sorry. But you know, war... war never changes
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
October 10, 2011, 02:05:36 PM
OK, you made up for it by managing to pull that classic computer game quote into the thread.
dark_silverstar
Member
Offline
Activity: 76
Merit: 10
October 10, 2011, 02:08:19 PM
Well, because the prices are cheap and the cards are good for gaming. I'm selling my Nvidia card because it's so overpriced for only average gaming performance; even the mid-range cards are overpriced like hell.
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
October 10, 2011, 02:36:06 PM
So a natural follow-up to this discussion would be:
What sort of hash algorithm could be relatively platform-neutral? (This would honestly be kind of ideal, so that the hardware platform is less of a concern and free-market pricing and competition work more smoothly.)
OR: what sort of hash algorithm could favor Nvidia's architecture?
I suspect it would be hard to get Bitcoin to amend its hash algorithm with something like this, BUT if it were ever to happen it might be good to have an idea of some options...
My understanding is that Nvidia underperforms AMD in all current cryptographic hashing algorithms. Hackers and password crackers all over the world use AMD GPUs almost exclusively. Whitepixel, for example, is an open-source MD5 cracker with vastly superior performance on AMD GPUs (roughly 400% higher throughput when normalized for price). There are a couple of reasons for this:
1) Prior to Fermi, Nvidia GPUs lacked native 32-bit integers internally. This vastly slows down computations on 32-bit numbers, and given that the 32-bit int is an industry standard for CPU architecture, I don't know of any cryptographic hash which doesn't use 32-bit or 64-bit numbers internally.
2) Nvidia GPUs lack certain instructions that allow hashing to be completed in fewer steps. There is no reason Nvidia couldn't add these fast operators in the future, but until now cryptographic performance (and integer performance in general) wasn't an important metric.
3) Nvidia's architecture is built around fewer but more powerful shaders, where AMD's is built around more but simpler shaders. AMD's architecture simply fits better with hashing, where multiple simple operations are performed on an operand.
I don't think AMD designed their GPUs to be good at cryptography; they simply happen to be more efficient at cryptographic functions than Nvidia's GPUs are. My guess is that now that the market has seen what performance can be gained by using GPUs for integer operations, and as GPGPU becomes more common, the market will demand that Nvidia deliver better integer performance and better cryptographic efficiency. Some future Nvidia GPU will likely overcome the shortcomings of the current architecture.
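To make point 2 concrete: the instruction usually cited in this comparison is the 32-bit rotate that SHA-256 (and therefore Bitcoin mining) leans on heavily. AMD hardware of this era can do it in a single BIT_ALIGN_INT instruction, while the Nvidia parts being discussed need a shift/shift/OR sequence. Here is a minimal C sketch of the operation itself, purely as an illustration of what the GPU is asked to do millions of times per hash (the per-vendor instruction counts are from this discussion, not vendor docs):

Code:
#include <stdint.h>
#include <stdio.h>

/* 32-bit right-rotate: one instruction on AMD (BIT_ALIGN_INT),
   a three-op shift/shift/OR sequence on the Nvidia GPUs discussed. */
static inline uint32_t rotr32(uint32_t x, unsigned n)
{
    return (x >> n) | (x << (32 - n));
}

/* One of SHA-256's Sigma functions: three rotates plus two XORs,
   so any per-rotate overhead multiplies quickly. */
static inline uint32_t big_sigma0(uint32_t x)
{
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22);
}

int main(void)
{
    /* 0x6a09e667 is SHA-256's first initial hash value. */
    printf("Sigma0(0x6a09e667) = 0x%08x\n", big_sigma0(0x6a09e667u));
    return 0;
}

Each SHA-256 compression round performs several of these rotates, so a GPU that spends three ops per rotate instead of one gives up a large share of its throughput before the architectural differences even come into play.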
Fiyasko
Legendary
Offline
Activity: 1428
Merit: 1001
Okey Dokey Lokey
October 10, 2011, 04:32:59 PM Last edit: October 10, 2011, 04:52:42 PM by JackRabiit
And btw, I had an ATI even before discovering Bitcoin, so, uh, Nvidia is not better even outside of Bitcoin.

It blatantly is. Nvidia is best for driver support (ATI has the shittiest drivers ever - the 100% CPU bug is STILL present now), gaming performance, and general stability and usage as a normal GPU. Only get ATI if you want to compute. Avoid otherwise.

Is that a fucking troll? Let's see you use OpenCL--- OH WAIT, YOU FUCKING CAN'T. Okay, how about eye-- SHIT, YOU CAN'T DO THAT EITHER. What about 3D gami---- OH, YOUR GAMES NEED TO BE PROGRAMMED FOR IT? Okay... let's stop bashing you. Can your $500 GTX 580 get >200 MH/sec? NOU? IT CAN'T? That's funny, my $120 5830 gets 306 MH/sec. Sooo, let's see here... Nvidia is better how? A shader clock? AKA how Nvidia is rigging the market by putting all new graphics tech on the "shader" clock. Cool. Can you do tessellation? NOU? COOLSTORYBRO.

Nvidia is NOT, IMO, in ANY WAY, SHAPE OR FORM, better than AMD cards. Here's what Nvidia is doing:
SSAO = shader clock
AO = shader clock
Shadows = shader clock
Bump mapping = shader clock
Lighting effects = shader clock
Reflections = shader clock
AA = core clock
Textures = mem clock
Engine = core clock
Physics = core + mem + shader

Their prices are 2x what they should be. And all this bullshit about drivers? Are you FUCKING kidding me? The last time I had a driver issue with ATI was when I went Crossfire, and I was fucking 14. And what did I do to solve it? Uninstall, reinstall. OH, THAT'S SUCH A BUG?!!!! LET'S FLAME ATI! If you lived near me I would fucking smack you for not knowing what you're talking about.

They are programming all modern games to use the shader clock, a clock that NO AMD card has. Think about it: if AMD was ALLOWED to put a shader clock into their cards (WHICH NVIDIA HAS PUT A BLANKET LAWSUIT POLICY OVER), GOODBYE Nvidia. Nvidia has labeled shader clocks "technological property of Nvidia", effectively banning AMD from using a shader clock, and with Nvidia writing the code for all the new games to use the SHADER CLOCK LIKE A FUCKING WHORE, it makes ATI cards seem to perform worse. They will perform terribly on an Nvidia flagship game; no doubt, no denial. (Old) dual 5770 Crossfire = TWENTY fps in NFS Shift, NO MATTER THE GRAPHICS SETTINGS. Maxed and nuked gave the same frame rates. But dual 5770s in Dirt 2 at max graphics (less the crowd, set to low)? The frame rate NEVER dropped below 50! Now you tell me which of those games looks better graphically. It's all about the method of testing performance, and here's a fuckton of methods: http://alienbabeltech.com/main/introducing-the-worlds-fastest-graphics-card-amds-flagship-hd-6990/all/1

NOT TO MENTION THAT THE RADEON 6990 IS THE WORLD'S FASTEST GPU. ONLY NVIDIA FANBOYS WOULD SAY OTHERWISE. And before you babble at me about Nvidia's claims of their new card being faster: that's if you've got TWO of them vs. ONE 6990.
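For what it's worth, taking the poster's own numbers at face value (306 MH/s from a $120 HD 5830 versus under 200 MH/s from a $500 GTX 580), the mining price-efficiency gap works out to roughly six to one. A quick sketch of that arithmetic; the card figures are the post's claims, not measured data:

Code:
#include <stdio.h>

/* Hashes-per-dollar comparison using the figures claimed above
   (the post's numbers, not independent measurements). */
int main(void)
{
    struct { const char *card; double mhs; double usd; } cards[] = {
        { "HD 5830", 306.0, 120.0 },
        { "GTX 580", 200.0, 500.0 },   /* "under 200" taken as 200 */
    };

    for (int i = 0; i < 2; i++)
        printf("%-8s %5.0f MH/s / $%3.0f = %.2f MH/s per dollar\n",
               cards[i].card, cards[i].mhs, cards[i].usd,
               cards[i].mhs / cards[i].usd);
    return 0;
}

That prints about 2.55 MH/s per dollar for the 5830 against about 0.40 for the GTX 580, which is the gap the whole thread keeps circling back to.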
Meatball
October 10, 2011, 06:07:04 PM
I don't get this, is it price? I thought people would be investing in NVIDIA 580GTX's rather than ATI.
What gives?

I smell a troll. I don't see why anyone would even post something like this when even a few minutes of reading on the boards/research would clearly show ATI outperforms NVidia in mining. Seriously people, spend a few minutes looking around before you post.
wndrbr3d
October 10, 2011, 06:17:02 PM
I don't get this, is it price? I thought people would be investing in NVIDIA 580GTX's rather than ATI.
What gives?

I smell a troll. I don't see why anyone would even post something like this when even a few minutes of reading on the boards/research would clearly show ATI outperforms NVidia in mining. Seriously people, spend a few minutes looking around before you post.

+1
pekv2
October 10, 2011, 06:36:47 PM
Shit man, not a fanboy here, but I have only 4 or 5 PC games, and one of my favorites I can't even play in CrossFireX: CoD World at War freezes up on map load; when the map loads, the game freezes. CoD4 did the same. I found a fix for it: you've got to have 16x AA enabled, which is BS; it drops my frame rates down significantly, as if I were using just one 5830, because I've got to have AA at 16x to play. What's the point of CrossFireX if I have to enable this? BTW, this little fix doesn't work for WaW.

I don't really care about the looks of the game; I want the highest FPS I can get, 333 or more, which takes advantage of the game. My two 5830s cannot even stay above 110 in most of the games I've got, or cannot play at all. I picked ATI a couple of years ago because of price/performance; then a 5830 ended up in my hands, which led to mining, which led to having two 5830s. I had an Nvidia 6600 a couple of years back, and that 6600 still performed better on one of the oldest games I've got, CoD1, than even using two 5830s.

It's a choice of preference in what card you want, or the expenses you have to get what you want. I dunno, there is just something about ATI/AMD cards I don't like for gaming; somehow the frame rates get locked. Nvidia can/could release extra frame rates for the games I play. This is just my opinion, and please don't bash me for my opinion. I'll stick with ATI/AMD for Bitcoin mining; once I'm able to get hold of a high-end Nvidia card, I'm going for it, strictly for PC gaming.
Fiyasko
Legendary
Offline
Activity: 1428
Merit: 1001
Okey Dokey Lokey
October 10, 2011, 06:57:52 PM
Shit man, not a fanboy here, but I have only 4 or 5 PC games, and one of my favorites I can't even play in CrossFireX: CoD World at War freezes up on map load...

Just a noob question here: have you gone into the CCC and:
- set everything to "application controlled",
- turned OFF the AMD-optimized surface format for textures and tessellation (vSync users use this),
- turned on OpenGL triple buffering?
Gabi
Legendary
Offline
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
October 10, 2011, 07:39:43 PM
I had an ATI 1900, no problems.
Then I had an ATI 3870, no problems.
Now I have an ATI 6950, and once more no problems at all.
Everything works fine. All games, everything.
saethan
Newbie
Offline
Activity: 42
Merit: 0
October 10, 2011, 08:59:07 PM
I don't really care about the looks of the game; I want the highest FPS I can get, 333 or more, which takes advantage of the game. My two 5830s cannot even stay above 110 in most of the games I've got, or cannot play at all.

So, what monitor are you using where fps that high actually matters? And don't get me started on what the human brain is actually able to process...
Gabi
Legendary
Offline
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
October 10, 2011, 09:02:53 PM
It seems in some games (like Quake 3 and maybe some CoD) if you have like 333 fps you can do more things, jump higher or something like that...
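That matches the commonly cited explanation for Quake 3's magic framerates (125 and 333 fps): the engine keeps frame time in whole milliseconds and snaps player velocity to integers every frame, so at certain framerates the rounding eats part of gravity and you jump higher. A small C sketch of that mechanism as it's usually described - the constants are Q3's well-known defaults, but this is an illustration, not the actual engine code:

Code:
#include <math.h>
#include <stdio.h>

/* Sketch of the commonly described Quake 3 behavior: frame time is
   an integer number of milliseconds, and player velocity is snapped
   to whole units every frame, so the fractional part of the gravity
   step is lost at framerates where it rounds in the player's favor. */
int main(void)
{
    const float gravity  = 800.0f;   /* units/s^2, Q3 default      */
    const float jump_vel = 270.0f;   /* initial upward velocity    */
    const int   fps[]    = { 60, 125, 333 };

    for (int i = 0; i < 3; i++) {
        int   msec = 1000 / fps[i];  /* truncated ms per frame     */
        float dt   = msec / 1000.0f;
        float vel  = jump_vel, pos = 0.0f;

        while (vel > 0.0f) {
            pos += vel * dt;         /* integrate position         */
            vel -= gravity * dt;     /* apply gravity...           */
            vel  = roundf(vel);      /* ...then snap to an integer */
        }
        printf("%3d fps (%2d ms frames): peak jump %.1f units\n",
               fps[i], msec, pos);
    }
    return 0;
}

Run it and the peak jump height climbs with the framerate (roughly 47, 50, and 55 units here), which is exactly the "jump higher at 125/333 fps" effect being described.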
DeathAndTaxes
Donator
Legendary
Offline
Activity: 1218
Merit: 1079
Gerald Davis
October 10, 2011, 09:05:15 PM
Imaginary land.
Funny bashing a card for "only" getting 110 fps when virtually all LCDs only display 60 frames per second anyway. Very few are 72 or 75 Hz, and even then 100 fps is more than enough to drive them to their limit. Remember, an LCD is a physical device; it takes some time to physically pivot the liquid crystal (via electrical impulse) and alter how much light is emitted. It doesn't matter how many fps a video card can create in memory; it still takes time for the actual crystal to align.
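The arithmetic behind that point, as a quick sketch (the 60 Hz figure is from the post above; the fps values are the ones being argued about in this thread):

Code:
#include <stdio.h>

/* At a fixed monitor refresh rate, any frames rendered beyond it
   never reach the screen. Figures are the ones from this thread. */
int main(void)
{
    const double refresh_hz = 60.0;            /* typical LCD */
    const double fps[] = { 110.0, 333.0 };

    for (int i = 0; i < 2; i++) {
        double shown  = fps[i] < refresh_hz ? fps[i] : refresh_hz;
        double wasted = fps[i] - shown;
        printf("%3.0f fps rendered -> %2.0f shown, %3.0f discarded "
               "(%.0f%% of the work never visible)\n",
               fps[i], shown, wasted, 100.0 * wasted / fps[i]);
    }
    return 0;
}

At 110 fps, 45% of the rendered frames never appear on a 60 Hz panel; at 333 fps, 82% never do. Any benefit of the higher number has to come from somewhere other than the screen, which is where the physics-glitch argument below comes in.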
Gabi
Legendary
Offline
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
October 10, 2011, 09:15:44 PM
Yes, but it's not about the monitor; it's about game physics glitches and fps.
saethan
Newbie
Offline
Activity: 42
Merit: 0
October 10, 2011, 09:19:02 PM
If for some reason a game behaves differently between 110 fps and 333 fps, there is some sort of crazy programming going on in the background, unless the difference is being caused by the CPU and not the GPU, or it's PhysX compatibility problems on the ATI/AMD cards.
[edit] One last thing: did Crossfire even -exist- when Q3 was released? That's also the same graphics engine used in CoD1. The reason I ask is that the person I originally replied to was comparing an Nvidia 6600 to Crossfire 5830s in CoD1.
Though a single 5830 should still smoke an Nvidia 6600, unless the Q3 engine was designed with Nvidia specs in mind (which they did for Quake 2 and 3dfx - my Voodoo 3 outperformed much faster GeForces in Quake 2 for years - so I wouldn't be surprised if they did the same for Quake 3 and Nvidia).
pekv2
October 10, 2011, 10:26:58 PM Last edit: October 10, 2011, 10:47:11 PM by pekv2
I don't really care about the looks of the game; I want the highest FPS I can get, 333 or more...
So, what monitor are you using where fps that high actually matters? And don't get me started on what the human brain is actually able to process...

I'm using a 17" monitor. This is not about what I can see; I clearly said in my post it's about taking advantage of the games I play.

It seems in some games (like Quake 3 and maybe some CoD) if you have like 333 fps you can do more things, jump higher or something like that...

Gabi has it correct: with 125 or 333 FPS you can jump higher, get onto ledges and boxes, take advantage of the games. This isn't console; this is PC gaming at its best.

Imaginary land. Funny bashing a card for "only" getting 110 fps when virtually all LCDs only display 60 frames per second anyway...

Well, that will be your little secret.

Yes, but it's not about the monitor; it's about game physics glitches and fps.

Gabi is correct once again.

If for some reason a game behaves differently between 110 fps and 333 fps, there is some sort of crazy programming going on in the background...

The game might be designed like that, but I've always felt ATI/AMD locked something down to keep high FPS from burning up the card. That was also speculated about ATI/AMD cards with the FurMark benchmark program. But all in all, these little fps tricks work for CoD1 & United Offensive, CoD2, CoD4, CoD WaW, BO, and most likely whatever other games use the same engine.

Edit: On another note: a GTX 480/490 or GTX 580/580 x2/590 would throw CoD games over 1000 FPS, and if PB is enabled you will get kicked for it; if PB is disabled there's no problem. An AMD 6990 would probably get 250 fps in CoD games.
Fiyasko
Legendary
Offline
Activity: 1428
Merit: 1001
Okey Dokey Lokey
October 10, 2011, 11:06:51 PM
Who said anything about having vsync actually enabled? Anyone with 61+ fps usually has vsync off.
Fuck, lol, I'm hitting 380 fps average in CS:S maxed out.
pekv2
October 10, 2011, 11:25:35 PM
Who said anything about having vsync actually enabled? Anyone with 61+ fps usually has vsync off.
Fuck, lol, I'm hitting 380 fps average in CS:S maxed out.

For all the time I've heard about Counter-Strike, I've actually never played it. Is it really good?
saethan
Newbie
Offline
Activity: 42
Merit: 0
October 10, 2011, 11:57:56 PM
http://en.wikipedia.org/wiki/IW_engine
Not exactly popular outside the CoD line. I'm going to bet that taking advantage of something weird in particular game engines that makes a real difference between 110 and 333 fps (I take this comparison from the 110 and 333 mentioned earlier) is a very small niche in the overall gaming graphics market. But hey, if you'll throw the cash at it, more power to ya.

Jack: vsync or not, you're only seeing however many of those frames your monitor is capable of displaying. If 380 fps means you can do funky tricks people with 90 fps can't, then there's something screwy with the engine IMHO (or it's just old enough that, when designed, they didn't even bother considering multiple hundreds of fps), but when it comes to what you're actually seeing, you're only seeing what vsync would show you.