Bitcoin Forum

Other => Beginners & Help => Topic started by: Fingler29 on December 04, 2011, 08:36:56 AM



Title: Noob question about hardware: Why so ATI dominant?
Post by: Fingler29 on December 04, 2011, 08:36:56 AM
I am not a programmer.


but I do know that such a massive, 100% consistent discrepancy in performance between ATI and Nvidia hardware with the existing mining software is not just:


"DURRR ATI IS BETTAR BUCUZ THEY HAZ BETTER DESIGN!"


that's bullshit.


what is it?  was the software developed FOR ATI hardware?  were the developers fed up with Nvidia's PhysX monopolistic microsoft-esque bullshit?  is Nvidia's SDK clunky and hard to work with?

what?

there must be a REAL REASON.  Because I can tell you right now that even the best programmers and computer scientists cannot simply equate "our software runs better on this hardware" with "this hardware design is better than any other."


especially when we are talking about what amounts to the absolute simplest mathematics on the planet.


it would take a "fresh," very intelligent Ph.D. in EE/solid state physics to even begin to expound upon that subject, and probably more than one.


the discrepancies I have seen between ATI and Nvidia with mining look like the discrepancies I have seen between PhysX-based benchmarks in the past.  I think that is the real situation.  I hope my implication there makes sense to everyone.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: worldinacoin on December 04, 2011, 09:06:29 AM
It doesn't really need a PhD.

https://en.bitcoin.it/wiki/Mining_hardware_comparison


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: P4man on December 04, 2011, 09:55:03 AM
Sounds like we have another nvidia fanboy who thinks all bitcoin miners are amd fanboys.

It's really simple: AMD cards have a different architecture that happens to be massively better for bitcoin mining; they have many more, but much simpler, shaders than nvidia cards (which have fewer, more complex ones). For games these different approaches tend to be competitive; for floating point math nVidia is typically far ahead, but for simple integer math like bitcoin mining, AMD is the obvious choice.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Fiyasko on December 04, 2011, 09:57:00 AM
I smell someone who loves nvidia cards but can't understand why they Fucking Suck at mining.
Here's the thing:

The MAJORITY (and I seriously dare anyone to challenge me on this) of newer graphics that look "pretty" (and by newer I mean the higher-end dx9 stuff and all the dx11) are done using shaders.

AMD cards do not have a "shader clock" and flat-out don't need one, because they pack a lot more stream processing cores to do the "shadey work" instead.

Whereas nvidia knows they do not NEED nearly as many stream processor cores when VIDEO GAMES are mostly programmed around shader-style graphics.

AMD cards have more power. They do. They are stronger; they can fit a LOT more raw compute in that "spot" where the "shader clock" doesn't exist.
Nvidia cards have more programming. They do. Sorry, I'm an AMD fanboy FOR LIFE, and I'm sorry to say I feel Nvidia has totally rigged the market with all these "new styles of graphics rendering!" (ever since dx10.1, nearly all new graphical "polish" tech is "shadey").

Bitcoins need no graphics. They are nothing but math, and on the computer that's faster and has more cores you get more work done, which results in more bitcoins mined.

Here's a perfect example:
5830
321 Mhash/sec

1000 MHz core clock

1120 stream processing cores

Cost? About $139

While the BEST nvidia card (correct me if I'm wrong), the GTX 590:

193.1 Mhash/sec

1215 MHz "clock"

512x2 stream processing cores

Costs about $749.99

Now clearly, the 5830 is faster than the best of Nvidia's cards AT CRUNCHING ONE TYPE OF NUMBER, and that one type of number is what you need to mine bitcoins.
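
A quick back-of-envelope in C, just a throwaway sketch using the chart numbers and street prices above (prices obviously drift week to week):

Code:
#include <stdio.h>

int main(void)
{
    /* Mhash/s figures and street prices quoted above */
    printf("5830:    %.2f Mhash/s per dollar\n", 321.0 / 139.0);   /* ~2.31 */
    printf("GTX 590: %.2f Mhash/s per dollar\n", 193.1 / 749.99);  /* ~0.26 */
    return 0;
}

Per dollar, that makes the 5830 roughly nine times the miner the GTX 590 is.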

Could someone chime in and remind me what the name of the calculation is?
WILD GUESSES AS TO THE NAME:
Floating point...
Integer....
I don't know... what's the damn name for it?

Oh, just another note: those stats were taken off the mining hardware comparison chart.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Gabi on December 04, 2011, 10:38:20 AM
Nvidia fanboy spotted  :D



Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Fingler29 on December 04, 2011, 12:37:27 PM
thus far the answer has been "because the charts say so"
remember:  all math, all simulations are the same.  computational fluid dynamics looks the same as quantum mechanics at the most basic level of the code.


integrals, derivatives, exponentials.... they are all just represented as sums and differences of polynomials.

the math of bitcoin mining is identical between both platforms (crucially, however, the way in which the math is "sent" to and "read from" the GPU may not be the same, amongst a host of other potential differences, but I digress here).

a satisfactory answer would be logically reasoned and supported with proof from spec/tech sheets from reliable sources (e.g. corroborated by anandtech, THG and/or the manufacturers themselves, to ensure that the "claims" are genuine and not just fluff).  Something like:

1) the AMD boards have more cores
2) those cores are individually clocked higher than the Nvidia boards' (and perhaps they can be threaded)

3) thus, because the math is so simple, each AMD core can compute more calculations per second AND there are more of those cores, so those numbers literally multiply into much higher performance.
4) memory and bus architecture are irrelevant because of the relatively small numbers involved, and the fact that the total data involved in a complete calculation is minuscule in comparison to the memory available.


THAT would be a sound justification of why the HARDWARE is the true source of the discrepancy, and not the software implementation.



is that the case?  I don't know.  I just made those up as possible explanations that did NOT include the software's particular development as a POSSIBILITY for the discrepancy.


despite what people say, hardware is often less of a tie-breaker than you might think.

that IS why PhysX and other SOFTWARE-BASED market norms have given nvidia the edge (and probably Intel for that matter, and probably ARM in mobile).  It is also a lot easier to FORCE software down the throats of an industry than it is to force a particular hardware architecture (I am referring to Nvidia here, not AMD; that is, to Nvidia forcing their software/SDK down the throats of game developers, not to the debate above about bitcoin mining software).

my point here is simply that the fundamental difference at the heart of this discrepancy is more likely to be software-based than hardware-based, simply because there are more ways in which the software itself can cause differences in performance.

for fuck's sake, most software developers don't have a fucking clue how the hardware really works anyway.  It's a lot harder for a software developer to maximize their hardware by themselves through tinkering and testing than it is for the hardware developer to simply give out tools that make it easy for developers to max the hardware, which gives whichever company offers the better deal the edge.


the adage "software lags behind hardware" comes from the fact that it's actually a lot fucking harder to write sophisticated software than it is to decrease the gate size on a silicon wafer (up to a point, of course; a point that we have obviously reached).  A graduate student working by himself can pull off the latter with minimal support from faculty, whereas it usually takes teams of seasoned veteran programmers to churn out high-quality software.

in terms of "fanboyism":

1) I don't play video games.  I would rather not expound upon my opinions of people who attempt to justify spending lots of money for the ability to play video games at higher frame rates and to see the grass on the simulated ground appear more realistic.

2) I have a 6-year-old HP laptop with a single-core intel core duo (not even a core 2 duo) with 2 GB of memory and onboard video.  I use 2 of my 4 USB ports to juggle between external hard drive enclosures to utilize a stack of equally old 3.5" internal drives ranging between 250 and 500 GB.  I'm not a gamer.  I also have a 1st-generation (literally the first generation of the first generation) Xbox that I won from Taco Bell.  It has an Xecuter 3 mod chip and I use it as a media center with a 1st-generation 1080p Samsung DLP (as in: the first 1080p DLP they ever sold), with audio piped through a 13-year-old Onkyo receiver.  Point is: I don't pay attention to what is new.  I don't care either.  Everything I have is sufficient for my needs.  When 4K TVs and boxes start coming out, I will upgrade.

I don't give a flying fuck about fanboyism.  Except for my car.  Fuck yeah Nissan.  Fuck all y'all Euro, 'merikan, Australian, and other alternative-to-JDM shit.  Nissan reigns supreme.  fuck rotaries (mazda), fuck yamaha-designed engine components (toyota), fuck lol-crank-walk (Mitsubishi), and fuck diesel-engine-sounding broke-transmission bulbous monstrosities (Subaru).

haha, I'm joking.  but maybe I'm not.

no, I really am.  anyone who likes cars is a friend, even if you like to drive around with a live rear axle or pushrods.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: worldinacoin on December 04, 2011, 12:45:10 PM
Quote
thus far the answer has been "because the charts say so" [...]

I have used both ATI and Nvidia, and the charts corroborate what I have seen.  So has the rest of us.  If you don't care, goodbye.  You don't even need to bother asking, since you can't accept anything other than what you are already thinking.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Fingler29 on December 04, 2011, 12:58:24 PM
Quote
I have used both ATI and Nvidia, and the charts corroborate what I have seen.  So has the rest of us.  If you don't care, goodbye.  You don't even need to bother asking, since you can't accept anything other than what you are already thinking.


I didn't ask for measurements.  I have google as well.  I can type in "which GPUs are best for BTC mining".


I asked a question that clearly doesn't quite fit the noob section, but I have no alternative place to post it.



Perhaps the entire website is not the place to post such a question.



I am asking for an answer that is based on genuine information, not speculation.  I highly doubt there are very many electrical engineers on this board who can explain why the simple integer math associated with bitcoin mining runs more efficiently (in terms of time, aka hashes/sec) on AMD boards than it does on Nvidia boards.


realistically I will almost certainly never get an answer here.



the answer that "simple math is done quicker because there are more cores, which were designed to compensate for the lack of complexity in those cores when dealing with the more complex math associated with graphics" is nonsensical on a number of levels.


all math is simple at the base level.  The biggest determining factors are how much memory there is, how quickly the memory communicates with the processor, and how long the job is.


a short job and a short overall calculation time pretty much eliminate most of the differences in hardware.




Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: P4man on December 04, 2011, 01:22:17 PM
What on earth makes you think all math is the same or that it is memory bound?
For a start, there is quite a fundamental difference between floating point and integer math. The simple truth is that AMD cards have 2-3x more ALUs than nvidia cards and can therefore process 2-3x as many 32-bit integer calculations per clock. Another simple truth is that SHA hashing (like many other similar cryptographic functions) is greatly sped up by rotate-right register operations. AMD cards have a dedicated 1-cycle hardware instruction for this; nvidia does not, and needs 3 clock cycles instead.

Memory bandwidth and capacity are absolutely a non-issue for hashing. Miners clock the memory as low as they can to save power and even to speed up mining (!), and bitcoin mining takes like a few megabytes of vram. Anything more is wasted for mining.
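
To see why that one instruction matters so much, here is a minimal C sketch (mine, not lifted from any actual miner) of the rotate-heavy core of SHA-256; these functions run in every one of the 64 rounds of every single hash:

Code:
#include <stdint.h>

/* Rotate-right of a 32-bit word. With no hardware rotate this is
   two shifts plus an OR (the 3-cycle sequence on nvidia); AMD's
   BIT_ALIGN_INT does the whole thing in one cycle. */
#define ROTR(x, n) (((x) >> (n)) | ((x) << (32 - (n))))

/* The "big sigma" functions of SHA-256: three rotates apiece,
   evaluated once per round, 64 rounds per hash. */
static inline uint32_t Sigma0(uint32_t x) { return ROTR(x, 2) ^ ROTR(x, 13) ^ ROTR(x, 22); }
static inline uint32_t Sigma1(uint32_t x) { return ROTR(x, 6) ^ ROTR(x, 11) ^ ROTR(x, 25); }

A 3x cost difference on an operation this frequent feeds straight into the hash rate.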

Please get a clue.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Gabi on December 04, 2011, 02:27:54 PM
That's why we have a newbie section: to keep trolls from going trolling elsewhere, by forcing them to troll here.

And, in fact, we have a troll here.

Dear troll, here is a protip: the computation behind bitcoin mining is NOT a secret, and is indeed very simple: sha-256 hashing. It runs faster on ATI because, uh, guess what? ATI can compute these things FASTER than nvidia, due to its hardware.

Yes, that's all. Nvidia just sucks for mining.

You speak about software implementation. As I said, the software is very simple; go and try to make it run faster on Nvidia if you want. Hire someone, hire everyone. Good luck, and call me when you have code faster than a same-priced ATI.


As for physx... oh LOL. Do you realize that it's NVIDIA CODE? And that it's closed source? Guess what, it runs well on nvidia? More like it runs ONLY on nvidia. Lololol

Quote
all math is simple at the base level.  The biggest determining factors are how much memory there is, how quickly the memory communicates with the processor, and how long the job is.
Protip: if card 1 does the same computation in 1 cycle while card 2 takes 10 cycles, card 1 will be 10 times faster.

Now go troll elsewhere.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: worldinacoin on December 04, 2011, 03:46:57 PM
If the OP can't accept that ATI is superior, go on and get your Nvidia to mine bitcoins and good luck!


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Gabi on December 04, 2011, 03:49:49 PM
It seems he thinks the mining software is biased to run better on ATI and to be purposely inefficient on nvidia; that they purposely made it slow on nvidia.

Too bad his trolling is an epic fail: bitcoin mining is not a secret and it's very simple. He can try to make his own mining software, experiment with it, and discover WHY nvidia sucks at mining.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: 714 on December 04, 2011, 03:58:11 PM
https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU

A decent comparison of ATI vs. Nvidia. The Nvidia hardware does better on some tasks; Bitcoin is not one of them. The instruction set of the R5xxx and higher ATI chipsets makes Bitcoin very efficient.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: worldinacoin on December 04, 2011, 04:02:44 PM
If you look at his other post about using organizational resources to do bitcoin mining, one would suspect that his organization uses Nvidia so much that he has made himself believe that Nvidia is better.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Gabi on December 04, 2011, 04:06:57 PM
http://donald-harris.com/wp-content/uploads/2010/10/picard-facepalm2.jpg


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: dark_st3alth on December 04, 2011, 08:08:24 PM
I'll give a real answer instead of what some 10-year-olds posted above. /\


I'm an Nvidia lover as well, don't get that wrong. It seems the reasons are:

1. ATI/AMD cards have more "stream" processors, which can be thought of as the equivalent of CUDA cores.
It would seem that miners are not using the CUDA cores as effectively, but that's another topic.

2. AMD/ATI cards have a single bit-rotate instruction (BIT_ALIGN_INT) that speeds up the SHA-256 calculations used for mining. nVidia cards take 2 or 3 instructions to do the same (as of now).
Really, this could change in the near future.

3. AMD/ATI cards are cheaper. It's less costly to set up a miner than with nVidia cards.
I posted on here that ATI/AMD cards probably have lower build quality. Just google "ATI loud fan".


As the little kids are arguing about PhysX, I'll explain that as well.

PhysX is used for PHYSICS calculations. Here's a nice block of text for all of you to read:


Quote
Before PhysX, game designers had to “precompute” how an object would behave in reaction to an event. For example, they would draw a sequence of frames showing how a football player falls on the ground after a tackle. The disadvantage of this approach was that the gamer always saw the same “canned” animation. With PhysX, games can now accurately compute the physical behavior of bodies real time! This means that the football player will now bend and twist in all different ways depending on the specific conditions associated with the tackle – thus creating a unique visual experience every time.

PhysX technology is widely adopted by over 150 games and is used by more than 10,000 developers. With hardware-accelerated physics, the world’s leading game designers’ worlds come to life: walls can be realistically torn down, trees bend and break in the wind, and water and smoke flows and interacts with body and force, instead of just getting cut-off by neighboring objects.


And a little more:

Quote
PhysX is designed specifically for hardware acceleration by powerful processors with hundreds of processing cores. Because of this design choice, NVIDIA GeForce GPUs provide a dramatic increase in physics processing power, and take gaming to a new level delivering rich, immersive physical gaming environments with features such as:

    Explosions that create dust and collateral debris
    Characters with complex, jointed geometries, for more life-like motion and interaction
    Spectacular new weapons with incredible effects
    Cloth that drapes and tears naturally
    Dense smoke & fog that billow around objects in motion

There, that wasn't so hard, was it, guys and girls? They just wanted a simple answer.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Blind on December 04, 2011, 09:12:27 PM
I, for one, am very happy with this situation. AMD needs every penny it can get, so it can pump it into R&D, so it can compete with Intel & nVidia, so there is healthy competition and not a monopoly, so we as customers won't get ass-raped too much. Go ATI!


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: 714 on December 04, 2011, 09:34:06 PM
Stick a fork in it, it's done.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Gabi on December 04, 2011, 09:43:44 PM
Quote
3. AMD/ATI cards are cheaper. It's less costly to set up a miner than with nVidia cards.
I posted on here that ATI/AMD cards probably have lower build quality. Just google "ATI loud fan".
Should I link exploding nvidia cards?  ::)


Quote
PhysX is used for PHYSICS calculations. Here's a nice block of text for all of you to read:
Wow, so much yadda yadda

Quote
Before PhysX, game designers had to “precompute” how an object would behave in reaction to an event.
Bullshit.

Quote
PhysX is designed specifically for hardware acceleration by powerful processors with hundreds of processing cores.
Let me rewrite that: PhysX is designed specifically to run on nvidia cards.
I wonder why; maybe because, uh, it's nvidia proprietary code?

And lol at games using it. Yeah, to add like what, 2 particles?


There are other physics engines that run on all systems instead of nvidia only.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: dark_st3alth on December 04, 2011, 10:13:36 PM
Quote
3. AMD/ATI cards are cheaper. It's less costly to set up a miner than with nVidia cards.
I posted on here that ATI/AMD cards probably have lower build quality. Just google "ATI loud fan".
Should I link exploding nvidia cards?  ::)

I think you're talking about defective cards. If that's the case, it's been identified and fixed. A brand-new ATI card has a VERY loud fan problem; I just can't seem to find the video I watched a month ago.


Quote
Quote
PhysX is used for PHYSICS calculations. Here's a nice block of text for all of you to read:
Wow, so much yadda yadda
I'm giving a straight answer that's correct. Where's your answer, huh?

Quote
Before PhysX, game designers had to “precompute” how an object would behave in reaction to an event.
Bullshit.

I don't think you're knowledgeable in that area, sir/madam. It is quite well known; just do a search before you raise the alarms...

On top of that, I don't think nVidia would lie like that. Think about it next time before you say things. :)

Quote
PhysX is designed specifically for hardware acceleration by powerful processors with hundreds of processing cores.
Let me rewrite that: PhysX is designed specifically to run on nvidia cards.
I wonder why; maybe because, uh, it's nvidia proprietary code?

And lol at games using it. Yeah, to add like what, 2 particles?

No, it does make a difference. If it wasn't on the card, you would have to do the calculations on the CPU, and I hope you're an expert on such things, as it seems like you're saying you are.

Take a look at games that use PhysX. Fluid dynamics is one area where you need the GPU to do the calculations.

Quote
There are other physics engines that run on all systems instead of nvidia only.

Ah, you're talking about Havok and such. Those are all CPU-based, but lack a lot of "realism" in what they can do. Source games are a wonderful example. :)


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Fingler29 on December 04, 2011, 11:30:36 PM
I realize that I am asking a question here that is probably at the core of a lot of flame wars.  I promise I did not post it with that foreknowledge, although it's pretty obvious given that "fanboyism" is about as rampant as it gets with GPUs (probably only comparable to Ford vs Holden in Australia, where fist fights and stabbings are an occasional result).

I'm not a troll (lol, how many times has that been said).  I do not disbelieve, discount, or ignore the statistics.  They are there in black and white.  It's fact.  AMD cards are almost universally better (I say almost because there might have been maybe one single expensive nvidia card that was better than a super cheap AMD card, but I didn't pay too close attention).

I knew about that before I posted this thread, which is why I repeatedly referred to the "performance discrepancy."  that is why the posts pointing me to them are kind of irrelevant.  I know.  those websites with reviews/benchmarks were the REASON I posted this thread.

shit.  I'm just asking WHY?

why?  why is it that 2 types of hardware, designed for essentially identical tasks, namely CONSUMER-ORIENTED graphics (like video games and movies and shit), produce such wildly different results?

AMD wants to make their shit work well with games.  So does Nvidia.  Everyone learns physics and electrical engineering from similar/same textbooks, and in many cases from the same PI, or from a PI who worked with his competitor's PI because those PIs worked with the same PI (I can't remember the name of the statistic that traces the PhD "tree" back up to famous people like Einstein, Laplace, de Broglie, Boltzmann, Bragg, Feynman, etc., but that is what I am talking about here).

even taking into account the hurdles of patents and intellectual property, how is it that 2 products, aimed at the same market and designed for IDENTICAL tasks, produce such ABSOLUTELY WILDLY different results?  that is my question.

the answers I have been given mostly sound like: "because AMD is better, duh."
dark_st3alth actually answered my question in a clear and cogent way, and I do appreciate that.  Thank you.

he also alluded to what I was implying with PhysX:

Quote
It would seem that miners are not using the CUDA cores as effectively, but that's another topic.

my point was "targeted" development.  I didn't mean to imply that the BTC mining programmers were like "fuck you nvidia, we're only writing for AMD, so all you NVidia fanboys can suck a dick."  I meant more along the lines of: "some guys working in their spare time, who only had access to AMD gear and AMD SDKs, developed with the tools they had, and the result is that the software works best with AMD."

maybe they tried to get Nvidia hardware donated and Nvidia said no!  who knows!  But I think it much more likely that the reason the software works best on AMD is that it was designed with a focus on AMD hardware.
not maliciously, not angrily, not with some ill intent.  Just because that was the only option.

I would love to hear someone who actually knows about the development weigh in so that I can have that question answered.  It's just a question.  It's posted in the noob section for god's sake.  I didn't pronounce it like a fact of god spoken from on high.  I am ruminating.  I am tossing around ideas.

I thought that was essentially the friggin' purpose of a "forum": a place to discuss things.  that is what I am trying to do.

instead I'm a troll, I'm a fanboy, I'm just here to start flame wars.
apparently that must be the case.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Fiyasko on December 05, 2011, 12:17:27 AM
Lock this fucking thread


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: 714 on December 05, 2011, 03:59:04 AM
Quote
Lock this fucking thread

I second that motion; this is a subject that is well covered elsewhere.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: worldinacoin on December 05, 2011, 04:31:23 AM
Quote
Lock this fucking thread

I second that motion; this is a subject that is well covered elsewhere.


+1


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Gabi on December 05, 2011, 02:36:10 PM
1) Havok and such? CPU-based? No.

http://bulletphysics.org/wordpress/

A physics engine that runs on the GPU via OpenCL.

2)
Quote
If it wasn't on the card, you would have to do the calculations on the CPU
Of course if I don't use the card I use the CPU. The problem is, physx is made to run well on nvidia cards. But it's not the only engine around.

3) Please explain to me where I am wrong: nvidia did not invent the physics engine in gaming. "Precompute"? Lolwut, it was possible to do the same things physx does before physx was created.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Fiyasko on December 05, 2011, 04:24:04 PM
Quote
1) Havok and such? CPU-based? No. [...]

THANK YOU GABI.

Now, I think that Gabi's post should be the last. (or this one, whatever the fuck)

I think the flame fest is at an even keel right now, and we need to stop it. Stop the fire, dammit.

Gabi just lit up and burned down an entire "controlled burn zone". Don't fucking walk in here and toss a jerry can o' gas at him.

LOCK THIS THREAD, ALL THE INFO THE OP'S QUESTION ASKED FOR HAS BEEN GIVEN


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: dark_st3alth on December 06, 2011, 03:22:37 AM
I'll add a little more gasoline if I may, to celebrate my 5th hour. :)

Quote
1) Havok and such? CPU-based? No.

http://bulletphysics.org/wordpress/

A physics engine that runs on the GPU via OpenCL.

The reason is that they are supposedly partnered with ATI/AMD.

You will never see this used commercially in popular titles (and I'm surprised a project like this even exists).

PhysX is used in almost every major game, and those that don't use it use Havok (which is CPU-based).


Quote
2)
Quote
If it wasn't on the card, you would have to do the calculations on the CPU
Of course if I don't use the card I use the CPU. The problem is, physx is made to run well on nvidia cards. But it's not the only engine around.

As answered above, PhysX is:

A) The best
B) The most popular
C) Commercially backed, by majors like Valve.
D) GPU-based, using a PPU.

Quote
3) Please explain to me where I am wrong: nvidia did not invent the physics engine in gaming. "Precompute"? Lolwut, it was possible to do the same things physx does before physx was created.

Of course they didn't. Who said that, where?

Idk if you are at all into game mechanics or development (it seems not), but games had to precompute physics (process it long beforehand, mainly stored in a file or in memory saying what should happen) and were thus quite limited. As far as I remember, you would not have been able to drive vehicles like you can now.


Anyways, it would seem I am free. :D


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Fiyasko on December 06, 2011, 03:43:47 AM
Quote
I'll add a little more gasoline if I may, to celebrate my 5th hour. :) [...]

I am going to fucking kill you if you keep thinking that GPUs couldn't "practically" and/or "publicly" and/or "commercially" and/or "residentially" ETC do non-precomputed physics until Nvidia PhysX was created.

What you're saying is that my old pre-PhysX-era Radeon GPU CANNOT (on its GPU) make a 3D ball fall down and randomly (fuck off, I know there is no TRUE random) bounce.

You say that if:
on a pre-PhysX-era ATI Radeon GPU
perfect ball A drops, falls, and smacks a completely flawed polygon (such as a gravel path), it cannot, CANNOT do anything unpredictable, because by computing law that is impossible, since all "possible" "bounceable" angles are all accounted for..... unless you use your CPU or Nvidia's PhysX tech stylings.

That is just flat-out stupid.

TL;DR
Link or lies.
Lock this fucking thread.
 


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: P4man on December 06, 2011, 08:48:31 AM
PhysX is an API, just like Bullet, Havok, etc. It can run on either the CPU or the GPU; it's up to the developer. It's also an open API: if AMD wanted, they could implement it, but they chose not to, as nVidia owns the IP and, somewhat understandably, AMD doesn't want to support that. But there are hacks around to enable PhysX on ATI cards.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Gabi on December 06, 2011, 01:12:49 PM
Here is a nice explanation about "physx on cpu"

Quote
http://techreport.com/discussions.x/19216
PhysX hobbled on the CPU by x87 code
Quote
x87 has been deprecated for many years now, with Intel and AMD recommending the much faster SSE instructions for the last 5 years. On modern CPUs, code using SSE instructions can easily run 1.5-2X faster than similar code using x87.  By using x87, PhysX diminishes the performance of CPUs, calling into question the real benefits of PhysX on a GPU.

Quote
http://www.rage3d.com/board/showthread.php?t=33965625
PhysX - Intentionally Slow on CPUs? RealWorldTech Investigates.
Quote
PhysX uses x87 because Ageia and now Nvidia want it that way. Nvidia already has PhysX running on consoles using the AltiVec extensions for PPC, which are very similar to SSE. It would probably take about a day or two to get PhysX to emit modern packed SSE2 code, and several weeks for compatibility testing. In fact for backwards compatibility, PhysX could select at install time whether to use an SSE2 version or an x87 version – just in case the elusive gamer with a Pentium Overdrive decides to try it.

But both Ageia and Nvidia use PhysX to highlight the advantages of their hardware over the CPU for physics calculations. In Nvidia’s case, they are also using PhysX to differentiate with AMD’s GPUs. The sole purpose of PhysX is a competitive differentiator to make Nvidia’s hardware look good and sell more GPUs.
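
To see what the articles mean in practice, here is a trivial, made-up C loop of the kind a physics engine runs millions of times per frame; with gcc you can compile the very same source both ways and benchmark it yourself (the file name is a placeholder, and this is not PhysX code):

Code:
/* Build the same file two ways and compare (32-bit x86):
 *   gcc -O2 -m32 -mfpmath=387 step.c          -> legacy x87 FPU code
 *   gcc -O2 -m32 -msse2 -mfpmath=sse step.c   -> SSE scalar code
 * The SSE build is typically much faster, which is the articles'
 * whole point: PhysX's CPU path was left on x87. */
#include <stddef.h>

void integrate(float *pos, const float *vel, float dt, size_t n)
{
    for (size_t i = 0; i < n; i++)
        pos[i] += vel[i] * dt;   /* one multiply-add per body, per frame */
}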


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: Fiyasko on December 06, 2011, 06:21:13 PM
Quote
Here is a nice explanation about "physx on cpu" [...]

You, sir, are all my better arguments, cleaned up and slammed down.
Flawless explanation, and from my own experience it's completely true.

Now then,
lock this thread.
If you don't know how: go to your first post and hit Edit, then near the bottom left there will be a small "Lock topic" button.


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: unabridged on December 07, 2011, 07:32:12 PM
I think the problem is that you are assuming amd GPUs and nvidia GPUs are as similar as amd CPUs and intel CPUs. The CPUs have nearly identical instruction sets and are meant to be somewhat interchangeable, while the GPUs each have their own entirely separate instruction set.

AMD = more cores, but each core is slower than an nvidia core
nvidia = fewer cores, but each core is faster than an AMD core

hashing is not really an intensive operation, so the speed of each core doesn't make much difference; what matters is being able to do many hashes at once, so the number of cores is what increases the hash rate.

An FPGA or ASIC takes this to the extreme: many, many very dumb cores (each just fast enough to do the hash).
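
In other words (toy numbers, invented purely to show the shape of the math, not any real card's specs):

Code:
#include <stdio.h>

int main(void)
{
    /* Hypothetical per-core speeds: the "AMD-style" core is slower,
       but there are three times as many of them. */
    double amd_cores = 1536, amd_mhash_per_core = 0.20;
    double nv_cores  =  512, nv_mhash_per_core  = 0.35;

    printf("many slow cores: %.0f Mhash/s\n", amd_cores * amd_mhash_per_core); /* 307 */
    printf("few fast cores:  %.0f Mhash/s\n", nv_cores  * nv_mhash_per_core);  /* 179 */
    return 0;
}

Core count wins as long as each core is "fast enough," which is exactly the design point an FPGA or ASIC pushes to the limit.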


Title: Re: Noob question about hardware: Why so ATI dominant?
Post by: icvader on December 17, 2011, 06:00:43 AM
I'm gonna get murdered for this post :(.

I'm not sure if anyone has truly answered your question. I am a programmer; I don't know if that helps... To the point: ATI/AMD has a single operation that nvidia does not have. It is called RightRotate, aka BIT_ALIGN_INT. It is the fundamental basis of optimized bitcoin hashing. As it stands, with the most brilliant minds working to make the nvidia version as optimized as possible, it takes 3 operations: 2 shifts + 1 add :(. There are some people working hard to petition for an updated set of cuda instructions that includes RightRotate and LeftRotate. Just getting it down to one operation would be an amazing improvement. The other issue, as you obviously read, is that amd simply has massively more stream processors than nvidia, and the stream processors are what process the SHA-256 operations, be it on nvidia or amd.

The rotate right (circular right shift) operation ROTR_n(x), where x is a w-bit word and n is an integer with 0 ≤ n < w, is defined by

ROTR_n(x) = (x >> n) ∨ (x << (w - n))

Thus, ROTR_n(x) is equivalent to a circular shift (rotation) of x by n positions to the right.

This operation is used by the SHA-256, SHA-384, and SHA-512 algorithms.

I don't want to overcomplicate things, but I love teaching, so here it goes. This is super simplified, btw, and I am sure someone will wince and bash me over the head for it. I apologize in advance...

The SHA-256 hash is an operation that creates an effectively random number, in hexadecimal of course, based off of a value; in the case of bitcoin, the current block. The main goal of bitcoin hashing is to try to randomly create a hash whose value is less than the target. For example, if the current target is

0000000000000E86680000000000000000000000000000000000000000000000

getting a random hash of

0000000000000E86670000000000000000000000000000000000000000000000
                 ^
                 lesser value

would "win" you the "target" block, thus giving you 50 BTC.

Based on the current difficulty, the probability of winning is 0.0000000000000002015752067216838860908012520667398348450.

The lower the difficulty, the higher the value of the target, and vice versa for higher difficulty.

The hash also has to be verified as a real hash; you obviously can't have a program just throw back "oh, I found the lower value, wink wink" lol.
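
The "win" test itself is nothing exotic, by the way. A minimal sketch in C (the function name and the big-endian storage are my assumptions, not any particular miner's code):

Code:
#include <stdint.h>
#include <string.h>

/* A 256-bit hash "wins" if, read as one huge number, it is below the
   target. With both values stored as 32 big-endian bytes, a byte-wise
   compare is the same as a numeric compare. Real miners also juggle
   byte order, but the idea is just this. */
static int hash_below_target(const uint8_t hash[32], const uint8_t target[32])
{
    return memcmp(hash, target, 32) < 0;
}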

This guy David Perry gives an awesome explanation of why AMD is cornering the market on integer-based calculations, and shows it's not just the "bitcoin" world.

http://bitcoin.stackexchange.com/questions/1523/bitcoin-alternative-designed-for-nvidia

This SHA-256 stuff is and was mostly used for GPGPU processes by hackers. I wouldn't be surprised if it was a group of hackers that originally wrote all the bitmining programs we use today. It is the dominant force right now in password cracking, encryption, etc.

In closing, nvidia, in order to keep up with market demands, will eventually have to bring small-integer math back into their designs to keep up with next-generation uses like full hard-drive encryption, faster SSL handshakes, etc. Computer security is ever evolving; once we start getting into the 10-50-100MB encryption algorithms, CPU processing as it is now will never be able to keep up; hell, it can't keep up with our 2MB encryptions lol.

I don't know if you are a programmer; I assume you must have some knowledge, or else you wouldn't be seeking more. It's addicting. Here is a paste of the process of getting a target in really, really simple C.

http://pastebin.com/n8UEGA86

Thanks,
icvader