For a hub that takes external power, the 12V, 5A figure is typically the rating of the power supply you should connect: i.e. 12V, and capable of supplying at least 5A. That doesn't mean the hub will always draw 5A, but the hub is designed such that it at least theoretically can - though this rating is often based on datasheets rather than actual measurements (i.e. using test equipment to force a particular current draw and then seeing whether the components get too hot, trigger an internal shutoff, etc.).
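As a rough illustration of that "at least 5A" logic, here's a minimal sketch of how you'd sanity-check a candidate supply (the supply values in the examples are made up, and the 0.5V tolerance is just an assumed ballpark):

```python
# Sketch: does a candidate power supply meet the hub's rated input?
# The 12 V / 5 A figures are the hub's rating; the example supplies are hypothetical.
HUB_RATED_VOLTAGE_V = 12.0   # the supply voltage should match this
HUB_RATED_CURRENT_A = 5.0    # the supply must be able to source at least this much

def supply_ok(supply_voltage_v: float, supply_current_rating_a: float) -> bool:
    voltage_matches = abs(supply_voltage_v - HUB_RATED_VOLTAGE_V) < 0.5
    current_sufficient = supply_current_rating_a >= HUB_RATED_CURRENT_A
    return voltage_matches and current_sufficient

print(supply_ok(12.0, 6.0))  # True: a 12 V, 6 A supply is fine - the hub simply won't draw the full 6 A
print(supply_ok(12.0, 3.0))  # False: a 3 A rating is below what the hub may draw
```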
The difference between hubs is in what components they use to regulate the voltage down to 5V and how it is laid out electrically. If you're familiar with typical small indicator LEDs, then you might consider a small 1/8th Watt resistor to 'regulate' current for an LED. But replace that small LED with one used for lighting purposes, and that 1/8th Watt resistor will get far too hot, which may change its characteristics or even cause the resistor to just burn up. So you'd have to replace it with, say, a 0.5 Watt resistor. Or, better yet, use a linear voltage regulator that can handle maybe 2A - or better still, a switching regulator with components that can handle 5A, etc. It's that choice of components that determines how much current the hub itself can handle.
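To put some rough numbers on why the regulator choice matters, here's a small sketch comparing how much heat a linear regulator versus a switching regulator has to shed when dropping 12V down to 5V (the 90% efficiency figure is an assumed ballpark, not taken from any specific part):

```python
# Sketch: power dissipated as heat when regulating 12 V down to 5 V at a given load current.
V_IN, V_OUT = 12.0, 5.0

def linear_dissipation_w(load_a: float) -> float:
    # A linear regulator burns the entire voltage difference as heat: P = (Vin - Vout) * I
    return (V_IN - V_OUT) * load_a

def switching_dissipation_w(load_a: float, efficiency: float = 0.9) -> float:
    # A switching regulator only loses (1 - efficiency) of the power as heat (~90% efficiency assumed)
    p_out = V_OUT * load_a
    return p_out * (1.0 / efficiency - 1.0)

for load in (0.5, 2.0, 5.0):
    print(f"{load:>4} A load: linear ~{linear_dissipation_w(load):.1f} W of heat, "
          f"switching ~{switching_dissipation_w(load):.1f} W of heat")
```

At 2A the linear regulator would have to dissipate around 14W, which is why hubs meant to handle real current tend to use switching regulators and beefier components.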
(The electrical layout also matters - specifically trace widths. A very narrow trace can't handle nearly as much current as a nice wide trace or a trace that has been tinned over. Take, for example, a typical bit of wire, a small bicycle bulb, and a 9V battery: you can light up that bulb no problem. Use steel wool instead of the wire, on the other hand, and the steel wool will ignite, as each steel wool fiber has a very low current-carrying capacity.)
So what makes a hub beefy is how much current it can handle. Hubs slightly complicate things because the current rating on the input (e.g. that 5A) does not necessarily translate into how much each port can handle. Let's say it has 4 ports and each can handle 2A; 4 * 2A = 8A, which exceeds what it can handle on its input. Alternatively, say each port can only actually handle 1A: then it's all well and good that the engineers specced the design for 5A and that the people slapping stickers onto the thing and printing the packaging copy that, but as each port tops out at 1A that's a bit moot - and you certainly can't draw 5A from a single port.
But say you need 2A and each port can only deliver 1A: that's when you can use a Y cable (or other such constructions) to basically divide the load (2A) across two ports (2A / 2 = 1A). On a cheapo hub, that's not much of a problem - on things like a laptop, I'd be more cautious; no point in potentially 'ruining' a $500 device if you can instead mess around with a $15 one.
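A back-of-the-envelope check along those lines might look like the sketch below (the per-port and total figures are hypothetical, mirroring the 4-port / 1A-per-port / 5A-input example above, and conversion losses are ignored to keep the arithmetic simple, as in the text):

```python
# Sketch: check whether a set of loads fits a hub's per-port limit and total budget.
PORTS = 4
PER_PORT_LIMIT_A = 1.0
TOTAL_BUDGET_A = 5.0   # the hub's overall budget (simplified: treated as one number)

def loads_fit(loads_a: list[float]) -> bool:
    if len(loads_a) > PORTS:
        return False
    per_port_ok = all(load <= PER_PORT_LIMIT_A for load in loads_a)
    total_ok = sum(loads_a) <= TOTAL_BUDGET_A
    return per_port_ok and total_ok

print(loads_fit([2.0]))                  # False: a single 2 A device exceeds the 1 A per-port limit
print(loads_fit([1.0, 1.0]))             # True: the same 2 A split across two ports via a Y cable
print(loads_fit([1.0, 1.0, 1.0, 1.0]))   # True: 4 A in total, within the 5 A budget
```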
That's all somewhat unrelated to USB 3.0 vs 2.0. USB 3.0 is supposed to be able to deliver up to 900mA, USB 2.0 up to 500mA. However, a USB 2.0 hub may well be able to deliver 3A on a single port. As long as it manages a minimum of 500mA, they're okay to slap a USB 2.0 label on it. In short, I wouldn't rely on USB 2 vs USB 3 when assuming current capability, and would instead look up the exact specs - specifically: how much current can it deliver on a single port, and how much can it handle in total. The input voltage is something you can ignore in terms of the hub itself - it's only something you need to know in order to figure out what voltage rating of power supply to look for / hook up.
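To make the "check the actual specs, not the USB version" point concrete, a sketch might look like this (the two example hubs and their ratings are made up; real figures would come from the manufacturer's datasheet):

```python
# Sketch: what the USB spec calls for per port vs. what a specific hub is actually rated to deliver.
USB_SPEC_PER_PORT_A = {"USB 2.0": 0.5, "USB 3.0": 0.9}

# Hypothetical hubs - note the USB 2.0 one out-delivers the USB 3.0 one per port.
hubs = [
    {"name": "generic USB 3.0 hub",    "usb": "USB 3.0", "per_port_a": 0.9, "total_a": 3.0},
    {"name": "'charging' USB 2.0 hub", "usb": "USB 2.0", "per_port_a": 3.0, "total_a": 5.0},
]

for hub in hubs:
    spec = USB_SPEC_PER_PORT_A[hub["usb"]]
    print(f"{hub['name']}: spec figure {spec} A/port, "
          f"actually rated {hub['per_port_a']} A/port, {hub['total_a']} A total")
```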
If things are still a bit confusing, I would highly recommend sticking to some of the suggested hubs that people have already tested and that are known to work well.