According to the wiki, difficulty can be calculated by dividing the maximum target by the current target.

However, I'm unclear on how exactly the target value returned from a getwork call is encoded (and, more importantly, how it should be decoded so that difficulty can be calculated this way).

Do I simply reverse the byte order (and if so, how?), or is there a more involved process?

Example code (Python) using the current target:

>>> target = "00000000000000000000000000000000000000000000006e8102000000000000"
>>> target = process(target)  # the decoding step I'm unsure about
>>> maxtarget = "00000000ffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
>>> difficulty = int(maxtarget, 16) / int(target, 16)
>>> print difficulty
6695826
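
For what it's worth, here is my best guess at what process would look like if the answer is simply a byte-order reversal. This is an assumption on my part, not a confirmed description of the getwork encoding (Python 3 here, hence print() and floor division to mirror the whole-number result above):

```python
# Assumption: "process" is just a byte-order reversal of the hex
# string (little-endian to big-endian). This is a guess, not a
# confirmed description of the getwork target encoding.

def process(target_hex):
    # Reverse the byte order of a hex-encoded 256-bit integer.
    return bytes.fromhex(target_hex)[::-1].hex()

target = "00000000000000000000000000000000000000000000006e8102000000000000"
maxtarget = "00000000ffffffffffffffffffffffffffffffffffffffffffffffffffffffff"

# Floor division mirrors the whole-number result in the session above.
difficulty = int(maxtarget, 16) // int(process(target), 16)
print(difficulty)
```

(In Python 2 the same reversal would be target.decode('hex')[::-1].encode('hex').) Whether a plain reversal is actually the right decoding is exactly what I'm asking.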

Any advice would be appreciated.