I am not sure how you perform elliptic curve cryptography (ECC) calculations on a GPU, since CUDA GPUs do not natively support 128-bit integers. Could you please explain your implementation, preferably with some code?
There is a link to the GitHub repo on the first page of this thread; go look at the code. This program has already solved two keys, at 109 and 114 bits.
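Since this question keeps coming up: the short answer is that no native 128-bit (let alone 256-bit) integer type is needed. Field elements are stored as arrays of 64-bit limbs, and arithmetic is done word by word, chaining the carry through PTX add-with-carry instructions and recovering the high half of each 64x64 product with the `__umul64hi` intrinsic. Below is a minimal sketch of that technique, not the repo's actual code; the `uint256`, `add256`, and `mul64wide` names are my own for illustration. Check the repo's GPU math sources for the real implementation.

```cuda
// Sketch only: 256-bit arithmetic on a CUDA GPU using four 64-bit limbs.
// Illustrates the general multi-word technique, not the repo's actual code.
#include <cstdint>
#include <cstdio>

// A 256-bit unsigned integer as four 64-bit limbs, least significant first.
struct uint256 {
    uint64_t limb[4];
};

// r = a + b (mod 2^256). PTX add-with-carry instructions chain the carry
// through all four limbs using the hardware carry flag.
__device__ void add256(uint256 &r, const uint256 &a, const uint256 &b) {
    asm("add.cc.u64  %0, %4,  %8;\n\t"
        "addc.cc.u64 %1, %5,  %9;\n\t"
        "addc.cc.u64 %2, %6, %10;\n\t"
        "addc.u64    %3, %7, %11;\n\t"
        : "=l"(r.limb[0]), "=l"(r.limb[1]), "=l"(r.limb[2]), "=l"(r.limb[3])
        : "l"(a.limb[0]), "l"(a.limb[1]), "l"(a.limb[2]), "l"(a.limb[3]),
          "l"(b.limb[0]), "l"(b.limb[1]), "l"(b.limb[2]), "l"(b.limb[3]));
}

// 64x64 -> 128-bit multiply: low half from an ordinary multiply, high half
// from __umul64hi. A full 256x256 schoolbook multiplication is built from
// 16 of these partial products plus carry propagation.
__device__ void mul64wide(uint64_t a, uint64_t b, uint64_t &lo, uint64_t &hi) {
    lo = a * b;
    hi = __umul64hi(a, b);
}

__global__ void demo() {
    uint256 a = {{0xFFFFFFFFFFFFFFFFULL, 0, 0, 0}};
    uint256 b = {{1, 0, 0, 0}};
    uint256 r;
    add256(r, a, b);  // carry ripples from limb[0] into limb[1]
    printf("r = %llx %llx\n", (unsigned long long)r.limb[1],
                              (unsigned long long)r.limb[0]);
}

int main() {
    demo<<<1, 1>>>();
    cudaDeviceSynchronize();
    return 0;
}
```

All the ECC operations (modular add/sub/mul on the secp256k1 field, then point addition on top of those) reduce to this kind of limb arithmetic, so 128-bit hardware support never enters into it.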