Be it Scrypt-Adaptive-Nfactor, or scrypt-jane, or whatever... even the litecoin devs claimed there'd never be a GPU miner, and now there's an ASIC on the way.
"But my algo requires an exponentially increasing amount of memory."
And ASIC miners can't have more than 10KB of memory, right? And they can't have modular memory slots like DDR3 DIMMs. As a result --
Vertcoin's N-factor increases with time to stay one step ahead of any possible ASIC development.
But magically, GPUs automatically gain more memory as the N-factor climbs. But ASICs magically can't. WOW.
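For scale, here's what that "exponentially increasing memory" actually amounts to. A minimal sketch -- the chain-start time and step interval below are made up, and Vertcoin's real GetNfactor schedule uses different constants -- but scrypt's memory cost really is 128 * r * N bytes, and these coins run r = 1 with N = 2^(Nfactor+1):

```python
import time

# Hypothetical adaptive-N schedule. GENESIS_TIME and STEP_SECONDS are
# made-up stand-ins for Vertcoin's real GetNfactor constants.
GENESIS_TIME = 1389306217          # assumed chain start (unix time)
STEP_SECONDS = 365 * 24 * 3600     # assumed: Nfactor +1 per year
MIN_NFACTOR, MAX_NFACTOR = 10, 30

def nfactor_at(timestamp: int) -> int:
    steps = max(0, timestamp - GENESIS_TIME) // STEP_SECONDS
    return int(min(MIN_NFACTOR + steps, MAX_NFACTOR))

def scrypt_memory_bytes(nfactor: int, r: int = 1) -> int:
    # scrypt's V buffer costs 128 * r * N bytes; N = 2^(Nfactor+1) here
    return 128 * r * (1 << (nfactor + 1))

print("current Nfactor:", nfactor_at(int(time.time())))
for nf in (10, 15, 20, 25):
    print(f"Nfactor {nf}: {scrypt_memory_bytes(nf) / 2**20:g} MiB per hash")
```

Nfactor 20 works out to 256 MiB per hash, Nfactor 25 to 8 GiB -- amounts a couple of commodity DDR3 sticks cover with room to spare.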
Reality is -- you can build an ASIC that takes modular memory, but you can't add memory to a GPU; its VRAM is soldered on. So in reality these algos are GPU resistant and ASIC friendly.
These devs are certified dumb people.
At best you can make your algo GPU resistant -- for the same reason a GPU can't execute arbitrary CPU instructions.
If you truly want a CPU-only coin, the work has to be built from CPU instructions (or better yet, a changing mix of CPU instructions), because the CPU is itself an ASIC for its own instruction set. If you want a GPU coin, it has to be a changing sequence of OpenGL calls, because a GPU is an ASIC for OpenGL.
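Here's the "changing instruction" idea as a toy sketch. Everything in it is hypothetical -- no real coin works exactly like this -- but it shows the shape: the sequence of primitive ops is re-derived from the previous block hash, so the work itself changes every block and there's no fixed pipeline to bake into silicon.

```python
import hashlib

MASK = (1 << 64) - 1  # keep everything in 64-bit integer range

# A tiny "instruction set" of 64-bit ops (all constants are arbitrary).
OPS = [
    lambda x: (x * 0x9E3779B97F4A7C15) & MASK,    # multiply
    lambda x: ((x << 13) | (x >> 51)) & MASK,     # rotate left by 13
    lambda x: x ^ (x >> 7),                       # xor-shift
    lambda x: (x + 0xDEADBEEFCAFEF00D) & MASK,    # add a constant
]

def program_for_block(prev_hash: bytes, length: int = 16):
    """Derive a per-block op sequence from the previous block hash."""
    return [OPS[b % len(OPS)] for b in prev_hash[:length]]

def pow_hash(prev_hash: bytes, header: bytes, nonce: int) -> bytes:
    seed = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
    x = int.from_bytes(seed[:8], "little")
    for op in program_for_block(prev_hash):  # the "changing instructions"
        x = op(x)
    # fold back through a normal hash so the output looks uniform
    return hashlib.sha256(x.to_bytes(8, "little")).digest()
```

A general-purpose CPU eats this for free; anything hard-wired has to re-target a new program every single block.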
Apart from that, all you can do to delay an ASIC is make the algo complicated -- chain a pile of hash functions like darkcoin, quark, chaincoin etc. do...
Speaking of which, the quark algo is the best yet at resisting a GPU miner.
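What makes quark's chain interesting is the structure, not the specific digests: fixed rounds interleaved with rounds picked by a bit of the running hash. Python's hashlib doesn't ship the SHA-3 candidates quark actually chains (BLAKE, BMW, Grøstl, JH, Keccak, Skein), so this sketch substitutes stock digests purely to show the shape:

```python
import hashlib

def round_(name: str, data: bytes) -> bytes:
    """One digest round. The names used below are hashlib stand-ins for
    quark's real primitives, which aren't in the standard library."""
    return hashlib.new(name, data).digest()

def quark_style(header: bytes) -> bytes:
    x = round_("sha512", header)                          # fixed round
    x = round_("blake2b", x)                              # fixed round
    x = round_("sha3_512" if x[0] & 1 else "sha512", x)   # hash-dependent
    x = round_("blake2b", x)                              # fixed round
    x = round_("sha512" if x[0] & 1 else "sha3_512", x)   # hash-dependent
    x = round_("sha3_512", x)                             # fixed round
    return x[:32]
```

The hash-dependent rounds are the point: a miner can't lay the pipeline out in advance, because the round order isn't known until the previous digest lands.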