The first step will likely be an FPGA.
The second someone gets there, the design is "well defined" enough that an ASIC
is not too far off.
Many, many computing activities can happen in ASICs if they are well planned,
and ASICs can most certainly work with large chunks of memory!
Remember, the key part of ASIC is "application specific." The only way to limit
the growth of ASIC potential would be to change the algorithm so frequently
that designers can't reach tape-out before you change the design again.