The math is quite interesting, if you want to work it out. Given a word list of 466,000 words, each word can encode log2(466,000) ≈ 18.83 bits of entropy. A 132-bit seed phrase therefore needs 132/18.83 ≈ 7.01 words, which has to be rounded up to 8. With a word list of 474,861 words, you could encode 132 bits in a 7-word seed phrase.
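The arithmetic above can be checked with a few lines of Python (a quick sketch, not anything Electrum-specific):

```python
import math

def words_needed(wordlist_size: int, entropy_bits: int) -> int:
    """Words required to encode the given entropy with this wordlist."""
    bits_per_word = math.log2(wordlist_size)
    return math.ceil(entropy_bits / bits_per_word)

print(math.log2(466_000))          # ~18.83 bits per word
print(words_needed(466_000, 132))  # 8
print(words_needed(474_861, 132))  # 7
```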
This is quite interesting indeed. So if your word list gets long enough, you need fewer seed words.
That might even make it easier to remember (if only I knew what those words meant).
So if I create a list of every combination from a to zzzzz, I get a very short seed:
julkt jtqbf hhocl qhtic bezsh kvgba
With 12 million "words", Python consumes a few GB memory and takes a while to create a new seed phrase. I expect this to get worse with much longer lists.
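The memory blow-up can be avoided by never materializing the 12-million-entry list at all: map a random index directly to its a–zzzzz string. Here is a sketch of that idea (this only covers the sampling, not Electrum's actual seed versioning or derivation):

```python
import secrets
import string

ALPHABET = string.ascii_lowercase

# Total combinations of length 1..5: 26 + 26^2 + ... + 26^5 = 12,356,630
TOTAL = sum(26**k for k in range(1, 6))

def word_at(index: int) -> str:
    """Map an integer in [0, TOTAL) to the corresponding a..zzzzz string."""
    for length in range(1, 6):
        count = 26**length
        if index < count:
            # Decode index as a base-26 number with `length` digits
            chars = []
            for _ in range(length):
                index, digit = divmod(index, 26)
                chars.append(ALPHABET[digit])
            return "".join(reversed(chars))
        index -= count
    raise IndexError("index out of range")

# Draw a 6-"word" phrase without building the full list in memory
phrase = " ".join(word_at(secrets.randbelow(TOTAL)) for _ in range(6))
print(phrase)
```

At log2(12,356,630) ≈ 23.6 bits per "word", 6 words give roughly 141 bits, which matches the 6-word example above.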
Of course, this takes away the "error correction" you'd get from using dictionary words, so it's not really useful. But I'm amazed Electrum can just restore this seed phrase without the original word list!