In the ideal case, we could devise an automatic, mechanical way to increase MAX_BLOCK_SIZE, so that no central authority decides what MAX_BLOCK_SIZE ought to be and we need not rely on anyone's criteria of the day. Future generations of Bitcoin users would draw their information from the block chain, just as we do today, and it would tell them what is needed at the time. If we can solve the problem in a way that accommodates what the block chain is telling us, it may never need to be solved again! It could adjust to a changing environment much as the difficulty does.
Yes, we very much want to avoid doing something reasonable for today and passing this problem along to future generations. Hard forks are not nice, and having a committee decide on an appropriate maximum block size every so often is out of the question (a central point of failure).
Unfortunately, there's no easy way of doing what you'd like. Bitcoin itself has no conception of a market agent; it certainly can't distinguish between them or count them. Bitcoin itself can't know if the system is highly decentralised or if all the addresses and all the hash power are controlled by a single party.
We might be able to come up with a probabilistic or economic solution, but no algorithm can measure decentralisation with certainty. Some blockchain-based metric could suggest, with high probability, that the system is decentralised (or under attack by an economically irrational agent). However, I expect that any such algorithm, no matter how subtle, will yield an equally subtle attack where a single agent attempts to appear to be many agents.
There are two attack-resistant proposals under discussion in this thread:
https://bitcointalk.org/index.php?topic=815712.0

Both are self-correcting, based on the block chain, and can serve to avoid centralisation of decision making on this matter for the future.
One is based on block size, the other on TX fees.
Both would require a preponderance of interests to "attack", much as a 51% attack would.
The key to resilience here lies both in using self-correcting market influence and in limiting the amount of variance achievable. These are also the effective mechanisms of the difficulty adjustment algorithm.
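To illustrate the shape of such a rule (this is a hypothetical sketch, not either of the actual proposals above): derive the next limit from the median size of recently mined blocks, and clamp how far the limit may move per adjustment period, just as the difficulty adjustment clamps its per-period change. The function name, the headroom multiple, and the 25% clamp are all assumptions chosen for illustration.

```python
# Hypothetical sketch of a self-correcting block-size limit with bounded
# per-period variance. Not an actual Bitcoin rule or either proposal above.
from statistics import median

def adjust_max_block_size(current_max, recent_block_sizes,
                          headroom=2.0, max_change=0.25):
    """Return the next MAX_BLOCK_SIZE (bytes).

    current_max        -- limit in effect for the period just ended
    recent_block_sizes -- observed sizes of blocks mined in that period
    headroom           -- target the limit at this multiple of the median
    max_change         -- limit may move at most +/-25% per period
    """
    target = headroom * median(recent_block_sizes)
    # A miner majority can push the median, but the clamp bounds how far
    # the limit drifts per period, so an attack needs sustained control
    # of a preponderance of hash power -- analogous to a 51% attack.
    lower = current_max * (1 - max_change)
    upper = current_max * (1 + max_change)
    return int(min(max(target, lower), upper))
```

For example, with a 1,000,000-byte limit and a median block of 600,000 bytes, the rule targets 1,200,000 bytes, which is within the clamp and so is adopted; if every recent block were padded to the full 1,000,000 bytes, the 2,000,000-byte target would be clamped to 1,250,000, limiting how fast an attacker can inflate the cap.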