You’re right that AI was created to reduce workload, because there are certain tasks it can handle. However, if this leads people to become lazy, that’s wrong, because in my opinion it would mean we’re completely dependent on AI, and that shouldn’t be the case.
That is pretty much the only thing it has led to recently: laziness, and occasionally even the deaths of foolish humans willing to take an LLM chatbot at its word (whether they are children or adults is irrelevant). AI cannot safely handle any kind of task these days -- you must always check the work it produces. If you manipulate it the right way, it can even fail to sum two numbers such as 5 plus 5. This shows there is no cognition involved at all, and as much as LLMs tend to hallucinate their knowledge, humans hallucinate even more by believing there is any cognition here at all. Just because it can solve certain types of tasks does not mean it is on the level of a human of any age -- otherwise many of my advanced algorithms would be on the level of a human with ten PhDs.

We need to use AI in moderation. Even though there are no set limits, we shouldn’t rely on it entirely.
When has this ever worked? Most humans do not want to set a limit on any kind of activity that is damaging to them. There should be a variety of limits, but that does not necessarily mean they will be effective.
Yes, you are right that these days technology evolves according to the times and our needs, but without dismissing human worth. In many fields, AI makes individuals more active rather than lazy.
This is simply not a thing outside a small subset of AI users, most of the attention being on software engineers. Just because you can spew out more bullshit work does not mean you have become more active or that you are working better.
AI also gives you great opportunities for learning, and humans are still required to update these systems and to train people.
It does not do that any more than a search engine does (and a search engine does not do it either); you are hallucinating.