The only difference between artificial intelligence in 1970 and 2022 is the amount of data chips can process.
In past decades, there were no processors with the computational capacity to calculate every possible move in chess.
Chess-playing algorithms and benchmarks did emerge, as far as I know. They still can't calculate every possible continuation, but they can use minimax and other algorithms to determine the best immediate move.
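To make the minimax idea concrete, here's a minimal sketch over a hand-built toy game tree. This is not a real chess engine: the tree, its depth, and the leaf scores are all made up for illustration.

```python
# Minimal minimax sketch over a hand-built game tree (not a real chess
# engine). Internal nodes are lists of child nodes; leaves are ints
# giving the position's score from the maximizing player's view.
def minimax(node, maximizing):
    if isinstance(node, int):  # leaf: static evaluation of the position
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Toy tree two plies deep: the maximizer picks a branch, then the
# minimizer picks the leaf that's worst for the maximizer inside it.
tree = [[3, 12], [2, 8], [14, 1]]
print(minimax(tree, maximizing=True))  # → 3 (branches are worth 3, 2, 1)
```

The point matches the comment above: the engine never enumerates the whole game, it just assumes the opponent replies optimally a few moves deep and picks the move with the best guaranteed outcome.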
We still needed faster chips for that, but better algorithms too. A few years ago many thought a computer couldn't be made better at Go (a strategy game) than strong human players, but one managed to compete by using a neural network.
Self driving AI cars did not become possible until processors had improved to a point where they could effectively process large streams of data fed to them by sensors and navigational equipment.
This became a lot more feasible once parallel processing and cheap peripherals came about. A lot of phones are able to compete with regular computers now too because of how compact the technology has become.
It is possible that we're already seeing evidence of this happening. While it took many, many years for people to accept heliocentrism and a spherical rather than flat Earth, today people might be far more willing to test, challenge, and overturn their base assumptions. That could be one indication of human intelligence growing and developing at a faster rate than in previous eras of history.
This will certainly be helped by the amount of knowledge sharing available and the free, verifiable information on offer. As long as information can be verified (proven) and is interesting or useful to the person verifying it, it will be beneficial and memorable for them, provided they know how to retain it and remember what was fact and what wasn't.
Sadly, while most generic scams follow the same trends, some will always be outliers and innovative, so it might be a long time before everyone understands and adapts to them (take crypto's transition from Ponzi schemes that relied on trust, to signing airdrop tokens with malicious contracts, to Electrum's phishing attack).
People are bound by their IQ, and IQ is purely genetic. Limiting environmental factors can reduce IQ (like malnutrition in poor countries) but your upper bound is still limited.
The rapid rise in information is mainly due to the internet, not an inherent upward trajectory in human intelligence. It has been recorded that human IQs are increasing over time, but that is a generational observation, not something attributable to what computing has given us.
Environmental limits are less profound now than they have been historically (who knows for how long that'll last though).
Information accessibility has become easier from generation to generation, though. Schools aren't run by factories or transitioning away from that model, and libraries and other publicly available spaces seem more widespread than they were. Transportation also makes it easier to get around: diesel and steam ships instead of rowing or sailing ones, and fewer people limited to however far they or their horse could walk in whatever free time they didn't want to spend with family.
I think that access to education is growing worldwide while the level of critical thinking, on the contrary, is falling. People in general are becoming more infantile, more suggestible.
I think people have always been infantile in their own ways; in the past they just found ways to hide it, weren't explicit about it, or paid others to hide it for them (in the case of people considered nobles, royalty is historically known for keeping jesters).
This might also depend on which demographics you think have become more infantile, though.