Here is an interesting angle on the artificial intelligence discussion that is seldom mentioned.
Machine learning

The term machine learning was coined in 1959 by Arthur Samuel, an American IBMer and pioneer in the field of computer gaming and artificial intelligence.[10][11] A representative book of machine learning research during the 1960s was Nilsson's book on Learning Machines, dealing mostly with machine learning for pattern classification.[12] Interest related to pattern recognition continued into the 1970s, as described by Duda and Hart in 1973.[13] In 1981 a report was given on using teaching strategies so that a neural network learns to recognize 40 characters (26 letters, 10 digits, and 4 special symbols) from a computer terminal.[14]
https://en.wikipedia.org/wiki/Machine_learning#History_and_relationships_to_other_fields ....
Neural network

McCulloch and Pitts[8] (1943) created a computational model for neural networks based on mathematics and algorithms. They called this model threshold logic. The model paved the way for neural network research to split into two distinct approaches. One approach focused on biological processes in the brain and the other focused on the application of neural networks to artificial intelligence.
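To make "threshold logic" concrete, here is a minimal Python sketch of a McCulloch-Pitts-style unit. This is my own toy illustration, not the 1943 paper's notation; the weights and thresholds are picked by hand, since the original model had no learning rule:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """A McCulloch-Pitts 'threshold logic' unit: binary inputs, fixed
    weights, output 1 iff the weighted sum reaches the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

# Logic gates fall out of the choice of weights and threshold:
AND = lambda a, b: mcculloch_pitts((a, b), (1, 1), threshold=2)
OR  = lambda a, b: mcculloch_pitts((a, b), (1, 1), threshold=1)
NOT = lambda a:    mcculloch_pitts((a,),   (-1,),  threshold=0)

print([AND(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1]
print([OR(a, b)  for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 1]
```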
In the late 1940s psychologist Donald Hebb[9] created a hypothesis of learning based on the mechanism of neural plasticity that is now known as Hebbian learning. Hebbian learning is considered to be a 'typical' unsupervised learning rule and its later variants were early models for long-term potentiation. These ideas started being applied to computational models in 1948 with Turing's B-type machines.
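For a sense of what Hebbian learning means computationally, here is a toy Python sketch of the classic rule (a simplified modern illustration, not Hebb's original formulation): the weight change is proportional to the product of input and output activity, so connections between co-active units strengthen.

```python
import numpy as np

def hebbian_update(w, x, eta=0.1):
    """One step of the plain Hebbian rule: delta_w = eta * y * x.
    Connections whose input and output fire together get stronger."""
    y = w @ x              # post-synaptic activity of a linear neuron
    return w + eta * y * x

# Repeatedly presenting one pattern strengthens the weights on its active inputs.
w = np.full(3, 0.1)        # small nonzero start, or nothing ever fires
pattern = np.array([1.0, 0.0, 1.0])
for _ in range(10):
    w = hebbian_update(w, pattern)
print(w)  # grows on the two active inputs, stays put on the inactive one
```

The plain rule grows without bound; later variants such as Oja's rule add normalization.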
Farley and Clark[10] (1954) first used computational machines, then called calculators, to simulate a Hebbian network at MIT. Other neural network computational machines were created by Rochester, Holland, Habit, and Duda[11] (1956).
Rosenblatt[12] (1958) created the perceptron, an algorithm for pattern recognition based on a two-layer learning computer network using simple addition and subtraction. With mathematical notation, Rosenblatt also described circuitry not in the basic perceptron, such as the exclusive-or circuit, a circuit whose mathematical computation could not be processed until after the backpropagation algorithm was created by Werbos[13] (1975).
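The "simple addition and subtraction" is literal: when the perceptron makes a mistake, the input vector is added to or subtracted from the weights. Here is a hedged NumPy sketch (a modern reconstruction, not Rosenblatt's original notation or hardware). It learns AND, which is linearly separable, but can never learn exclusive-or, since no single straight line separates XOR's positive and negative cases:

```python
import numpy as np

def train_perceptron(X, y, epochs=25):
    """Rosenblatt-style update: on a wrong prediction, add or subtract
    the input vector. A constant 1 is appended to each input so the
    bias is learned as an ordinary weight."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0
            w += (target - pred) * xi   # +xi, -xi, or no change
    return w

def predict(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
w_and = train_perceptron(X, np.array([0, 0, 0, 1]))  # linearly separable
w_xor = train_perceptron(X, np.array([0, 1, 1, 0]))  # not separable
print(predict(X, w_and))  # [0 0 0 1] -- AND is learned
print(predict(X, w_xor))  # wrong on at least one row, however long we train
```

Training multi-layer networks that can represent XOR is what backpropagation later made practical.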
https://en.wikipedia.org/wiki/Neural_network#History

Most assume artificial intelligence is a new field of computer science. Looking at these wiki pages on neural networks and machine learning, we can see that the origins of AI research actually date back to the 1940s and 1950s. Artificial intelligence research in math and computation has been around for a very long time.
Recent breakthroughs in AI come from Moore's law and the number of transistors in a given area doubling. That could mean all of our future progress in AI depends on the continuation of Moore's law, and that there is a lack of progress being made on the software side of AI, which may not have advanced much over the span of decades.
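For scale, the doubling compounds quickly. A back-of-the-envelope sketch, assuming the classic roughly two-year doubling period:

```python
# Rough Moore's-law arithmetic, assuming a ~2-year doubling period:
# the same chip area holds 2**(years // 2) times as many transistors.
for years in (10, 20, 30):
    print(f"{years} years -> {2 ** (years // 2):,}x the transistors")
# 10 years -> 32x
# 20 years -> 1,024x
# 30 years -> 32,768x
```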