The late 1980s saw a shift toward algorithms influenced by biological processes: the rebirth of artificial neural networks (which had actually been developed in the early 1960s), genetic algorithms, and techniques such as ant colony and flocking algorithms. The 1990s brought increasingly sophisticated algorithms in all these areas and began the march toward today's world of machine learning, with its emphasis on statistical techniques applied to large datasets. A sentence in this field can often be a confusing mashup of mathematics, jargon from AI's five-decade history, and flawed metaphor (artificial neural networks aren't a great deal like real-world neurons, and genetic algorithms don't have much in common with meiosis and DNA recombination). But while there is a temptation to use a technique as a black box, I strongly believe that sustained success requires developing an intuition for the underlying technique.