abstract: After a very quick historical introduction to artificial intelligence, I will describe what a neural network is and how it can learn, with a particular emphasis on deep neural networks. After these concepts are digested, I will describe a few examples from the zoo of neural networks. If time permits, I will describe in detail how AlphaGo works and its underlying neural network structure. The focus of this 4-hour crash course will be on what is missing in the theory from a mathematician's point of view.
What can you expect from this 4h course? Learning the meaning of basic terms and concepts such as perceptron, neuron, weights, activation function, neural network, shallow and deep neural network, visible and hidden layer, training, gradient descent algorithm, backpropagation algorithm, convolutional layer, deep residual layer, fully connected layer, TensorFlow; understanding in detail a case study - the AlphaGo software - that in some sense started this new era of artificial intelligence; getting an idea of what the contribution of a mathematician to this field could be.
What should you not expect from this 4h course? Becoming an expert in artificial intelligence and/or machine learning and/or neural networks; becoming able to effortlessly use TensorFlow; understanding in detail every type of neural network; understanding in detail which blanks there are for mathematicians to fill in.
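As a small taste of the first terms in the list above (perceptron, weights, activation function, training), here is a minimal sketch of Rosenblatt's perceptron learning rule in plain Python. This is an illustrative example of my own, not material from the course; all names are my choices. It learns the logical AND function, a linearly separable problem on which the perceptron is guaranteed to converge.

```python
# Minimal perceptron sketch (illustrative only): two inputs, a bias,
# a step activation function, and the classic perceptron update rule.

def step(x):
    # activation function: Heaviside step
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    # weights (one per input) and bias, initialised to zero
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = step(w[0] * x1 + w[1] * x2 + b)
            error = target - y
            # perceptron learning rule: nudge weights toward the target
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# training set: the AND truth table
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data]
print(predictions)  # → [0, 0, 0, 1], the AND function
```

As Minsky and Papert famously showed, a single perceptron cannot learn XOR, which is one motivation for the multi-layer networks discussed in the course.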
SLIDES with exercises and bibliography
Bibliography.
"Perceptrons", Minsky and Papert.
"From algebra to computational algorithms", Sprecher.
"Machine Learning", Mitchell.
"Deep Learning", Goodfellow and Bengio and Courville.