A Course on Deep Learning, Explained Simply

I still do not fully understand how it happened, but last year I committed to giving a course on Deep Learning and, surprisingly, actually delivered it. I promised to publish it, so here it is!

The course does not claim to be comprehensive; rather, it is a hands-on tour of the main areas where deep learning has established itself as a practical tool, and a sufficient foundation for reading and understanding modern papers on your own.

The course materials were tested on students of the AFI department of Novosibirsk State University, so there is a chance you can actually learn something from them.



The course requires:

- Knowledge of mathematics at the first- or second-year university level: you need a bit of probability theory, linear algebra, the basics of calculus, and multivariable calculus. If all of this passed you by, here are the necessary courses from MIT and Harvard; going through their first two sections is usually enough.
- Programming skills in Python.

A good course should offer lectures, exercises, and a place to ask questions and discuss them. Here they are gathered piecemeal from various sources:

- Lectures exist as recordings on YouTube.
- As exercises, you can use the assignments from the excellent Stanford deep learning courses (CS231n and CS224n); I will specify below which ones.
- You can discuss them and ask questions on ClosedCircles and ODS.ai.

Lectures and exercises


Lecture 1: Introduction
Lecture 2: Linear Classifier
Lecture 2.1: Softmax

Exercise: the k-Nearest Neighbor and Softmax classifier sections from here.
Given the specifics of the assignment, these lecture notes can help.
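As a warm-up for the Softmax part of the assignment, the function itself fits in a few lines of NumPy. This is just an illustrative sketch, not the interface the assignment expects:

```python
import numpy as np

def softmax(scores):
    # Subtract the row-wise max first: exp() of large scores overflows,
    # and softmax is invariant to shifting all scores by a constant.
    shifted = scores - scores.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

probs = softmax(np.array([2.0, 1.0, 0.1]))
# probs is a valid distribution: all entries positive, summing to 1,
# with the largest probability on the largest score
```

The max-subtraction trick is exactly the numerical-stability point the assignment checks.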

Lecture 3: Neural networks. Backpropagation
Lecture 4: Neural networks in detail

Exercise: the “Two-Layer Neural Network” section from here and the “Fully-Connected Neural Network” section from here.
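To make the backpropagation machinery concrete before the assignment, here is a minimal NumPy sketch of training a two-layer network with a softmax loss on toy data. All sizes, data, and the learning rate are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))          # 5 toy samples, 4 features
y = np.array([0, 1, 2, 1, 0])        # 3 classes

# Two-layer net: affine -> ReLU -> affine -> softmax loss.
W1 = 0.1 * rng.normal(size=(4, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.normal(size=(8, 3)); b2 = np.zeros(3)

losses = []
for step in range(300):
    # Forward pass.
    h = np.maximum(0.0, X @ W1 + b1)                 # ReLU hidden layer
    scores = h @ W2 + b2
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    losses.append(-np.log(probs[np.arange(5), y]).mean())

    # Backward pass: chain rule, layer by layer.
    dscores = (probs - np.eye(3)[y]) / 5             # d(loss)/d(scores)
    dW2 = h.T @ dscores;  db2 = dscores.sum(axis=0)
    dh = dscores @ W2.T
    dh[h <= 0.0] = 0.0                               # gradient gated by ReLU
    dW1 = X.T @ dh;       db1 = dh.sum(axis=0)

    # Plain SGD update.
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= 0.5 * g
```

The assignment asks for the same forward/backward structure, just organized into reusable layer functions and verified with numeric gradient checks.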

Lecture 5: Convolutional Neural Networks (CNN)
Lecture 6: Libraries for deep learning

Exercise: the Convolutional Networks and PyTorch on CIFAR-10 sections from here.
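The heart of the convolutional layer in this assignment is a simple sliding window. A naive single-channel "valid" convolution (technically cross-correlation, which is what deep learning frameworks compute) can be sketched in NumPy like this:

```python
import numpy as np

def conv2d(image, kernel):
    # Slide the kernel over the image and take dot products ("valid" mode:
    # no padding, so the output shrinks by kernel_size - 1 per dimension).
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (image[i:i + kH, j:j + kW] * kernel).sum()
    return out

result = conv2d(np.arange(9, dtype=float).reshape(3, 3), np.ones((2, 2)))
# result == [[ 8., 12.], [20., 24.]]
```

Real implementations vectorize this (e.g. via im2col) and add channels, stride, and padding, but the arithmetic is exactly this.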

Lecture 7: Other Computer Vision Tasks
Lecture 8: Introduction to NLP. word2vec

Exercise: the "word2vec" section from here.
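The training data for the skip-gram flavor of word2vec is just (center word, context word) pairs drawn from a sliding window. A tiny sketch of that step (the function name and window default are my own choices, not the assignment's API):

```python
def skipgram_pairs(tokens, window=2):
    # For each position, pair the center word with every word
    # within `window` positions of it.
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["the", "quick", "brown", "fox"], window=1)
# -> [('the','quick'), ('quick','the'), ('quick','brown'),
#     ('brown','quick'), ('brown','fox'), ('fox','brown')]
```

The model itself then learns embeddings by predicting the context word from the center word over exactly these pairs.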

Lecture 9: Recurrent Neural Networks (RNN)
Lecture 10: Machine Translation, Seq2Seq, Attention

I didn’t find a good ready-made assignment here, but you can implement a Char-RNN in PyTorch following Andrej Karpathy’s famous post and train it on Shakespeare.
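The core of that Char-RNN, stripped of training, is a single recurrence step: one-hot the current character, update the hidden state, and output a distribution over the next character. A NumPy sketch (shapes and names are mine, following the structure of Karpathy's min-char-rnn):

```python
import numpy as np

def rnn_step(x_idx, h, Wxh, Whh, Why, bh, by):
    # One step of a vanilla char-RNN: one-hot input, tanh recurrence,
    # softmax over the vocabulary for the next character.
    x = np.zeros(Wxh.shape[1])
    x[x_idx] = 1.0
    h = np.tanh(Wxh @ x + Whh @ h + bh)
    logits = Why @ h + by
    exp = np.exp(logits - logits.max())
    return h, exp / exp.sum()

vocab, hidden = 4, 3
rng = np.random.default_rng(0)
h1, p = rnn_step(1, np.zeros(hidden),
                 rng.normal(scale=0.1, size=(hidden, vocab)),
                 rng.normal(scale=0.1, size=(hidden, hidden)),
                 rng.normal(scale=0.1, size=(vocab, hidden)),
                 np.zeros(hidden), np.zeros(vocab))
```

Sampling text is just calling this in a loop, feeding each sampled character back in; training adds backpropagation through time over the same recurrence.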

Lecture 11: Introduction to reinforcement learning (RL), basic algorithms
Lecture 12: Applications of RL. Alpha(Go) Zero.
Lecture 13: Neural networks in 2018.
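To make "basic algorithms" concrete: the simplest of them, tabular Q-learning, fits in a dozen lines. Below is a sketch on a made-up 5-state corridor (environment and hyperparameters are my own illustrative choices), applying the update `Q[s,a] += alpha * (r + gamma * max(Q[s']) - Q[s,a])`:

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 5, 2          # corridor 0..4; actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.3   # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != n_states - 1:        # reward 1 only for reaching state 4
        # Epsilon-greedy action selection.
        a = int(rng.integers(n_actions)) if rng.random() < eps else int(Q[s].argmax())
        s_next = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s_next == n_states - 1 else 0.0
        # The Q-learning update: bootstrap from the best next action.
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

# After training, the greedy policy is "go right" in every non-terminal state.
```

Everything in Lecture 12, down to the RL half of Alpha(Go) Zero, is elaboration on this same idea of learning value estimates from experience.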

Where to discuss and ask questions


All questions about the course can be asked to me personally or discussed in the #data circle on ClosedCircles.com (here is an invite).
Assignments can also be discussed in the #class_cs231n channel on ODS.ai, where people will help you; you will need to get an invite there yourself by submitting an application.

Anyway, call or write; I'm always happy to hear from you.

The most enjoyable section: thank-yous!


First of all, a huge thank-you to buriy, with whom I prepared the course. Thanks also to my home department, which made this possible in the first place.

Thanks to everyone in the ODS.ai and ClosedCircles communities who helped with the preparation, answered questions, sent feedback, reminded me to publish everything, and so on.

Finally, thanks to everyone who watched the streams on the channel, asked questions in real time, and generally created the feeling that I was not talking to a wall.

From the bottom of my heart.

Source: https://habr.com/ru/post/414165/
