Machine Learning Resources
Courses
These two are probably the best introductory courses out there right now:
- Machine Learning, by Andrew Ng, Stanford. Arguably the best introductory machine learning course, though maybe a bit dated. (Lots of GitHub repos with solutions.)
- Convolutional Neural Networks, by Karpathy, Stanford. Another candidate for best introductory course: it says convolutional, but the first half deals with machine learning in general, and it is really up to date with respect to the state of the art in neural networks. (Lots of GitHub repos with solutions.) Link to Stanford version of the course.
Books
Introductory books
- Learning from Data, by Abu-Mostafa, 2012. Really short and to the point; a great intro to machine learning from the statistical learning theory perspective (especially good for SVMs), introducing basic concepts such as overfitting, test/validation sets, cross-validation, model selection, supervised vs. unsupervised learning, etc.
- Information Theory, Inference, and Learning Algorithms, by MacKay, 2003. Great book for self-study; the information theory chapters can be skipped. It is a bit general, but great for understanding probabilistic and Bayesian models, although it requires a bit of math savviness.
- Neural Networks and Deep Learning, by Nielsen, 2015 (free online). Best introduction by far to neural networks (feedforward + convolutional). Easy, free and short.
- Probabilistic Programming & Bayesian Methods for Hackers, various authors, 2015/2016 (free online, open source). Bayesian and probabilistic models for programmers (easy math!).
- Deep Learning Book, by Goodfellow and Bengio, 2016 (free online). This is not really "introductory" in the sense of being easy to follow or having few prerequisites, but it is a great intro if you want to work on improving current neural network models.
Classical reference books
These books are fairly traditional and aren't designed for self-study; I'd advise you to use them as references. They are sorted from easy to difficult. Even though all the books cover similar topics, they take different approaches:
- Pattern Classification/Recognition: more signal processing
- Statistical Learning/Inference/Probabilistic/Graphical Models: more Bayesian/statistical models
- Machine Learning/Learning from Data: a bit more agnostic, more "pure learning" algorithms
- Machine Learning, by Tom Mitchell. This is sort of THE classical textbook reference for machine learning. I've read mixed opinions about its suitability for self-study.
- Pattern Recognition, by Theodoridis, 2008. Similar to Bishop's book.
- Pattern Recognition and Machine Learning, by Bishop. A sequel to Duda's book, a bit more updated and of similar difficulty.
- Pattern Classification, by Duda and Hart, 2000. The classic pattern classification book. A bit dated by now, but great for reference.
- Machine Learning: a Probabilistic Perspective, by Kevin Murphy, 2014. Reputedly difficult and not for self-study, but up to date.
- Probabilistic Graphical Models, by Koller, 2009. Great book but a bit disorganized. Also difficult; not recommended for self-study.
- The Elements of Statistical Learning, by Hastie, 2001. Famous for being terse and difficult; not recommended for self-study.
Writing papers & stuff
Talks/presentations
- Giving a Talk, Peyton Jones
- Giving a Talk, Peyton Jones (video)
- The cognitive style of powerpoint, Edward Tufte
Blogs & Forums
Podcasts
- Machine Learning Street Talk
- The talking machines
- Linear Digressions
- The Data Skeptic
- Partially Derivative
- Learning Machines 101
- Data Science Podcasts list
Newsletters & Mailing Lists
Starting out
If you are starting out in machine learning, focusing on neural networks, the recommended path would be:
- General knowledge (this can easily take 6 months or more)
- Subscribe to the Connectionists and Uncertainty in AI mailing lists
- Start listening to some podcasts; they are mostly introductory and let you quickly get a broad, surface-level view of various subjects and get to know some research groups in ML.
- Take Andrew Ng's and Karpathy's online courses (in that order). Do all the homework/quizzes.
- While doing Andrew Ng’s course, read Learning from Data, by Abu-Mostafa.
- While doing Karpathy's course, read Neural Networks and Deep Learning, by Nielsen.
- Specific neural networks stuff
- Read Goodfellow and Bengio's Deep Learning book
- Learn how to use a deep learning framework such as Torch/TensorFlow/Caffe/etc. Keras (built on top of TensorFlow) seems like a good choice.
- Take a course/read a book on Bayesian inference/probabilistic models
- Take a problem with a few datasets (maybe from a Kaggle competition) and a model, and try to improve its performance.
- Check out papers from NIPS (one of the best ML/neural nets conferences)
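To make the path above concrete: Nielsen's book has you build a feedforward network from scratch before reaching for a framework. Here is a minimal numpy sketch of that kind of network; the XOR toy problem, layer sizes and learning rate are illustrative choices, not taken from any of the resources listed.

```python
# A small from-scratch feedforward network, in the spirit of Nielsen's book.
# All hyperparameters here (hidden size, learning rate, epochs) are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: the classic toy problem a one-hidden-layer net can solve
# but a linear model cannot.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 4 units.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

lr = 1.0
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of mean squared error via the chain rule
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

mse = float(((out - y) ** 2).mean())
preds = (out > 0.5).astype(int)
print(preds.ravel(), mse)
```

Once this clicks, switching to Keras mostly means replacing the hand-written loop with layer objects, a loss, and an optimizer.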
Youtube Channels
- Machine Learning Street Talk
- Henry AI Labs
- Two minute papers
- Yannic Kilcher
- Machine Learning Dojo with Tim Scarfe
- WelcomeAIOverlords
- DotCSV (Spanish)
Conference orals
- CVPR
- NeurIPS (ex NIPS)
- ICLR