Machine Learning @ University of Wrocław
Welcome!
You are browsing the 2020 edition. Materials from past years are available in per-year branches: 2019.
Topic | Learning materials |
---|---|
Intro to ML: Specifying problems using data, basic terminology | Slides: lectures/01-intro.pdf Notes: lectures/01-notes.ipynb (GitHub preview) or (nbviewer) |
Regression: hypotheses, loss functions, regularization | lectures/02-notes.ipynb (GitHub preview) or (nbviewer) |
From statistical inference to machine learning | lectures/03-notes.ipynb (GitHub preview) or (nbviewer) lectures/03-notes-naive_bayes-addition.ipynb (GitHub preview) or (nbviewer) |
Logistic regression, numerical optimization, impact of loss function on regression solution | lectures/04-05-notes.ipynb (GitHub preview) or (nbviewer) |
Feature selection: wrapper methods, forward stagewise, L1 regularization, LARS+LASSO | lectures/06-regression_var_selection_lasso.ipynb (GitHub preview) or (nbviewer) |
Decision trees: building, pruning, Random Forests | lectures/07_nodes_dt.ipynb (GitHub preview) or (nbviewer) |
Boosting classifiers: AdaBoost, gradient boosting, XGBoost, Viola-Jones face detector | lectures/09_adabost.ipynb (GitHub preview) or (nbviewer) |
Neural Networks and SVM | lectures/10_neuralnets_kernels_svm.ipynb (GitHub preview) or (nbviewer) |
Unsupervised learning: k-means, Self-Organizing Maps, EM | lectures/11_kMeans_SOM.ipynb (GitHub preview) or (nbviewer) |
Probabilistic graphical models: intuitions about the Kalman filter and HMMs | lectures/14_pgm.ipynb (GitHub preview) or (nbviewer) |
Review | Slides: lectures/15-review.pdf |