Regularization: dealing with infinitely many solutions
Ridge regression --> adding curvature
Bias, variance, and irreducible error
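The "adding curvature" point can be made concrete with a tiny sketch. This is a toy scalar-feature example (hypothetical data, `ridge_scalar` is a name made up here): the penalty term guarantees a unique minimizer even when ordinary least squares would not have one.

```python
# Ridge regression with a single scalar feature and no intercept:
#   w = (sum_i x_i^2 + lam)^(-1) * sum_i x_i y_i
# The lam > 0 term adds curvature to the objective, so the denominator is
# never zero and the solution is always unique.
def ridge_scalar(xs, ys, lam):
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]              # exactly y = 2x
print(ridge_scalar(xs, ys, 0.0))  # 2.0: ordinary least squares
print(ridge_scalar(xs, ys, 1.0))  # shrunk toward 0: 28/15
```

Larger `lam` shrinks the weight further toward zero, trading a little bias for lower variance.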
Lecture 6: Cross validation
Importance of validation
Leave-one-out validation
K-fold validation
Choosing hyperparameters
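A minimal sketch of the fold-splitting behind K-fold validation (the helper name `k_fold_indices` is made up for illustration): each fold serves once as the validation set while the rest is used for training.

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds (sizes differ by at most 1)."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

# Each fold is held out once; the remaining indices form the training set.
for val in k_fold_indices(6, 3):
    train = [i for i in range(6) if i not in val]
    print(val, train)
```

Leave-one-out validation is the special case `k = n`, where every fold is a single example.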
Lecture 7: LASSO
Benefits of L1 Regularization
Coordinate Descent Algorithm
Subgradients
Norms and Convexity
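The key subroutine in coordinate descent for LASSO is the soft-thresholding operator, which solves each one-dimensional subproblem in closed form. A minimal sketch (function name is illustrative):

```python
def soft_threshold(z, lam):
    """Proximal operator of lam * |w|: shrinks z toward 0 and clips it to
    exactly 0 when |z| <= lam -- the source of LASSO's sparse solutions."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

print(soft_threshold(3.0, 1.0))   # 2.0: shrunk by lam
print(soft_threshold(0.5, 1.0))   # 0.0: zeroed out entirely
```

Because the L1 penalty is non-differentiable at 0, this update comes from the subgradient condition rather than an ordinary gradient.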
Lecture 8: Logistic Regression
Logistic Regression
Classification
Introduction to Optimization
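A sketch of the two pieces behind binary logistic regression with labels in {-1, +1}: the sigmoid link and the gradient of the logistic loss for a scalar weight (function names are made up here; a real model would be vector-valued).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def log_loss_grad(w, x, y):
    """Gradient of the logistic loss log(1 + exp(-y*w*x)) with respect to w,
    for a single example (x, y) with y in {-1, +1}."""
    return -y * x * sigmoid(-y * w * x)

print(sigmoid(0.0))                 # 0.5: the model is indifferent at score 0
print(log_loss_grad(0.0, 1.0, 1))   # -0.5: a descent step would increase w
```

The loss is convex and smooth, which is what makes gradient-based optimization the natural fit here.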
Lecture 9: Gradient Descent
Gradient Descent
Stochastic Gradient Descent
Lecture 10: Perceptrons & Support Vector Machines
Perceptrons training algorithm
Linear separability
Kernel Trick: separation by moving to higher dimensional space
Support Vector Machines (SVM)
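The perceptron training algorithm fits in a few lines; this sketch uses a 2-D linearly separable toy set with labels in {-1, +1} (data and names are illustrative). The mistake-driven update w <- w + y*x repeats until every point is classified correctly.

```python
# Perceptron training: update only on misclassified points; on linearly
# separable data the loop terminates with a separating hyperplane.
def perceptron(points, labels, epochs=100):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        mistakes = 0
        for (x1, x2), y in zip(points, labels):
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:   # wrong side (or on it)
                w[0] += y * x1
                w[1] += y * x2
                b += y
                mistakes += 1
        if mistakes == 0:
            break
    return w, b

pts = [(2.0, 1.0), (3.0, 2.0), (-2.0, -1.0), (-3.0, -2.0)]
ys = [1, 1, -1, -1]
w, b = perceptron(pts, ys)
assert all(y * (w[0] * p[0] + w[1] * p[1] + b) > 0 for p, y in zip(pts, ys))
```

The perceptron returns *some* separating hyperplane; the SVM refines this by choosing the one with maximum margin.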
Lecture 11: SVM and Nearest Neighbors
SVM as an optimization problem
SVM is a quadratic optimization problem
K-nearest Neighbors
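K-nearest neighbors needs no training at all; a minimal sketch with Euclidean distance in 2-D and a hypothetical toy training set:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify query by majority vote among the k nearest training points
    (squared Euclidean distance; train is a list of ((x, y), label) pairs)."""
    dist = lambda item: (item[0][0] - query[0]) ** 2 + (item[0][1] - query[1]) ** 2
    nearest = sorted(train, key=dist)[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), 'a'), ((0, 1), 'a'), ((1, 0), 'a'),
         ((5, 5), 'b'), ((5, 6), 'b'), ((6, 5), 'b')]
print(knn_predict(train, (0.2, 0.2)))  # 'a'
print(knn_predict(train, (5.5, 5.5)))  # 'b'
```

Small `k` gives a low-bias, high-variance decision boundary; large `k` smooths it out, which ties this back to the bias-variance discussion.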
Lecture 12: Kernel Trick, Bootstrap, K-means
More in-depth on Kernel trick
Common kernels
Kernelized ridge regression
Random Feature Trick
Building confidence intervals with Bootstrap
K-means (unsupervised learning)
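Lloyd's algorithm for k-means alternates between two steps; this 1-D sketch (toy data, illustrative function name) shows both:

```python
# K-means via Lloyd's algorithm on 1-D data: assign each point to its nearest
# centroid, then recompute each centroid as the mean of its cluster.
def kmeans_1d(xs, centroids, iters=20):
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for x in xs:
            j = min(range(len(centroids)), key=lambda i: (x - centroids[i]) ** 2)
            clusters[j].append(x)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

xs = [1.0, 1.2, 0.8, 9.0, 9.2, 8.8]
print(kmeans_1d(xs, [0.0, 5.0]))  # converges to roughly [1.0, 9.0]
```

This is unsupervised: no labels are used, and the result depends on the initial centroids, which is why k-means is typically run from several random starts.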
Lecture 13: Principal Component Analysis
Low-rank approximations
Frame PCA as a variance-maximizing optimization problem
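In the variance-maximizing framing, the first principal component is the top eigenvector of the mean-centered covariance matrix. A 2-D sketch (for a symmetric 2x2 matrix [[a, b], [b, c]] the top eigenvector's angle is 0.5 * atan2(2b, a - c); the function name is made up here):

```python
import math

def first_pc(points):
    """Direction of maximum variance for 2-D data: the top eigenvector of the
    mean-centered covariance matrix [[a, b], [b, c]]."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    a = sum((p[0] - mx) ** 2 for p in points) / n
    c = sum((p[1] - my) ** 2 for p in points) / n
    b = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    theta = 0.5 * math.atan2(2 * b, a - c)
    return (math.cos(theta), math.sin(theta))

pts = [(-2.0, -2.0), (-1.0, -1.0), (1.0, 1.0), (2.0, 2.0)]
print(first_pc(pts))  # ~ (0.707, 0.707): the data lie along y = x
```

In higher dimensions the same eigenvector computation is usually done via the SVD of the centered data matrix, which is the bridge to the next lecture.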
Lecture 14: Singular Value Decomposition
SVD
Low-rank approximations
Relation to PCA
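A power-iteration sketch of the top singular pair (2x2 case, toy matrix): the top right singular vector of A is the dominant eigenvector of A^T A, and keeping only the top component gives the best rank-1 approximation (Eckart-Young).

```python
import math

def top_singular(A, iters=100):
    """Top singular value and right singular vector of a 2x2 matrix via
    power iteration on A^T A."""
    AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
    v = [1.0, 1.0]
    for _ in range(iters):
        w = [AtA[0][0] * v[0] + AtA[0][1] * v[1],
             AtA[1][0] * v[0] + AtA[1][1] * v[1]]
        norm = math.hypot(w[0], w[1])
        v = [w[0] / norm, w[1] / norm]
    Av = [A[0][0] * v[0] + A[0][1] * v[1], A[1][0] * v[0] + A[1][1] * v[1]]
    return math.hypot(Av[0], Av[1]), v   # sigma_1 = |A v|

sigma, v = top_singular([[3.0, 0.0], [0.0, 1.0]])
print(round(sigma, 6), [round(x, 6) for x in v])  # 3.0 [1.0, 0.0]
```

Applying this to a mean-centered data matrix recovers the first principal component, which is the PCA connection listed above.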
Lecture 15: Mixture Models and Expectation Maximization
Unsupervised Learning
Probabilistic Interpretation of Classification
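One EM iteration for a two-component 1-D Gaussian mixture can be sketched directly; this toy version fixes both variances at 1 and equal mixing weights so only the means are estimated (data and names are illustrative).

```python
import math

def em_step(xs, mu1, mu2):
    """One EM iteration for a 1-D mixture of two unit-variance Gaussians.
    E-step: soft responsibilities r_i = P(component 1 | x_i).
    M-step: each mean becomes the responsibility-weighted average of the data.
    (The Gaussian normalizer cancels in the ratio, so it is omitted.)"""
    def gauss(x, mu):
        return math.exp(-0.5 * (x - mu) ** 2)
    r = [gauss(x, mu1) / (gauss(x, mu1) + gauss(x, mu2)) for x in xs]
    new_mu1 = sum(ri * x for ri, x in zip(r, xs)) / sum(r)
    new_mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / sum(1 - ri for ri in r)
    return new_mu1, new_mu2

xs = [0.0, 0.2, -0.2, 5.0, 5.2, 4.8]
mu1, mu2 = 0.5, 4.5
for _ in range(20):
    mu1, mu2 = em_step(xs, mu1, mu2)
print(mu1, mu2)  # near 0.0 and 5.0
```

Unlike k-means, the assignments here are soft probabilities rather than hard labels; hardening them recovers a k-means-like alternating minimization.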
Lecture 16/17: Neural Networks
Feedforward, convolutional, and recurrent networks
Backpropagation
Autodifferentiation
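The idea behind reverse-mode autodifferentiation fits in a tiny sketch: each operation records its inputs together with a local-gradient rule, and `backward()` applies the chain rule in reverse (this `Var` class is a minimal illustration, not how PyTorch is implemented; the recursion visits each path separately, which is fine for small graphs).

```python
# Minimal reverse-mode autodiff: every Var remembers how it was computed,
# and backward() pushes the upstream gradient to its parents via the chain rule.
class Var:
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        # d(a+b)/da = d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def backward(self, upstream=1.0):
        self.grad += upstream                 # accumulate over all paths
        for parent, local in self.parents:
            parent.backward(upstream * local)

x = Var(3.0)
y = Var(2.0)
z = x * y + x          # dz/dx = y + 1 = 3, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 3.0 3.0
```

Note that `x` appears twice in the expression and its gradient correctly accumulates both path contributions, which is exactly what backpropagation does at scale.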
Lecture 18: Decision Trees, Bagging, and Boosting
Decision Trees
Bagging (bootstrap aggregation)
Random Forests
Boosting
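The bootstrap-aggregation idea can be sketched with the simplest possible "model" (a mean estimator; the name and data are illustrative): fit one model per bootstrap resample, then average their predictions.

```python
import random

def bagged_mean(xs, n_models=200, seed=0):
    """Bagging sketch: each 'model' is trained on a bootstrap resample of the
    data (sampling with replacement) and predictions are aggregated by averaging."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_models):
        sample = [rng.choice(xs) for _ in xs]   # bootstrap resample
        preds.append(sum(sample) / len(sample))
    return sum(preds) / len(preds)

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
print(bagged_mean(xs))  # close to the plain mean, 3.0
```

With a stable estimator like the mean, bagging changes little; its real payoff is with high-variance base learners such as deep decision trees, which is what random forests exploit.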
Homework Topics
HW0: Review
Probability review
Expectation, variance
Linear algebra review
Intro to Python
HW1
Maximum Likelihood Estimation (MLE)
Bias-variance trade-off
Linear Regression
Ridge Regression
Test error and training error
HW2
Norms and Convexity
LASSO regularization - Coordinate Descent
Binary Logistic Regression
Gradient Descent & Stochastic Gradient Descent
HW3
Kernel Trick and Kernelized ridge regression
Multivariate Gaussians
K-means
Bootstrap
HW4
Expectation Maximization (Mixture Models)
Alternating Minimization
Low rank approximation
PyTorch and Autodifferentiation
About
Methods for designing systems that learn from data and improve with experience. Supervised learning and predictive modeling: decision trees, rule induction, nearest neighbors, Bayesian methods, neural networks, support vector machines, and model ensembles. Unsupervised learning and clustering.