Machine Learning, Dynamical Systems and Control

All of machine learning revolves around optimization. This includes regression and model selection frameworks that aim to provide parsimonious and interpretable models for data. Curve fitting is the most basic of regression techniques, with polynomial and exponential fitting leading to solutions obtained by solving linear systems of equations. These techniques can be generalized to the fitting of nonlinear models. Importantly, regression is typically applied to under- or over-determined systems, thus requiring cross-validation strategies to evaluate the results.
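
For instance, the claim that exponential fitting leads to a linear system can be made concrete with a short sketch: taking logarithms of y ≈ a*exp(b*x) gives log(y) = log(a) + b*x, which is linear in the unknowns. The synthetic data and NumPy calls below are illustrative assumptions, not an example from the text.

import numpy as np

# Synthetic data from an assumed exponential trend with multiplicative noise
rng = np.random.default_rng(1)
x = np.linspace(0, 4, 40)
y = 2.0 * np.exp(0.7 * x) * np.exp(0.05 * rng.standard_normal(x.size))

# log(y) = log(a) + b*x is linear in [log(a), b], so ordinary least squares applies
A = np.column_stack([np.ones_like(x), x])
coeffs, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
a, b = np.exp(coeffs[0]), coeffs[1]
print(f"fitted model: y = {a:.3f} * exp({b:.3f} x)")

The same construction with a Vandermonde matrix of powers of x handles polynomial fitting.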

 

Section 4.1: Classic Curve Fitting and Least-Squares Regression

Section 4.2: Nonlinear Regression and Gradient Descent Algorithm

Section 4.3: Over- and Under-determined Systems

Section 4.4: Optimization for Regressions

Section 4.5: The Pareto Front and Parsimonious Models

Section 4.6: Model Selection and Cross-Validation

Section 4.7: Model Selection and Information Criteria

Supplementary Videos

 

This video highlights some of the basic ideas of least-squares regression, and especially polynomial fitting to data.
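
A small sketch of the kind of polynomial least-squares fit the video discusses (the quadratic data and chosen degrees are made-up assumptions, not taken from the video):

import numpy as np

# Noisy samples of an assumed quadratic trend
rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 30)
y = 1.0 - 2.0 * x + 3.0 * x**2 + 0.1 * rng.standard_normal(x.size)

# Least-squares polynomial fits of increasing degree
for degree in (1, 2, 5):
    coeffs = np.polyfit(x, y, degree)                  # highest-degree coefficient first
    residual = np.linalg.norm(np.polyval(coeffs, x) - y)
    print(f"degree {degree}: residual = {residual:.4f}")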

 

This video highlights how to develop and implement a gradient descent algorithm.
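
A minimal gradient descent sketch in the same spirit (the line-fit loss, step size, and synthetic data are illustrative assumptions): the parameters are updated by repeatedly stepping opposite the gradient of the mean squared error.

import numpy as np

# Synthetic data for a line fit y ≈ m*x + c
rng = np.random.default_rng(3)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 0.5 + 0.05 * rng.standard_normal(x.size)

m, c = 0.0, 0.0    # initial guess
eta = 0.1          # step size (learning rate)

for _ in range(2000):
    r = m * x + c - y                 # residuals of the current fit
    grad_m = 2.0 * np.mean(r * x)     # d/dm of the mean squared error
    grad_c = 2.0 * np.mean(r)         # d/dc of the mean squared error
    m -= eta * grad_m                 # step opposite the gradient
    c -= eta * grad_c

print(f"gradient descent estimate: m = {m:.3f}, c = {c:.3f}")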