Linear Regression
This note is adapted from one I used to illustrate applications of linear algebra when I lectured Linear Algebra at Purdue University. It has been lightly modified to be more or less self-contained.
Starting from least-squares solutions, this note gives an introductory exploration of (linear) regression.
import numpy as np
import sklearn.linear_model
import matplotlib.pyplot as plt
from IPython.display import set_matplotlib_formats

# Use a larger default figure size and render inline plots as PNG/PDF.
plt.rcParams["figure.figsize"] = (8, 6)
set_matplotlib_formats('png', 'pdf')
Least-squares solution
Let \(A\) be an \(m \times n\) matrix, and let \(B\) be a vector in \(\mathbb{R}^m\). A least-squares solution to a linear system \(Ax = B\) is a vector \(\hat{x}\) such that \(|A \hat{x} - B| \le |A x - B|\) for all \(x\) in \(\mathbb{R}^n\). Here, \(|x|\) denotes the length (norm) of the vector \(x\). If the system \(Ax = B\) is consistent, then a least-squares solution is just an ordinary solution.
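As a quick illustration (this small example is mine, with made-up data), NumPy's np.linalg.lstsq computes a least-squares solution directly, even when the system is inconsistent. A least-squares solution also satisfies the normal equations \(A^T A \hat{x} = A^T B\), which we can check numerically:

```python
import numpy as np

# An inconsistent 3 x 2 system: no exact solution exists.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
B = np.array([0.0, 1.0, 1.0])

# x_hat minimizes |A x - B| over all x in R^2.
x_hat, residual, rank, sv = np.linalg.lstsq(A, B, rcond=None)

# A least-squares solution satisfies the normal equations A^T A x = A^T B.
assert np.allclose(A.T @ A @ x_hat, A.T @ B)
print(x_hat)  # → [0.16666667 0.5]
```

Because \(A\) here has full column rank, \(A^T A\) is invertible and the least-squares solution is unique.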