
wolfium t1_j2xjeyb wrote

Sounds very related to https://en.wikipedia.org/wiki/Ridge_regression (adding a constant times the identity matrix)


comradeswitch t1_j33qmla wrote

Yes, ridge regression and the more general Tikhonov regularization can be obtained by setting up an optimization problem:

min_X 1/2 ||AX - Y||^2 + c/2 ||X||^2

Taking the gradient with respect to X, setting it to zero, and rearranging gives (A^T A + c I)X = A^T Y.
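For concreteness, here is a minimal NumPy sketch of solving that regularized normal equation directly (the dimensions, variable names, and value of c are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data, dimensions chosen arbitrarily for illustration.
A = rng.standard_normal((100, 5))
Y = rng.standard_normal((100, 3))
c = 0.1  # regularization strength

# Solve the regularized normal equations (A^T A + c I) X = A^T Y.
X = np.linalg.solve(A.T @ A + c * np.eye(A.shape[1]), A.T @ Y)

# Sanity check: the gradient of 1/2 ||AX - Y||^2 + c/2 ||X||^2 vanishes at X.
grad = A.T @ (A @ X - Y) + c * X
print(np.allclose(grad, 0))  # True
```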

A matrix is positive semidefinite (psd) iff it can be written as X = B^T B for some matrix B, which is equivalent to all of its eigenvalues being nonnegative; in particular, A^T A is psd. And if Xv = lambda v, then (X + cI)v = (lambda + c)v, so v is still an eigenvector but c has been added to the eigenvalue. For a psd matrix the smallest eigenvalue is at least 0, so for positive c the matrix A^T A + cI is strictly positive definite and therefore invertible.
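The eigenvalue shift is easy to check numerically. The sketch below (sizes arbitrary) builds a rank-deficient Gram matrix, so A^T A by itself is singular, and shows that adding cI moves every eigenvalue up by c and makes the matrix strictly positive definite:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6))  # A^T A is 6x6 with rank <= 4, hence singular
G = A.T @ A
c = 0.5

evals = np.linalg.eigvalsh(G)
evals_shifted = np.linalg.eigvalsh(G + c * np.eye(G.shape[0]))

print(np.allclose(evals_shifted, evals + c))  # True: every eigenvalue shifts up by c
print(evals.min() >= -1e-10, evals_shifted.min() > 0)  # psd becomes strictly pd
```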

It may also be approached from a probabilistic modelling standpoint, treating the regularization as a normal prior on the solution with zero mean and precision cI.
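As a rough illustration of that view (assuming unit noise precision in the likelihood, which matches the squared-error term above), the MAP estimate under a zero-mean Gaussian prior with precision cI coincides with the solution of the regularized normal equation. A small sketch using a generic optimizer:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 4))
y = rng.standard_normal(50)
c = 0.3  # prior precision (times the identity)

# Negative log posterior (up to constants): Gaussian likelihood with unit noise
# precision plus a zero-mean Gaussian prior on x with precision c*I.
def neg_log_posterior(x):
    return 0.5 * np.sum((A @ x - y) ** 2) + 0.5 * c * np.sum(x ** 2)

x_map = minimize(neg_log_posterior, np.zeros(A.shape[1])).x
x_ridge = np.linalg.solve(A.T @ A + c * np.eye(A.shape[1]), A.T @ y)

print(np.allclose(x_map, x_ridge, atol=1e-4))  # MAP estimate matches the ridge solution
```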
