The Singular Value Decomposition (SVD) of a matrix is a central matrix decomposition method in linear algebra. It can be applied to all matrices, not only to square matrices, and it always exists. It has been referred to as the 'fundamental theorem of linear algebra' (Strang, 1993).
SVD Theorem: Let $A \in \mathbb{R}^{m \times n}$ be a rectangular matrix of rank $r \in [0, \min(m, n)]$. The SVD of $A$ is a decomposition of the form
$$A = U \Sigma V^T$$
with an orthogonal matrix $U \in \mathbb{R}^{m \times m}$ with column vectors $u_i$, $i = 1, \ldots, m$, and an orthogonal matrix $V \in \mathbb{R}^{n \times n}$ with column vectors $v_j$, $j = 1, \ldots, n$. Moreover, $\Sigma$ is an $m \times n$ matrix with $\Sigma_{ii} = \sigma_i \geq 0$ and $\Sigma_{ij} = 0$ for $i \neq j$.
The diagonal entries $\sigma_i$, $i = 1, \ldots, r$, of $\Sigma$ are called singular values, the $u_i$ are called left singular vectors, and the $v_j$ are called right singular vectors. By convention, the singular values are ordered, i.e., $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r \geq 0$.
The geometric intuition is that the SVD performs a basis change via $V^T$, followed by a scaling and an augmentation (or reduction) in dimensionality via the singular value matrix $\Sigma$, and finally a second basis change via $U$. The SVD thus expresses a change of basis in both the domain and the codomain. This is in contrast with the eigen decomposition, which operates within a single vector space, where the same basis change is applied and then undone.
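As a minimal illustration, the following NumPy sketch computes the SVD of a small rectangular matrix and checks that $U$ and $V$ are orthogonal and that the three factors reproduce $A$:

import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])           # the rectangular matrix used later in this post

U, s, Vt = np.linalg.svd(A)                # full SVD: U is 2x2, Vt is 3x3, s holds the singular values
Sigma = np.zeros(A.shape)                  # assemble the 2x3 singular value matrix
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(U @ U.T, np.eye(2)))     # U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3)))   # V is orthogonal
print(np.allclose(U @ Sigma @ Vt, A))      # A = U Sigma V^T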
Construction of SVD
The SVD of a square matrix shares some similarities with the eigen decomposition. The eigen decomposition of an SPD (symmetric, positive definite) matrix $S$ is
$$S = S^T = PDP^T$$
and the SVD is
$$S = U \Sigma V^T$$
If we set $U = P = V$ and $D = \Sigma$, we see that the SVD of an SPD matrix is its eigen decomposition.
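As a quick numerical sketch of this claim, the singular values of the symmetric positive definite matrix $\begin{bmatrix}2 & 1\\ 1 & 2\end{bmatrix}$ (the same matrix used in the NumPy example later in this post) coincide with its eigenvalues:

import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])                 # symmetric positive definite

eigvals, P = np.linalg.eigh(S)             # eigen decomposition S = P D P^T (ascending eigenvalues)
U, s, Vt = np.linalg.svd(S)                # SVD S = U Sigma V^T (descending singular values)

print(np.sort(eigvals), np.sort(s))                  # both are [1. 3.]
print(np.allclose(P @ np.diag(eigvals) @ P.T, S))    # True
print(np.allclose(U @ np.diag(s) @ Vt, S))           # True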
Computing the SVD of $A \in \mathbb{R}^{m \times n}$ is equivalent to finding two sets of orthonormal bases $U = (u_1, \ldots, u_m)$ and $V = (v_1, \ldots, v_n)$ of the codomain $\mathbb{R}^m$ and the domain $\mathbb{R}^n$, respectively. From these ordered bases we construct the matrices $U$ and $V$.
Let's begin by constructing the right singular vectors. The spectral theorem tells us that a symmetric matrix possesses an ONB of eigenvectors, which also means it can be diagonalized. Moreover, from any rectangular matrix $A \in \mathbb{R}^{m \times n}$ we can always construct a symmetric, positive semidefinite matrix $A^T A \in \mathbb{R}^{n \times n}$. Thus we can always diagonalize $A^T A$ and obtain
$$A^T A = PDP^T = P \begin{bmatrix} \lambda_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \lambda_n \end{bmatrix} P^T$$
where $P$ is an orthogonal matrix composed of the orthonormal eigenbasis, and the $\lambda_i \geq 0$ are the eigenvalues of $A^T A$. Let us assume the SVD of $A$ exists. This yields
$$A^T A = (U \Sigma V^T)^T (U \Sigma V^T) = V \Sigma^T U^T U \Sigma V^T$$
where $U$ and $V$ are orthogonal matrices. Therefore, with $U^T U = I$, we obtain
$$A^T A = V \Sigma^T \Sigma V^T$$
Comparing the above two expressions for $A^T A$, we identify
$$V^T = P^T, \qquad \sigma_i^2 = \lambda_i$$
Therefore, the eigenvectors of $A^T A$ that compose $P$ are the right singular vectors $V$ of $A$, and the eigenvalues of $A^T A$ are the squared singular values of $A$. To obtain the left singular vectors $U$, we follow a similar procedure, starting from the symmetric matrix $AA^T \in \mathbb{R}^{m \times m}$. The SVD of $A$ yields
$$AA^T = (U \Sigma V^T)(U \Sigma V^T)^T = U \Sigma V^T V \Sigma^T U^T = U \Sigma \Sigma^T U^T$$
The orthonormal eigenvectors of $AA^T$ are the left singular vectors $U$ and form an orthonormal basis of the codomain $\mathbb{R}^m$ of the SVD. Since $AA^T$ and $A^T A$ have the same nonzero eigenvalues, the nonzero entries of $\Sigma$ have to be the same in both cases.
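These relations are easy to check numerically. A small sketch, using the rectangular matrix from the example below:

import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])

s = np.linalg.svd(A, compute_uv=False)     # singular values of A
lam_AtA = np.linalg.eigvalsh(A.T @ A)      # eigenvalues of A^T A (three values)
lam_AAt = np.linalg.eigvalsh(A @ A.T)      # eigenvalues of A A^T (two values)

print(s**2)                                # squared singular values: 6 and 1
print(np.sort(lam_AtA)[::-1])              # approximately 6, 1, 0
print(np.sort(lam_AAt)[::-1])              # 6 and 1: the same nonzero eigenvalues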
We can connect $U$ and $V$ in the following way. The images of the $v_i$ under $A$ have to be orthogonal: the inner product of $Av_i$ and $Av_j$ must be 0 for $i \neq j$. For any two orthogonal eigenvectors $v_i, v_j$, $i \neq j$, it holds that
$$(Av_i)^T (Av_j) = v_i^T (A^T A) v_j = v_i^T (\lambda_j v_j) = \lambda_j v_i^T v_j = 0$$
For the case $m \geq r$, $\{Av_1, Av_2, \ldots, Av_r\}$ is a basis of an $r$-dimensional subspace of $\mathbb{R}^m$. Since we need left singular vectors that are orthonormal, we normalize the images of the right singular vectors $Av_i$ and obtain
$$u_i = \frac{Av_i}{\lVert Av_i \rVert} = \frac{1}{\sqrt{\lambda_i}} Av_i = \frac{1}{\sigma_i} Av_i$$
Therefore, the eigenvectors of $A^T A$, which we know are the right singular vectors $v_i$, and their normalized images under $A$, the left singular vectors $u_i$, form two self-consistent ONBs that are connected through the singular value matrix $\Sigma$.
Rearranging the last equation gives the singular value equation
$$Av_i = \sigma_i u_i, \quad i = 1, \ldots, r.$$
Collecting the $v_i$ as the columns of $V$ and the $u_i$ as the columns of $U$, this reads
$$AV = U\Sigma,$$
and right-multiplying by $V^T$ gives
$$A = U \Sigma V^T,$$
which is the SVD of $A$.
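The construction above translates directly into NumPy. The following sketch builds $V$ and the singular values from the eigen decomposition of $A^T A$, obtains the left singular vectors as normalized images $u_i = Av_i / \sigma_i$, and checks the singular value equation:

import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])

lam, P = np.linalg.eigh(A.T @ A)        # eigen decomposition of A^T A
idx = np.argsort(lam)[::-1]             # sort eigenvalues in descending order
lam, P = lam[idx], P[:, idx]

r = int(np.sum(lam > 1e-10))            # rank r = number of nonzero eigenvalues
sigma = np.sqrt(lam[:r])                # singular values sigma_i = sqrt(lambda_i)
V_r = P[:, :r]                          # right singular vectors v_1, ..., v_r
U_r = (A @ V_r) / sigma                 # left singular vectors u_i = A v_i / sigma_i

print(np.allclose(A @ V_r, U_r * sigma))               # A v_i = sigma_i u_i
print(np.allclose(U_r @ np.diag(sigma) @ V_r.T, A))    # A = U Sigma V^T (reduced SVD)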
Calculate Singular-Value Decomposition
Let us find the SVD of
$$A = \begin{bmatrix} 1 & 0 & 1 \\ -2 & 1 & 0 \end{bmatrix}$$
with NumPy (a sketch of the code is given after the outputs below). First, for reference, here is the output for the symmetric positive definite matrix $\begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$, whose SVD coincides with its eigen decomposition:

Original matrix A (2x2):
[[2 1]
 [1 2]]
Left singular vectors (2x2):
[[-0.70710678 -0.70710678]
 [-0.70710678  0.70710678]]
Singular values: [3. 1.]
Right singular vectors (2x2):
[[-0.70710678 -0.70710678]
 [-0.70710678  0.70710678]]
Reconstructed matrix:
[[2. 1.]
 [1. 2.]]
A^T.A:
[[5 4]
 [4 5]]
Eigenvalues of A^T.A: [9. 1.]
Eigenvectors of A^T.A:
[[ 0.70710678 -0.70710678]
 [ 0.70710678  0.70710678]]
A.A^T:
[[5 4]
 [4 5]]
Eigenvalues of A.A^T: [9. 1.]
Eigenvectors of A.A^T:
[[ 0.70710678 -0.70710678]
 [ 0.70710678  0.70710678]]
Reconstructed matrix:
[[2. 1.]
 [1. 2.]]

Note that the eigenvalues 9 and 1 of A^T.A are the squares of the singular values 3 and 1, and the eigenvectors coincide (up to sign) with the singular vectors, as derived above.
Next, the output for the $2 \times 3$ matrix $A$ defined above:

Original matrix A (2x3):
[[ 1  0  1]
 [-2  1  0]]
Left singular vectors (2x2):
[[ 0.4472136   0.89442719]
 [-0.89442719  0.4472136 ]]
Singular values: [2.44948974 1.        ]
Right singular vectors (3x3):
[[ 9.12870929e-01 -3.65148372e-01  1.82574186e-01]
 [-4.44089210e-16  4.47213595e-01  8.94427191e-01]
 [-4.08248290e-01 -8.16496581e-01  4.08248290e-01]]
Reconstructed matrix using SVD:
[[ 1.  0.  1.]
 [-2.  1.  0.]]
Finally, the output for another rectangular matrix, $A = \begin{bmatrix} 3 & 2 & 2 \\ 2 & 3 & -2 \end{bmatrix}$:

Original matrix A (2x3):
[[ 3  2  2]
 [ 2  3 -2]]
Left singular vectors (2x2):
[[ 0.70710678 -0.70710678]
 [ 0.70710678  0.70710678]]
Singular values: [5. 3.]
Right singular vectors (3x3):
[[ 7.07106781e-01  7.07106781e-01  3.88578059e-16]
 [-2.35702260e-01  2.35702260e-01 -9.42809042e-01]
 [-6.66666667e-01  6.66666667e-01  3.33333333e-01]]
A^T.A:
[[13 12  2]
 [12 13 -2]
 [ 2 -2  8]]
Eigenvalues of A^T.A: [2.5000000e+01 9.0000000e+00 5.0324328e-15]
Eigenvectors of A^T.A:
[[-7.07106781e-01  2.35702260e-01 -6.66666667e-01]
 [-7.07106781e-01 -2.35702260e-01  6.66666667e-01]
 [-4.59701721e-17  9.42809042e-01  3.33333333e-01]]
A.A^T:
[[17  8]
 [ 8 17]]
Eigenvalues of A.A^T: [25.  9.]
Eigenvectors of A.A^T:
[[ 0.70710678 -0.70710678]
 [ 0.70710678  0.70710678]]
Reconstructed matrix:
[[-3. -2. -2.]
 [-2. -3.  2.]]

Note that this last reconstruction is the negative of $A$: when $U$ and $V$ are assembled from independently computed eigenvectors of $AA^T$ and $A^T A$, each eigenvector is only determined up to sign, so the signs must be matched (e.g., via $u_i = Av_i/\sigma_i$) for $U \Sigma V^T$ to reproduce $A$ exactly.
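The outputs above can be produced with a short script along the following lines. This is only a sketch (the exact print formatting differs, and the helper analyze is purely illustrative), using np.linalg.svd for the SVD and np.linalg.eigvalsh for the eigenvalues of the symmetric matrices A^T.A and A.A^T:

import numpy as np

def analyze(A):
    A = np.asarray(A, dtype=float)
    print("Original matrix A", A.shape)
    print(A)

    U, s, Vt = np.linalg.svd(A)                      # SVD of A
    print("Left singular vectors\n", U)
    print("Singular values", s)
    print("Right singular vectors\n", Vt)

    Sigma = np.zeros(A.shape)                        # rebuild the m x n matrix Sigma
    Sigma[:len(s), :len(s)] = np.diag(s)
    print("Reconstructed matrix using SVD\n", U @ Sigma @ Vt)

    # eigenvalues of A^T A and A A^T (descending), for comparison with the singular values
    print("Eigenvalues of A^T.A", np.linalg.eigvalsh(A.T @ A)[::-1])
    print("Eigenvalues of A.A^T", np.linalg.eigvalsh(A @ A.T)[::-1])

analyze([[2, 1], [1, 2]])
analyze([[1, 0, 1], [-2, 1, 0]])
analyze([[3, 2, 2], [2, 3, -2]])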
Comparison of the Eigen Decomposition and the SVD
- The SVD always exists for any matrix $A \in \mathbb{R}^{m \times n}$. The eigen decomposition is only defined for square matrices $A \in \mathbb{R}^{n \times n}$ and only exists if we can find a basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.
- The vectors in the eigen decomposition matrix $P$ are not necessarily orthogonal, i.e., the change of basis is not a simple rotation and scaling. On the other hand, the vectors in the matrices $U$ and $V$ of the SVD are orthonormal, so they do represent rotations.
- Both the eigen decomposition and the SVD are compositions of three linear mappings:
1. Change of basis in the domain
2. Independent scaling of each new basis vector and mapping from domain to codomain
3. Change of basis in the codomain
A key difference between the eigen decomposition and the SVD is that in the SVD, domain and codomain can be vector spaces of different dimensions.
- In the SVD, the left and right singular vector matrices $U$ and $V$ are generally not inverses of each other (they perform basis changes in different vector spaces). In the eigen decomposition, the basis change matrices $P$ and $P^{-1}$ are inverses of each other.
- In the SVD, the entries in the diagonal matrix $\Sigma$ are all real and nonnegative, which is not generally true for the diagonal matrix in the eigen decomposition (see the small example after this list).
- The SVD and the eigen decomposition are closely related through their projections: the left singular vectors of $A$ are eigenvectors of $AA^T$, the right singular vectors of $A$ are eigenvectors of $A^T A$, and the nonzero singular values of $A$ are the square roots of the nonzero eigenvalues of both $AA^T$ and $A^T A$.
- For symmetric matrices A∈Rn×n, the eigenvalue decomposition and the SVD are one and the same, which follows from the spectral theorem.
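To illustrate the point about the signs made above: the matrix $\begin{bmatrix}-2 & 0\\ 0 & 1\end{bmatrix}$ has a negative eigenvalue, yet its singular values are nonnegative. A small sketch:

import numpy as np

B = np.array([[-2.0, 0.0],
              [0.0, 1.0]])

print(np.linalg.eigvals(B))                  # one eigenvalue is negative: -2 and 1
print(np.linalg.svd(B, compute_uv=False))    # singular values are nonnegative: [2. 1.]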
As a hand-worked example, let us compute the SVD of the $3 \times 2$ matrix
$$A = \begin{bmatrix} 1 & -1 \\ 0 & 1 \\ 1 & 0 \end{bmatrix}.$$

Compute the right singular vectors as the eigenbasis of $A^T A$:
$$A^T A = \begin{bmatrix} 1 & 0 & 1 \\ -1 & 1 & 0 \end{bmatrix} \begin{bmatrix} 1 & -1 \\ 0 & 1 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix}.$$
The eigenvalues are $\lambda_1 = 3$ and $\lambda_2 = 1$, so the singular values are $\sigma_1 = \sqrt{3}$ and $\sigma_2 = 1$, and the corresponding normalized eigenvectors are
$$v_1 = \begin{bmatrix} -\tfrac{1}{\sqrt{2}} \\ \tfrac{1}{\sqrt{2}} \end{bmatrix}, \qquad v_2 = \begin{bmatrix} \tfrac{1}{\sqrt{2}} \\ \tfrac{1}{\sqrt{2}} \end{bmatrix}.$$

Compute the left singular vectors as the eigenbasis of $AA^T$, or equivalently as the normalized images $u_i = \frac{1}{\sigma_i} Av_i$ of the right singular vectors, completed to an ONB of $\mathbb{R}^3$:
$$u_1 = \frac{1}{\sqrt{3}} A v_1 = \begin{bmatrix} -\tfrac{2}{\sqrt{6}} \\ \tfrac{1}{\sqrt{6}} \\ -\tfrac{1}{\sqrt{6}} \end{bmatrix}, \qquad u_2 = A v_2 = \begin{bmatrix} 0 \\ \tfrac{1}{\sqrt{2}} \\ \tfrac{1}{\sqrt{2}} \end{bmatrix}, \qquad u_3 = \begin{bmatrix} \tfrac{1}{\sqrt{3}} \\ \tfrac{1}{\sqrt{3}} \\ -\tfrac{1}{\sqrt{3}} \end{bmatrix},$$
where $u_3$ is chosen orthonormal to $u_1$ and $u_2$; it spans the kernel of $A^T$ and corresponds to the zero eigenvalue of $AA^T$.

So the SVD is $A = U \Sigma V^T$ with
$$U = \begin{bmatrix} u_1 & u_2 & u_3 \end{bmatrix}, \qquad \Sigma = \begin{bmatrix} \sqrt{3} & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}, \qquad V = \begin{bmatrix} v_1 & v_2 \end{bmatrix}.$$
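This hand computation can be verified numerically (a sketch; NumPy may return the singular vectors with flipped signs, which is still a valid SVD):

import numpy as np

A = np.array([[1.0, -1.0],
              [0.0, 1.0],
              [1.0, 0.0]])

U, s, Vt = np.linalg.svd(A)
print(s)                                   # approximately [1.732 1.], i.e. sqrt(3) and 1
Sigma = np.zeros(A.shape)                  # 3x2 singular value matrix
Sigma[:2, :2] = np.diag(s)
print(np.allclose(U @ Sigma @ Vt, A))      # True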