SVD is a way to factorize a matrix $A = U\Sigma V^T \in \mathbb{R}^{m \times n}$ into the product of three matrices, where $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ are orthogonal and $\Sigma$ contains the singular values on its main diagonal. Instead of performing the full SVD factorization, we will now investigate how the SVD allows us to represent a matrix $A$ as a sum of simpler (low-rank) matrices $A_i$, which lends itself to a matrix approximation scheme that is cheaper to compute than the full SVD.
We consider a rank-1 matrix $A_i \in \mathbb{R}^{m \times n}$ of the form

$$A_i = u_i v_i^T,$$

which is formed by the outer product of the $i$th orthogonal column vectors $u_i$ of $U$ and $v_i$ of $V$.
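To make this concrete, here is a minimal NumPy sketch showing that the outer product of two vectors always yields a rank-1 matrix (the vectors `u` and `v` are illustrative, not taken from the example further below):

import numpy as np

# illustrative vectors standing in for u_i (length m) and v_i (length n)
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

A1 = np.outer(u, v)                   # the m x n matrix u v^T
print(A1)
print(np.linalg.matrix_rank(A1))      # prints 1: an outer product has rank 1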
A matrix $A \in \mathbb{R}^{m \times n}$ of rank $r$ can be written as a sum of rank-1 matrices $A_i$, so that

$$A = \sum_{i=1}^{r} \sigma_i u_i v_i^T = \sum_{i=1}^{r} \sigma_i A_i,$$
where the outer-product matrices $A_i$ are weighted by the $i$th singular value $\sigma_i$. We summed up the $r$ individual rank-1 matrices to obtain the rank-$r$ matrix $A$. If the sum does not run over all matrices $A_i$, $i = 1, \ldots, r$, but only up to an intermediate value $k < r$, we obtain a rank-$k$ approximation
$$\hat{A}(k) = \sum_{i=1}^{k} \sigma_i u_i v_i^T = \sum_{i=1}^{k} \sigma_i A_i$$

of $A$ with $\mathrm{rk}(\hat{A}(k)) = k$.
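As a minimal sketch of this truncation (the matrix and the chosen `k` here are illustrative), the rank-$k$ approximation can be assembled directly from the first $k$ singular triplets:

import numpy as np
from scipy.linalg import svd

rng = np.random.default_rng(0)
A = rng.random((5, 4))          # illustrative 5 x 4 matrix (generically rank 4)
k = 2                           # target rank of the approximation

U, S, Vt = svd(A)               # scipy returns V^T, not V
# keep only the first k singular triplets: sum of sigma_i u_i v_i^T for i <= k
A_hat = U[:, :k].dot(np.diag(S[:k])).dot(Vt[:k, :])
print(np.linalg.matrix_rank(A_hat))   # prints 2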
We can use the SVD to reduce a rank-$r$ matrix $A$ to a rank-$k$ matrix $\hat{A}$ in a principled, optimal manner: by the Eckart-Young theorem, the truncated SVD gives the best rank-$k$ approximation of $A$ in both the spectral and Frobenius norms. We can interpret the approximation of $A$ by a rank-$k$ matrix as a form of lossy compression. Therefore, the low-rank approximation of a matrix appears in many machine learning applications, e.g., image processing, noise filtering, and regularization of ill-posed problems. Furthermore, it plays a key role in dimensionality reduction and principal component analysis (PCA).
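To see why this is compression: storing the truncated factors of a rank-$k$ approximation takes only $k(m + n + 1)$ numbers instead of the $mn$ entries of $A$. A quick sketch with made-up sizes:

m, n, k = 1000, 800, 20                # illustrative sizes, not from the text

full_storage = m * n                   # entries needed to store A itself
lowrank_storage = k * (m + n + 1)      # k columns of U, k rows of V^T, k singular values
print(full_storage, lowrank_storage)   # 800000 vs 36020
print(full_storage / lowrank_storage)  # roughly a 22x saving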
Example:
import numpy as np
from scipy.linalg import svd

# define a matrix
A = np.array([[2, 1], [1, 2]])
print("Original matrix A 2x2")
print(A)

# factorize; scipy's svd returns U, the singular values, and V^T (not V)
U, S, Vt = svd(A)
print("Left singular vectors 2x2")
print(U)
print("Singular values (length 2)")
print(S)
print("Right singular vectors (transposed) 2x2")
print(Vt)

# create m x n Sigma matrix of zeros
Sigma = np.zeros((A.shape[0], A.shape[1]))
# populate Sigma's main diagonal with the singular values
Sigma[:len(S), :len(S)] = np.diag(S)

# reconstruct A = U Sigma V^T
B = U.dot(Sigma.dot(Vt))
print("Reconstructed matrix")
print(B.round())

# the ith left singular vector u_i is the ith COLUMN of U;
# the ith right singular vector v_i is the ith ROW of V^T
u0, v0 = U[:, 0], Vt[0]
u1, v1 = U[:, 1], Vt[1]

print("Rank-1 approximation using one rank-1 matrix")
print(S[0] * np.outer(u0, v0))
print("Rank-2 approximation using two rank-1 matrices")
print(S[0] * np.outer(u0, v0) + S[1] * np.outer(u1, v1))
Output:

Original matrix A 2x2
[[2 1]
 [1 2]]
Left singular vectors 2x2
[[-0.70710678 -0.70710678]
 [-0.70710678  0.70710678]]
Singular values (length 2)
[3. 1.]
Right singular vectors (transposed) 2x2
[[-0.70710678 -0.70710678]
 [-0.70710678  0.70710678]]
Reconstructed matrix
[[2. 1.]
 [1. 2.]]
Rank-1 approximation using one rank-1 matrix
[[1.5 1.5]
 [1.5 1.5]]
Rank-2 approximation using two rank-1 matrices
[[2. 1.]
 [1. 2.]]
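As a quick sanity check, reusing the variables from the example above: the spectral-norm error of the rank-1 reconstruction equals the discarded singular value, exactly as the Eckart-Young theorem predicts.

A1 = S[0] * np.outer(u0, v0)     # the rank-1 approximation from above
err = np.linalg.norm(A - A1, 2)  # spectral-norm error of the approximation
print(err)                       # approximately 1.0, i.e. the dropped sigma_2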