
2.14 Singular Value Decomposition

The Singular Value Decomposition (SVD) of a matrix is a central matrix decomposition method in linear algebra. It can be applied to all matrices, not only to square matrices, and it always exists. It has been referred to as the 'fundamental theorem of linear algebra' (Strang, 1993).

SVD Theorem: Let $A \in \mathbb{R}^{m \times n}$ be a rectangular matrix of rank $r \in [0,\min(m,n)]$. The SVD of $A$ is a decomposition of the form

$A= U \Sigma V^T $


with an orthogonal matrix $U \in \mathbb{R}^{m \times m}$ with column vectors $u_i, i=1,\ldots,m$, and an orthogonal matrix $V \in \mathbb{R}^{n \times n}$ with column vectors $v_j, j=1,\ldots,n$. Moreover, $\Sigma$ is an $m \times n$ matrix with $\Sigma_{ii} = \sigma_i \ge 0$ and $\Sigma_{ij}=0$ for $i \ne j$.

The diagonal entries $\sigma_i, i=1,\ldots,r$, of $\Sigma$ are called singular values, the $u_i$ are called left singular vectors, and the $v_j$ are called right singular vectors. By convention the singular values are ordered, i.e., $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r \ge 0$.

The geometric intuition is that the SVD performs a basis change via $V^T$, followed by a scaling and a change (increase or reduction) of dimensionality via the singular value matrix $\Sigma$. Finally, it performs a second basis change via $U$. The SVD thus expresses a change of basis in both the domain and the codomain. This is in contrast to the eigendecomposition, which operates within the same vector space, where the same basis change is applied and then undone.
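As an illustration (a sketch added here, not part of the original notes), the following numpy snippet applies the three maps $V^T$, $\Sigma$ and $U$ one after another to a vector and checks that the result agrees with $Ax$; the example matrix is the $2 \times 3$ matrix used later in this section:

import numpy as np
from scipy.linalg import svd

# example matrix (the 2x3 matrix used later in this section)
A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])
x = np.array([1.0, 2.0, 3.0])          # an arbitrary vector in the domain R^3

U, s, Vt = svd(A)                      # scipy returns the singular values as a vector and V already transposed
Sigma = np.zeros(A.shape)              # build the full 2x3 singular value matrix
Sigma[:len(s), :len(s)] = np.diag(s)

y = Vt @ x                             # 1. basis change in the domain
y = Sigma @ y                          # 2. scaling and change of dimension (R^3 -> R^2)
y = U @ y                              # 3. basis change in the codomain
print(np.allclose(y, A @ x))           # True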



Construction of SVD

The SVD of a square matrix shares some similarities with the eigendecomposition. The eigendecomposition of a symmetric positive definite (SPD) matrix $S$ is

$S=S^T=PDP^T$

and the SVD is

$S=U \Sigma V^T$

If we set $U=P=V$ and $D=\Sigma$, we see that the SVD of an SPD matrix is its eigendecomposition.
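As a quick numerical check (my own sketch, not from the original text), we can compare the eigendecomposition and the SVD of a small SPD matrix; they agree up to the ordering and signs of the vectors:

import numpy as np
from numpy.linalg import eigh
from scipy.linalg import svd

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])             # symmetric positive definite
evals, evecs = eigh(S)                  # eigenvalues in ascending order
U, s, Vt = svd(S)                       # singular values in descending order
print(np.allclose(s, evals[::-1]))      # True: the singular values equal the eigenvalues
# the columns of U (and V) agree with the eigenvectors up to sign and ordering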

Computing the SVD of $A \in \mathbb{R}^{m \times n}$ is equivalent to finding two sets of orthonormal bases $U=(u_1,\ldots,u_m)$ and $V=(v_1,\ldots,v_n)$ of the codomain $\mathbb{R}^m$ and the domain $\mathbb{R}^n$, respectively. From these ordered bases we construct the matrices $U$ and $V$.

Let us begin by constructing the right singular vectors. The spectral theorem tells us that a symmetric matrix possesses an ONB of eigenvectors, which also means it can be diagonalized. Moreover, from any rectangular matrix $A \in \mathbb{R}^{m \times n}$ we can always construct a symmetric, positive semidefinite matrix $A^TA \in \mathbb{R}^{n \times n}$. Hence we can always diagonalize $A^TA$ and obtain

$A^TA=PDP^T=P\begin{bmatrix}
\lambda_1 & \cdots & 0\\
\vdots & \ddots & \vdots\\
0& \cdots & \lambda_n
\end{bmatrix}P^T$

where $P$ is an orthogonal matrix composed of the orthonormal eigenbasis. The $\lambda_i \ge 0$ are the eigenvalues of $A^TA$. Let us assume the SVD of $A$ exists. This yields

$A^TA=(U\Sigma V^T)^T(U\Sigma V^T)=V\Sigma^TU^TU\Sigma V^T$

where $U$ and $V$ are orthogonal matrices. Therefore, with $U^TU=I$, we obtain

$A^TA=V\Sigma^T\Sigma V^T$

where $\Sigma^T\Sigma$ is an $n \times n$ diagonal matrix with entries $\sigma_i^2$. Comparing this with the eigendecomposition of $A^TA$ above, we see that

$V^T=P^T$

$\sigma_i^2=\lambda_i$

Therefore the eigenvectors of $A^TA$ that compose $P$ are the right singular vectors $V$ of $A$, and the eigenvalues of $A^TA$ are the squared singular values of $\Sigma$. To obtain the left singular vectors $U$, we follow a similar procedure, this time starting from the symmetric matrix $AA^T \in \mathbb{R}^{m \times m}$. The SVD of $A$ yields

$AA^T=(U\Sigma V^T)(U \Sigma V^T)^T=U\Sigma V^T V \Sigma ^T U^T=U\Sigma \Sigma^T U^T$

The orthonormal eigenvectors of $AA^T$ are the left singular vectors $U$ and form an orthonormal basis in the codomain of the SVD. Since $AA^T$ and $A^TA$ have the same nonzero eigenvalues, the nonzero entries of the $\Sigma$ matrices in the two cases have to be the same.

We can connect $U$ and $V$ in the following way. The images of the $v_i$ under $A$ have to be orthogonal: the inner product of $Av_i$ and $Av_j$ must be 0 for $i \ne j$. For any two orthogonal eigenvectors $v_i, v_j$, $i \ne j$, it holds that

$(Av_i)^T(Av_j)=v_i^T(A^TA)v_j=v_i^T(\lambda_jv_j)=\lambda_jv_i^Tv_j=0$

For the case $m \ge r$, it holds that $\{Av_1,Av_2,\ldots,Av_r\}$ is a basis of an $r$-dimensional subspace of $\mathbb{R}^m$. We need left singular vectors that are orthonormal, so we normalize the images of the right singular vectors $Av_i$ and obtain

$u_i=\frac{Av_i}{||Av_i||}=\frac{1}{\sqrt{\lambda_i}}Av_i=\frac{1}{\sigma_i}Av_i$

Therefore the eigenvectors of $A^TA$, which we know are the right singular vectors $v_i$, and their normalized images under $A$, the left singular vectors $u_i$, form two self-consistent ONBs that are connected through the singular value matrix $\Sigma$.

So we obtain the singular value equation

$Av_i=\sigma_i u_i, \quad i=1,\ldots, r$

Collecting these equations in matrix form gives

$AV=U \Sigma$

and right-multiplying by $V^T$ (using $VV^T=I$) yields

$A=U\Sigma V^T$, which is the SVD of $A$.
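The construction above can be mirrored directly in code. The sketch below (my own illustration, assuming for simplicity that $A$ has full row rank so that $r=m$ and no extra left singular vectors are needed) builds $V$, $\Sigma$ and $U$ from the eigendecomposition of $A^TA$ and the normalized images $Av_i/\sigma_i$:

import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])               # rank 2 = m, so U is fully determined by A v_i / sigma_i

lam, P = np.linalg.eigh(A.T @ A)                # eigendecomposition of A^T A (ascending eigenvalues)
idx = lam.argsort()[::-1]                       # reorder to descending, matching the SVD convention
lam, P = lam[idx], P[:, idx]

V = P                                           # right singular vectors
sigma = np.sqrt(np.clip(lam, 0.0, None))        # singular values sigma_i = sqrt(lambda_i)
r = A.shape[0]                                  # rank (assumed to equal m here)

U = (A @ V[:, :r]) / sigma[:r]                  # left singular vectors u_i = A v_i / sigma_i
Sigma = np.zeros(A.shape)
Sigma[:r, :r] = np.diag(sigma[:r])

print(np.allclose(U @ Sigma @ V.T, A))          # True: A = U Sigma V^T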





Calculate Singular-Value Decomposition

Let us find the SVD of

$A=\begin{bmatrix}
1 & 0 & 1\\-2 &1 & 0 \\
\end{bmatrix}$

The SVD requires us to compute the right-singular vectors $v_j$, the singular values $\sigma_k$, and the left-singular vectors $u_i$.
Step 1: Compute the right singular vectors as the eigenbasis of $A^TA$
$A^TA=\begin{bmatrix}
1 & -2 \\ 0 & 1 \\ 1 & 0 \\
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 1\\-2 &1 & 0 \\
\end{bmatrix}=\begin{bmatrix}
5 & -2 & 1\\-2 &1 & 0 \\ 1 & 0 & 1\\
\end{bmatrix}$

We compute the singular values and right singular vectors $v_j$ through the eigenvalue decomposition of $A^TA$, which is given by
$A^TA=\begin{bmatrix}
\frac{5}{\sqrt{30}} & 0 & \frac{ -1}{\sqrt{6}} \\
\frac{-2}{\sqrt{30}} & \frac{1}{\sqrt{5}}& \frac{-2}{\sqrt{6}} \\
\frac{1}{\sqrt{30}} & \frac{2}{\sqrt{5}}& \frac{1}{\sqrt{6}}\\
\end{bmatrix}\begin{bmatrix}6 & 0 & 0 \\ 0 & 1 &0 \\ 0 & 0 &0 \\
\end{bmatrix}\begin{bmatrix}\frac{5}{\sqrt{30}} & \frac{-2}{\sqrt{30}}& \frac{1}{\sqrt{30}} \\
0 & \frac{1}{\sqrt{5}}& \frac{2}{\sqrt{5}} \\
\frac{-1}{\sqrt{6}} & \frac{-2}{\sqrt{6}}& \frac{1}{\sqrt{6}}\\
\end{bmatrix}=PDP^T$

The right singular vectors are 
$V=P=\begin{bmatrix}\frac{5}{\sqrt{30}} & 0 & \frac{ -1}{\sqrt{6}} \\
\frac{-2}{\sqrt{30}} & \frac{1}{\sqrt{5}}& \frac{-2}{\sqrt{6}} \\
\frac{1}{\sqrt{30}} & \frac{2}{\sqrt{5}}& \frac{1}{\sqrt{6}}\\
\end{bmatrix}$

Step 2: Compute the singular value matrix
As the singular values $\sigma_i$ are the square roots of the eigenvalues of $A^TA$, we obtain them straight from $D$.
$\Sigma=\begin{bmatrix}
\sqrt{6} & 0 & 0 \\
0 & 1 & 0
\end{bmatrix}$

Step 3: Compute the left singular vectors by the eigendecomposition of $AA^T$ or by finding the normalized images of the right singular vectors
$u_1=\frac{1}{\sigma_1}Av_1=\frac{1}{\sqrt{6}}\begin{bmatrix}
1 & 0 & 1 \\
-2 & 1 & 0
\end{bmatrix}
\begin{bmatrix}
\frac{5}{\sqrt{30}} \\
\frac{-2}{\sqrt{30}}\\
\frac{1}{\sqrt{30}}
\end{bmatrix}=\begin{bmatrix}
\frac{1}{\sqrt{5}}\\
\frac{-2}{\sqrt{5}}
\end{bmatrix}$

$u_2=\frac{1}{\sigma_2}Av_2=\frac{1}{1}\begin{bmatrix}
1 & 0 & 1 \\
-2 & 1 & 0
\end{bmatrix}
\begin{bmatrix}
0 \\
\frac{1}{\sqrt{5}}\\
\frac{2}{\sqrt{5}}
\end{bmatrix}=\begin{bmatrix}
\frac{2}{\sqrt{5}}\\
\frac{1}{\sqrt{5}}
\end{bmatrix}$
$U=[u_1,u_2]=\frac{1}{\sqrt{5}}\begin{bmatrix}1 & 2\\
-2 & 1
\end{bmatrix}$

The SVD can be calculated by calling the svd() function in Python (here from scipy.linalg). The function takes a matrix and returns the factors U, Σ and V^T. The diagonal matrix Σ is returned as a vector of singular values, and the V matrix is returned in transposed form, i.e., V^T.

Example: Python code
Find the SVD of the matrix $\begin{bmatrix}
2 & 1\\
1 & 2
\end{bmatrix}$

# singular-value decomposition
import numpy as np
from numpy.linalg import eig
from scipy.linalg import svd
# define a matrix
A = np.array([[2,1],[1,2]])
print("Original matrix A 2X2")
print(A)
# factorize
U, S, V = svd(A)   # note: svd returns the matrix V already transposed (V here is V^T)
print("Left Singular Vectors 2x2")
print(U)
print("Singular value matrix Sigma 2x2")
print(S)
print("Right singular vectors 2x2")
print(V)
print("Reconstructed Matrix")
sigma=np.diag(S)
print(U.dot(sigma.dot(V)))

print("A^T.A")
AR=(A.T).dot(A)
print(AR)
evalr,evecr=eig(AR)
print("Eigen values of A^T.A")
print(evalr)
print("Eigen vectors of A^T.A")
print(evecr)
print("A.A^T")
AL=A.dot(A.T)
print(AL)
evall,evecl=eig(AL)
print("Eigen values of A.A^T")
print(evall)
print("Eigen vectors of A.A^T")
print(evecl)
print("Reconstructed Matrix")
D=np.diag(np.sqrt(evall))
print(evecl.dot(D).dot(evecr.T))

O/P:
Original matrix A 2X2
[[2 1]
 [1 2]]
Left Singular Vectors 2x2
[[-0.70710678 -0.70710678]
 [-0.70710678  0.70710678]]
Singular value matrix Sigma 2x2
[3. 1.]
Right singular vectors 2x2
[[-0.70710678 -0.70710678]
 [-0.70710678  0.70710678]]
Reconstructed Matrix
[[2. 1.]
 [1. 2.]]

A^T.A
[[5 4]
 [4 5]]
Eigen values of A^T.A
[9. 1.]
Eigen vectors of A^T.A
[[ 0.70710678 -0.70710678]
 [ 0.70710678  0.70710678]]
A.A^T
[[5 4]
 [4 5]]
Eigen values of A.A^T
[9. 1.]
Eigen vectors of A.A^T
[[ 0.70710678 -0.70710678]
 [ 0.70710678  0.70710678]]
Reconstructed Matrix
[[2. 1.]
 [1. 2.]]


import numpy as np
from numpy.linalg import eig
from scipy.linalg import svd
# define a matrix
A = np.array([[1, 0,1],[-2, 1,0]])
print("Original matrix A 2X3")
print(A)
# factorize
U, S, V = svd(A)
print("Left Singular Vectors 2x2")
print(U)
print("Singular value matrix Sigma 2x3")
print(S)
print("Right singular vectors 3x3")
print(V)
# create m x n Sigma matrix with zeros
Sigma = np.zeros((A.shape[0], A.shape[1]))
# populate the leading block of Sigma with the diagonal matrix of singular values
Sigma[:A.shape[0], :A.shape[0]] = np.diag(S)
B = U.dot(Sigma.dot(V))
print("Reconstructed Matrix using SVD")
print(np.round(B))


O/P:
Original matrix A 2X3
[[ 1  0  1]
 [-2  1  0]]
Left Singular Vectors 2x2
[[ 0.4472136   0.89442719]
 [-0.89442719  0.4472136 ]]
Singular value matrix Sigma 2x3
[2.44948974 1.        ]
Right singular vectors 3x3
[[ 9.12870929e-01 -3.65148372e-01  1.82574186e-01]
 [-4.44089210e-16  4.47213595e-01  8.94427191e-01]
 [-4.08248290e-01 -8.16496581e-01  4.08248290e-01]]
Reconstructed Matrix using SVD
[[ 1.  0.  1.]
 [-2.  1.  0.]]

import numpy as np
from numpy.linalg import eig
from scipy.linalg import svd
# define a matrix
A = np.array([[3,2,2],[2,3,-2]])
print("Original matrix A 2x3")
print(A)
# factorize
U, S, V = svd(A)
print("Left Singular Vectors 2x2")
print(U)
print("Singular value matrix Sigma 2x3")
print(S)
print("Right singular vectors 3x3")
print(V)
print("A^T.A")
AR=(A.T).dot(A)
print(AR)
evalr,evecr=eig(AR)
# sort the eigenvalues (and eigenvectors) in descending order
idx = evalr.argsort()[::-1]
evalr = evalr[idx]
evecr = evecr[:,idx]
print("Eigen values of A^T.A")
print(evalr)
print("Eigen vectors of A^T.A")
print(evecr)
print("A.A^T")
AL=A.dot(A.T)
print(AL)
evall,evecl=eig(AL)
idx = evall.argsort()[::-1]
evall = evall[idx]
evecl = evecl[:,idx]
print("Eigen values of A.A^T")
print(evall)
print("Eigen vectors of A.A^T")
print(evecl)
print("Reconstructed Matrix")
# the eigenvectors of A.A^T and A^T.A are computed independently, so their signs are arbitrary;
# the reconstruction below can therefore differ from A by a sign (as it does in the output here)
D=np.zeros((A.shape[0],A.shape[1]))
D[:A.shape[0], :A.shape[0]] =np.diag(np.sqrt(evall))
print(evecl.dot(D).dot(evecr.T))

O/P:
Original matrix A 2X3
[[ 3  2  2]
 [ 2  3 -2]]
Left Singular Vectors 2x2
[[ 0.70710678 -0.70710678]
 [ 0.70710678  0.70710678]]
Singular value matrix Sigma 2x3
[5. 3.]
Right singular vectors 3x3
[[ 7.07106781e-01  7.07106781e-01  3.88578059e-16]
 [-2.35702260e-01  2.35702260e-01 -9.42809042e-01]
 [-6.66666667e-01  6.66666667e-01  3.33333333e-01]]
A^T.A
[[13 12  2]
 [12 13 -2]
 [ 2 -2  8]]
Eigen values of A^T.A
[2.5000000e+01 9.0000000e+00 5.0324328e-15]
Eigen vectors of A^T.A
[[-7.07106781e-01  2.35702260e-01 -6.66666667e-01]
 [-7.07106781e-01 -2.35702260e-01  6.66666667e-01]
 [-4.59701721e-17  9.42809042e-01  3.33333333e-01]]
A.A^T
[[17  8]
 [ 8 17]]
Eigen values of A.A^T
[25.  9.]
Eigen vectors of A.A^T
[[ 0.70710678 -0.70710678]
 [ 0.70710678  0.70710678]]
Reconstructed Matrix
[[-3. -2. -2.]
 [-2. -3.  2.]]






Comparison of Eigenvalue Decomposition and SVD
  • The SVD always exists for any matrix $A \in \mathbb{R}^{m \times n}$. The eigendecomposition is only defined for square matrices $A \in \mathbb{R}^{n \times n}$ and only exists if we can find a basis of eigenvectors of $\mathbb{R}^n$.

  • The vectors in the eigendecomposition matrix $P$ are not necessarily orthogonal, i.e., the change of basis is not a simple rotation and scaling. On the other hand, the vectors in the matrices $U$ and $V$ in the SVD are orthonormal, so they do represent rotations.
  • Both the eigen decomposition and the SVD are compositions of three linear mappings:
    1. Change of basis in the domain
    2. Independent scaling of each new basis vector and mapping from domain to codomain
    3. Change of basis in the codomain
    A key difference between the eigendecomposition and the SVD is that in the SVD, domain and codomain can be vector spaces of different dimensions.

  • In the SVD, the left and right singular vector matrices $U$ and $V$ are generally not inverses of each other (they perform basis changes in different vector spaces). In the eigendecomposition, the basis change matrices $P$ and $P^{-1}$ are inverses of each other.
  • In the SVD, the entries in the diagonal matrix $\Sigma$ are all real and nonnegative, which is not generally true for the diagonal matrix in the eigendecomposition.
  • The SVD and the eigendecomposition are closely related through their projections:
    – The left singular vectors of $A$ are eigenvectors of $AA^T$.
    – The right singular vectors of $A$ are eigenvectors of $A^TA$.
    – The nonzero singular values of $A$ are the square roots of the nonzero eigenvalues of $AA^T$, which are the same as the nonzero eigenvalues of $A^TA$.
  • For symmetric matrices $A \in \mathbb{R}^{n\times n}$, the eigenvalue decomposition and the SVD are one and the same, which follows from the spectral theorem.
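These relationships can be verified numerically. The short sketch below (my own check, using the rectangular matrix from the examples above) confirms that the squared singular values are the nonzero eigenvalues of $AA^T$ and that the singular vectors are the corresponding eigenvectors:

import numpy as np
from scipy.linalg import svd

A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])
U, s, Vt = svd(A)
V = Vt.T

# nonzero singular values are the square roots of the nonzero eigenvalues of A A^T (and of A^T A)
eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]
print(np.allclose(s**2, eig_AAt))                                      # True

# left singular vectors are eigenvectors of A A^T:  (A A^T) u_i = sigma_i^2 u_i
print(np.allclose((A @ A.T) @ U, U * s**2))                            # True
# right singular vectors are eigenvectors of A^T A: (A^T A) v_i = sigma_i^2 v_i
print(np.allclose((A.T @ A) @ V[:, :len(s)], V[:, :len(s)] * s**2))    # True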

      Example (University Question)
      Find the SVD of  
      $\begin{bmatrix}
      2 & 2\\
      -1 & -1
      \end{bmatrix}$

      Compute the right singular vectors as the eigen basis of $A^TA$

      $A^TA=\begin{bmatrix}
      2 & -1\\
      2 & -1
      \end{bmatrix}\begin{bmatrix}
      2 & 2\\
      -1 & -1
      \end{bmatrix}=\begin{bmatrix}
      5 & 5\\
      5 & 5
      \end{bmatrix}$

      Eigen values and eigen vectors of $A^TA$ are

      eigen values are $\lambda_1=10,\lambda_2=0$
      the corresponding normalized eigen vectors are 

      $V_1=\begin{bmatrix}
      \frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}}
      \end{bmatrix}$
      $V_2=\begin{bmatrix}
      \frac{1}{\sqrt{2}}\\
      \frac{-1}{\sqrt{2}}
      \end{bmatrix}$

      Compute the left singular vectors as the eigen basis of $AA^T$
      $AA^T=\begin{bmatrix}
      2 & 2\\
      -1 & -1
      \end{bmatrix}\begin{bmatrix}
      2 & -1\\
      2 & -1
      \end{bmatrix}=\begin{bmatrix}
      8 & -4\\
      -4 & 2
      \end{bmatrix}$

      Eigen values and eigen vectors of $AA^T$ are

      eigen values are $\lambda_1=10,\lambda_2=0$
      the corresponding normalized eigen vectors are 

      $U_1=\begin{bmatrix}
      \frac{2}{\sqrt{5}}\\
      \frac{-1}{\sqrt{5}}
      \end{bmatrix}$
      $U_2=\begin{bmatrix}
      \frac{1}{\sqrt{5}}\\
      \frac{2}{\sqrt{5}}
      \end{bmatrix}$

      So the SVD is
      $\begin{bmatrix}
      2 & 2\\
      -1 & -1
      \end{bmatrix}=\begin{bmatrix}
      \frac{2}{\sqrt{5}} &\frac{1}{\sqrt{5}}\\
      \frac{-1}{\sqrt{5}} &\frac{2}{\sqrt{5}}
      \end{bmatrix}
      \begin{bmatrix}
      \sqrt{10} & 0\\
      0& 0
      \end{bmatrix}
      \begin{bmatrix}
      \frac{1}{\sqrt{2}} &\frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}} &\frac{-1}{\sqrt{2}}
      \end{bmatrix}$
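
We can confirm the result by multiplying the three hand-derived factors back together (a short numpy check added here, not part of the original solution):

import numpy as np

U = np.array([[ 2.0, 1.0],
              [-1.0, 2.0]]) / np.sqrt(5)
Sigma = np.array([[np.sqrt(10), 0.0],
                  [0.0,         0.0]])
Vt = np.array([[1.0,  1.0],
               [1.0, -1.0]]) / np.sqrt(2)
print(U @ Sigma @ Vt)           # recovers [[ 2.  2.] [-1. -1.]]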

      Example (University Question)
      Find the SVD of  
      $\begin{bmatrix}
      3&2 & 2\\
      2&3 & -2
      \end{bmatrix}$

      Compute the right singular vectors as the eigen basis of $A^TA$

      $A^TA=\begin{bmatrix}
      3 & 2 \\
      2 & 3\\
      2&-2
      \end{bmatrix}\begin{bmatrix}
      3 & 2 & 2\\
      2 & 3 & -2
      \end{bmatrix}=\begin{bmatrix}
      13 & 12 & 2\\
      12 & 13 & -2 \\
      2 & -2 & 8
      \end{bmatrix}$

      Eigen values and eigen vectors of $A^TA$ are

      eigen values are $\lambda_1=25,\lambda_2=9,\lambda_3=0$
      the corresponding normalized eigen vectors are 
      $V_1=\begin{bmatrix}
      \frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}} \\
      0
      \end{bmatrix}$
      $V_2=\begin{bmatrix}
      \frac{1}{\sqrt{18}}\\
      \frac{-1}{\sqrt{18}}\\
      \frac{4}{\sqrt{18}}
      \end{bmatrix}$
      $V_3=\begin{bmatrix}
      \frac{-2}{3}\\
      \frac{2}{3}\\
      \frac{1}{3}
      \end{bmatrix}$

      Compute the left singular vectors as the eigen basis of $AA^T$
      $AA^T=\begin{bmatrix}
      3 & 2 & 2\\
      2 & 3 & -2
      \end{bmatrix}\begin{bmatrix}3 & 2 \\
      2 & 3\\
      2&-2
      \end{bmatrix}=\begin{bmatrix}
      17 & 8\\
      8 & 17 \\
      \end{bmatrix}$

      eigen values are $\lambda_1=25,\lambda_2=9$
      the corresponding normalized eigen vectors are
      $U_1=\begin{bmatrix}
      \frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}}
      \end{bmatrix}$
      $U_2=\begin{bmatrix}
      \frac{1}{\sqrt{2}}\\
      \frac{-1}{\sqrt{2}}
      \end{bmatrix}$

      So the SVD is $U \Sigma V^T$
      $\begin{bmatrix}
      3 & 2 & 2\\
      2 & 3 & -2
      \end{bmatrix}=\begin{bmatrix}
      \frac{1}{\sqrt{2}} &\frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}} &\frac{-1}{\sqrt{2}}
      \end{bmatrix}
      \begin{bmatrix}
      \sqrt{25} & 0 & 0\\
      0& \sqrt{9} & 0
      \end{bmatrix}
      \begin{bmatrix}
      \frac{1}{\sqrt{2}} &\frac{1}{\sqrt{2}} & 0\\
      \frac{1}{\sqrt{18}} &\frac{-1}{\sqrt{18}} & \frac{4}{\sqrt{18}} \\
      \frac{-2}{3} &\frac{2}{3} & \frac{1}{3} \\
      \end{bmatrix}$

      Example (University Question)
      Find the SVD of  
      $\begin{bmatrix}
      1&-1\\
      0 & 1 \\
      1 & 0
      \end{bmatrix}$
      Compute the right singular vectors as the eigen basis of $A^TA$
      $A^TA=\begin{bmatrix}
      1 & 0 & 1 \\
      -1 & 1 & 0\\
      \end{bmatrix}\begin{bmatrix}
      1 & -1 \\
      0 & 1 \\
      1& 0
      \end{bmatrix}=\begin{bmatrix}
      2 & -1\\
      -1 & 2 \\
      \end{bmatrix}$

      Eigen values and eigen vectors of $A^TA$ are

      Eigen values are $\lambda_1=3,\lambda_2=1$
      the corresponding normalized eigen vectors are 
      $V_1=\begin{bmatrix}
      \frac{-1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}} \\
      \end{bmatrix}$
      $V_2=\begin{bmatrix}
      \frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}}\\
      \end{bmatrix}$

      Compute the left singular vectors as the eigen basis of $AA^T$
      $AA^T=\begin{bmatrix}
      1& -1\\
      0 &1 \\
      1 & 0
      \end{bmatrix}\begin{bmatrix}
      1 &0 & 1 \\
      -1 & 1 &   0\\
      \end{bmatrix}=\begin{bmatrix}
      2 & -1 & 1\\
      -1 & 1 & 0 \\
      1 &0 & 1 \\
      \end{bmatrix}$

      eigen values are $\lambda_1=3,\lambda_2=1,\lambda_3=0$
      the corresponding normalized eigen vectors are
      $U_1=\begin{bmatrix}
      \frac{-2}{\sqrt{6}}\\
      \frac{1}{\sqrt{6}}\\
      \frac{-1}{\sqrt{6}}
      \end{bmatrix}$
      $U_2=\begin{bmatrix}
      {0}\\
      \frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}}
      \end{bmatrix}$
      $U_3=\begin{bmatrix}
      \frac{1}{\sqrt{3}}\\
      \frac{1}{\sqrt{3}}\\
      \frac{-1}{\sqrt{3}}\\
      \end{bmatrix}$

      So the SVD is $U \Sigma V^T$
      $\begin{bmatrix}
      1 & -1\\
      0 & 1 \\
      1 & 0
      \end{bmatrix}=\begin{bmatrix}
      \frac{-2}{\sqrt{6}} & 0 &\frac{1}{\sqrt{3}}\\
      \frac{1}{\sqrt{6}} &\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \\
      \frac{-1}{\sqrt{6}} &\frac{1}{\sqrt{2}} & \frac{-1}{\sqrt{3}} \\
      \end{bmatrix}
      \begin{bmatrix}
      \sqrt{3} & 0 \\
      0 & 1 \\
      0 & 0 \\
      \end{bmatrix}
      \begin{bmatrix}
      \frac{-1}{\sqrt{2}} &\frac{1}{\sqrt{2}} \\
      \frac{1}{\sqrt{2}} &\frac{1}{\sqrt{2}} \\
      \end{bmatrix}$
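
As a check (my own sketch, not part of the original solution), we can verify the hand-derived factors in numpy; note that for this tall matrix $\Sigma$ is $3 \times 2$ with a row of zeros:

import numpy as np

A = np.array([[1.0, -1.0],
              [0.0,  1.0],
              [1.0,  0.0]])
U = np.array([[-2/np.sqrt(6), 0.0,           1/np.sqrt(3)],
              [ 1/np.sqrt(6), 1/np.sqrt(2),  1/np.sqrt(3)],
              [-1/np.sqrt(6), 1/np.sqrt(2), -1/np.sqrt(3)]])
Sigma = np.array([[np.sqrt(3), 0.0],
                  [0.0,        1.0],
                  [0.0,        0.0]])          # 3x2: one zero row because m > n
Vt = np.array([[-1/np.sqrt(2), 1/np.sqrt(2)],
               [ 1/np.sqrt(2), 1/np.sqrt(2)]])

print(np.allclose(U @ Sigma @ Vt, A))          # True: the factorization reproduces A
print(np.allclose(U.T @ U, np.eye(3)))         # True: U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(2)))       # True: V is orthogonal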

      Example (University Question)
      Find the SVD of  
      $\begin{bmatrix}
      2&0 &1\\
      1&-2 & 0
      \end{bmatrix}$

      Compute the right singular vectors as the eigen basis of $A^TA$

      $A^TA=\begin{bmatrix}
      2 &1 \\
      0 & -2\\
      1&0
      \end{bmatrix}\begin{bmatrix}
      2 & 0 & 1\\
      1 & -2 & 0
      \end{bmatrix}=\begin{bmatrix}
      5 & -2 & 2\\
      -2 & 4 & 0 \\
      2 & 0 & 1
      \end{bmatrix}$

      Eigen values and eigen vectors of $A^TA$ are

      eigen values are $\lambda_1=7,\lambda_2=3,\lambda_3=0$
      the corresponding normalized eigen vectors are 
      $V_1=\frac{1}{\sqrt{14}}
      \begin{bmatrix}
      3\\
      -2\\
      1
      \end{bmatrix}$
      $V_2=\frac{1}{\sqrt{6}}\begin{bmatrix}
      1\\
      2\\
      1
      \end{bmatrix}$
      $V_3=\frac{1}{\sqrt{21}}
      \begin{bmatrix}
      2\\
      1\\
      -4
      \end{bmatrix}$
      Compute the left singular vectors as the eigen basis of $AA^T$
      $AA^T=\begin{bmatrix}
      2 & 0 & 1\\
      1 & -2 & 0
      \end{bmatrix}\begin{bmatrix}
      2 & 1 \\
      0 &-2\\
      1&0
      \end{bmatrix}=\begin{bmatrix}
      5 & 2\\
      2 & 5 \\
      \end{bmatrix}$

      eigen values are $\lambda_1=7,\lambda_2=3$ and the corresponding normalized eigen vectors are
      $U_1=\begin{bmatrix}
      \frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}}
      \end{bmatrix}$
      $U_2=\begin{bmatrix}
      \frac{1}{\sqrt{2}}\\
      \frac{-1}{\sqrt{2}}
      \end{bmatrix}$

      So the SVD is $U \Sigma V^T$
      $\begin{bmatrix}
      2 & 0 & 1\\
      1 & -2 & 0
      \end{bmatrix}=\begin{bmatrix}
      \frac{1}{\sqrt{2}} &\frac{1}{\sqrt{2}}\\
      \frac{1}{\sqrt{2}} &\frac{-1}{\sqrt{2}}
      \end{bmatrix}
      \begin{bmatrix}
      \sqrt{7} & 0 & 0\\
      0& \sqrt{3} & 0
      \end{bmatrix}
      \begin{bmatrix}
      \frac{3}{\sqrt{14}} &\frac{-2}{\sqrt{14}} & \frac{1}{\sqrt{14}}\\
      \frac{1}{\sqrt{6}} &\frac{2}{\sqrt{6}} & \frac{1}{\sqrt{6}} \\
      \frac{2}{\sqrt{21}} &\frac{1}{\sqrt{21}} & \frac{-4}{\sqrt{21}}\\
      \end{bmatrix}$
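
Finally, as a sanity check (my own sketch), we can compare the hand-derived singular values with those returned by scipy.linalg.svd and confirm that the hand-derived factors reproduce $A$; the individual singular vectors from scipy may differ from ours by a sign:

import numpy as np
from scipy.linalg import svd

A = np.array([[2.0,  0.0, 1.0],
              [1.0, -2.0, 0.0]])
U, s, Vt = svd(A)
print(s)                                         # approximately [2.6458, 1.7321], i.e. [sqrt(7), sqrt(3)]

# reconstruction with the hand-derived factors
Uh = np.array([[1.0,  1.0],
               [1.0, -1.0]]) / np.sqrt(2)
Sigmah = np.array([[np.sqrt(7), 0.0,        0.0],
                   [0.0,        np.sqrt(3), 0.0]])
Vth = np.array([[3.0, -2.0,  1.0],
                [1.0,  2.0,  1.0],
                [2.0,  1.0, -4.0]])
Vth = Vth / np.sqrt([14.0, 6.0, 21.0])[:, None]  # normalize each row of V^T
print(np.allclose(Uh @ Sigmah @ Vth, A))         # True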


       We think of probability theory as an extension to logical reasoning Probabilistic modeling  provides a principled foundation for designing machine learning methods. Once we have defined probability distributions corresponding to the uncertainties of the data and our problem, it turns out that there are only two fundamental rules, the sum rule and the product rule. Let $p(x,y)$ is the joint distribution of the two random variables $x, y$. The distributions $p(x)$ and $p(y)$ are the corresponding marginal distributions, and $p(y |x)$ is the conditional distribution of $y$ given $x$. Sum Rule The addition rule states the probability of two events is the sum of the probability that either will happen minus the probability that both will happen. The addition rule is: $P(A∪B)=P(A)+P(B)−P(A∩B)$ Suppose $A$ and $B$ are disjoint, their intersection is empty. Then the probability of their intersection is zero. In symbols:  $P(A∩B)=0$  The addition law then simplifies to: $P(A∪B)=P(A)+P(B)$  wh