
2.11 Eigenvalues and Eigenvectors

Geometrically, an eigenvector corresponding to a real nonzero eigenvalue points in a direction that is stretched by the transformation, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed.
Definition:
Let $A \in \mathbb{R}^{n \times n}$ be a square matrix. Then $\lambda \in \mathbb{R}$ is an eigenvalue of $A$ and $x \in \mathbb{R}^n \setminus \{0\}$ is a corresponding eigenvector of $A$ if
      $Ax=\lambda x$
We call this the eigenvalue equation.


The following statements are equivalent:

$\lambda$ is an eigenvalue of $A$
There exists an $x \in \mathbb{R}^n \setminus \{0\}$ with $Ax=\lambda x$, or equivalently, $(A-\lambda I_n)x=0$ can be solved non-trivially, i.e., $x \ne 0$
$rank(A-\lambda I) < n$
$det(A-\lambda I) =0$
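As a quick numerical illustration (a sketch, not part of the original notes), we can check the last two conditions in NumPy for the matrix $A=\begin{bmatrix} 6&4 \\ -3&-1 \end{bmatrix}$ and its eigenvalue $\lambda=2$ from the worked example later in this section:

import numpy as np
# check det(A - lambda*I) = 0 and rank(A - lambda*I) < n for an eigenvalue
A = np.array([[6, 4], [-3, -1]])
lam = 2.0
M = A - lam * np.eye(2)
print(np.linalg.det(M))          # ~0.0, so det(A - lambda I) = 0
print(np.linalg.matrix_rank(M))  # 1 < n = 2, so the rank drops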

Collinearity and codirection
Two vectors that point in the same direction are called codirected. Two vectors are collinear if they point in the same or opposite direction.
If $x$ is an eigenvector of $A$ associated with the eigenvalue $\lambda$, then for any $c \in \mathbb{R} \setminus \{0\}$, $cx$ is an eigenvector of $A$ with the same eigenvalue, since
$A(cx) = cAx = c\lambda x = \lambda(cx)$
Thus all vectors that are collinear to $x$ are also eigenvectors of $A$.

Characteristic Polynomial
For $\lambda \in \mathbb{R}$ and a square matrix $A \in \mathbb{R}^{n \times n}$,

$P_A(\lambda)=det(A-\lambda I )=(-1)^n\lambda^n+C_{n-1}\lambda^{n-1}+\ldots+C_2\lambda^2+C_1\lambda+C_0$

is the characteristic polynomial of $A$. It allows us to compute eigenvalues and eigenvectors.
In the characteristic polynomial,
   $C_0=det(A)$
  $C_{n-1}=(-1)^{n-1}tr(A)$
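As a sketch (the matrix is just a sample), we can verify these coefficient identities with sympy by expanding $det(A-\lambda I)$ directly:

from sympy import Matrix, symbols, eye, expand
lam = symbols('lamda')            # sympy renders 'lamda' as the symbol lambda
A = Matrix([[6, 4], [-3, -1]])
p = expand((A - lam * eye(2)).det())
print(p)                          # lamda**2 - 5*lamda + 6
print(A.det())                    # 6  -> C0 = det(A)
print(-A.trace())                 # -5 -> C1 = (-1)^(n-1) tr(A) for n = 2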

$\lambda \in \mathbb{R}$ is an eigenvalue of $A \in \mathbb{R}^{n \times n}$ if and only if $\lambda$ is a root of the characteristic polynomial.

The algebraic multiplicity of $\lambda _i$ is the number of times $\lambda_i$ appears as a root of the characteristic polynomial.

Eigenspace and eigenspectrum
For $A \in \mathbb{R}^{n \times n}$, the set of all eigenvectors associated with an eigenvalue $\lambda$ spans a subspace of $\mathbb{R}^n$, which is called the eigenspace of $A$ with respect to $\lambda$ and is denoted by $E_{\lambda}$.

The set of all eigenvalues of $A$ is called the eigenspectrum, or simply the spectrum, of $A$.

If $\lambda$ is an eigenvalue of $A \in \mathbb{R}^{n \times n}$, then the corresponding eigenspace $E_\lambda$ is the solution space of the homogeneous system of linear equations $(A-\lambda I)x=0$. Geometrically, an eigenvector corresponding to a nonzero eigenvalue points in a direction that is stretched by the linear mapping, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction of the stretching is flipped.

Properties
The identity matrix $I \in \mathbb{R}^{n \times n}$ has characteristic polynomial $(1-\lambda)^n$, which has only one eigenvalue, $\lambda=1$, occurring $n$ times. Moreover, $Ix=1x$ holds for all vectors $x \in \mathbb{R}^n$. Because of this, the eigenspace $E_1$ spans all $n$ dimensions, and all standard basis vectors of $\mathbb{R}^n$ are eigenvectors of $I$.

A matrix $A$ and its transpose $A^T$ have the same eigenvalues, but not necessarily the same eigenvectors.

The eigenspace $E_\lambda$ is the null space (kernel) of $A-\lambda I$.

Similar matrices have the same eigenvalues. Therefore, the eigenvalues of a linear mapping are independent of the choice of basis for its matrix representation.

Symmetric positive definite matrices always have positive real eigenvalues.

The determinant of a matrix is the product of its eigenvalues.

The trace of a matrix is the sum of its eigenvalues.

If $A$ is invertible and $Ax=\lambda x$, then $A^{-1}Ax=A^{-1}\lambda x$, i.e., $A^{-1}x=\frac{1}{\lambda}x$. So the eigenvalues of $A^{-1}$ are $\frac{1}{\lambda}$, with the same eigenvectors.

If $Ax=\lambda x$, then $A^{2}x=A(\lambda x)=\lambda Ax=\lambda^{2}x$, and in general $A^{k}x=\lambda^{k} x$.
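The inverse and power properties above are easy to check numerically; here is a small sketch (the matrix is a sample, assumed invertible):

import numpy as np
A = np.array([[4.0, 2.0], [1.0, 3.0]])
print(np.sort(np.linalg.eigvals(A)))                 # [2. 5.]
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))))  # [0.2 0.5] = 1/lambda
print(np.sort(np.linalg.eigvals(A @ A)))             # [ 4. 25.] = lambda**2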

Applications

Eigenvalues and eigenvectors play a significant role in various aspects of machine learning. Some of the key applications are as follows:

Dimensionality Reduction: One of the most common applications of eigenvalues and eigenvectors in machine learning is in dimensionality reduction techniques like Principal Component Analysis (PCA). PCA finds the principal components (eigenvectors) of the data, which are the directions along which the data varies the most. The corresponding eigenvalues indicate the amount of variance captured by each principal component. By retaining only the top eigenvalues and their corresponding eigenvectors, PCA allows for effective dimensionality reduction while preserving the most important patterns in the data.
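The following is a minimal PCA sketch using NumPy (the data is synthetic and the variable names are illustrative): it centers the data, eigendecomposes the covariance matrix, and projects onto the top principal components.

import numpy as np
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))    # 200 samples, 3 features (synthetic)
Xc = X - X.mean(axis=0)          # center the data
C = np.cov(Xc, rowvar=False)     # 3x3 covariance matrix
vals, vecs = np.linalg.eigh(C)   # eigh, since C is symmetric
order = np.argsort(vals)[::-1]   # sort by decreasing eigenvalue (variance)
W = vecs[:, order[:2]]           # keep the top-2 principal components
Z = Xc @ W                       # reduced representation, shape (200, 2)
print(Z.shape)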

Image Compression: Eigenvalues and eigenvectors underlie image compression via the Singular Value Decomposition (SVD). SVD decomposes an image matrix into singular values and singular vectors, which are closely related to the eigenvalues and eigenvectors of $A^TA$ and $AA^T$. By keeping only the largest singular values and their corresponding singular vectors, images can be compressed efficiently without losing critical visual information.
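A small sketch of the idea with NumPy (here a random array stands in for a real image): keep only the $k$ largest singular values and the corresponding singular vectors.

import numpy as np
img = np.random.default_rng(1).random((64, 64))  # stand-in for an image
U, s, Vt = np.linalg.svd(img, full_matrices=False)
k = 10                                 # keep the 10 largest singular values
approx = U[:, :k] * s[:k] @ Vt[:k]     # rank-k approximation of the image
print(np.linalg.norm(img - approx))    # reconstruction error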

Spectral Clustering: Spectral clustering is a powerful technique used in unsupervised learning to partition data into clusters. It uses the eigenvectors of a graph Laplacian built from a similarity (affinity) matrix to find the cluster assignments; the eigenvectors corresponding to the smallest nonzero Laplacian eigenvalues carry information about the cluster structure of the data.
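As an illustrative sketch (the graph below is made up: two triangles joined by one edge), the sign pattern of the Fiedler vector, the eigenvector of the second-smallest Laplacian eigenvalue, bipartitions the graph:

import numpy as np
W = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)  # adjacency matrix
L = np.diag(W.sum(axis=1)) - W     # graph Laplacian L = D - W
vals, vecs = np.linalg.eigh(L)     # eigenvalues in ascending order
fiedler = vecs[:, 1]               # eigenvector of 2nd-smallest eigenvalue
print(fiedler > 0)                 # splits {0,1,2} from {3,4,5} (signs may flip)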

Recommender Systems: Collaborative filtering in recommender systems can use the concepts of eigenvectors to identify latent factors and represent user-item interactions in lower-dimensional spaces. This helps in making personalized recommendations efficiently.

Natural Language Processing: In natural language processing tasks like topic modeling or latent semantic analysis, eigenvectors can be used to identify latent topics or dimensions in the text data, making it easier to discover underlying structures and patterns.

Graph Analysis: In graph-based machine learning algorithms, eigenvectors are used to calculate centrality measures, such as PageRank in web graph analysis, which helps to rank web pages based on their importance.
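A minimal PageRank sketch via power iteration (the link structure here is invented for illustration): the ranks converge to the dominant eigenvector of the damped transition matrix.

import numpy as np
links = np.array([[0, 1, 1, 0],    # links[i, j] = 1 if page j links to page i
                  [1, 0, 0, 1],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
M = links / links.sum(axis=0)      # column-stochastic transition matrix
d, n = 0.85, 4
G = d * M + (1 - d) / n            # damped "Google matrix"
r = np.ones(n) / n
for _ in range(100):               # power iteration -> dominant eigenvector
    r = G @ r
print(r)                           # PageRank scores, summing to 1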

Neural Networks: In some neural network architectures, such as linear autoencoders, the learned hidden representations span the same subspace as the top eigenvectors of the covariance matrix of the input data. Understanding the eigenvectors can provide insights into the network's internal representations and how it captures relevant features in the data.

In summary, eigenvalues and eigenvectors have diverse applications in machine learning, ranging from dimensionality reduction and clustering to image compression and graph analysis. They enable efficient data representation, feature extraction, and structure discovery, making them powerful tools for understanding and processing complex data in various machine learning tasks.

Computing Eigenvalues, Eigenvectors, and Eigenspaces

Step 1: Find the roots of the characteristic polynomial; these are the eigenvalues.

Step 2: Find the eigenvectors corresponding to each eigenvalue $\lambda$ by finding vectors $x$ such that $(A-\lambda I)x=0$, i.e., by computing the null space of the matrix $A-\lambda I$.

Example 

Find the eigenvalues and eigenvectors for 

$A=\begin{bmatrix}
6&4 \\
-3&-1
\end{bmatrix}$

Solution

If

$Ax= \lambda x$

for some $x \ne 0$, then

$|A - \lambda I |= 0$

$A-\lambda I=\begin{bmatrix}
6-\lambda&4 \\
-3&-1-\lambda
\end{bmatrix}$

Setting the determinant to zero gives the characteristic equation

$(6 - \lambda)(-1 - \lambda) + 12 = 0$

$\lambda ^2 -5 \lambda +6=0$

$(\lambda - 2)(\lambda - 3) = 0$

The eigenvalues are 

        $\lambda= 2$ and $\lambda = 3$

To find the eigenvectors, we plug the eigenvalues into the equation 

         $ (A -\lambda I)x  = 0 $

and find the null space of the left hand side.  For the eigenvalue $\lambda =2$, we have

$ A-\lambda I=\begin{bmatrix}
4&4 \\
-3&-3
\end{bmatrix}$

The first row gives

$ x_1= -x_2$

so that an eigenvector corresponding to the eigenvalue $\lambda = 2$ is 

$ x=\begin{bmatrix}
1\\
-1
\end{bmatrix}$

For the eigenvalue  $\lambda= 3$, we have

$A-\lambda I=\begin{bmatrix}
3&4 \\
-3&-4
\end{bmatrix}$

The first row gives

    $3x_1 = -4x_2$

so that an eigenvector corresponding to the eigenvalue $\lambda= 3$ is 

$ x=\begin{bmatrix}
4\\
-3
\end{bmatrix}$

Typically, we want to normalize the eigenvectors, that is, find unit eigenvectors. We get

$ v_1=\begin{bmatrix}\frac{1}{\sqrt{2}}\\
\frac{-1}{\sqrt{2}}
\end{bmatrix}$
and
$ v_2=\begin{bmatrix}
\frac{4}{5}\\
\frac{-3}{5}
\end{bmatrix}$

import numpy as np

A = np.array([[6, 4], [-3, -1]])
print("Matrix A")
print(A)
# np.linalg.eig returns the eigenvalues and the normalized
# eigenvectors (as the columns of the second return value)
eigva, eigv = np.linalg.eig(A)
print("eigen values")
print(eigva)
print("eigen vectors (normalized, as columns)")
print(eigv)

o/p
Matrix A
[[ 6  4]
 [-3 -1]]
eigen values
[3. 2.]
eigen vectors (normalized, as columns)
[[ 0.8        -0.70710678]
 [-0.6         0.70710678]]

import numpy as np

A = np.array([[4, 2], [1, 3]])
print("Matrix A")
print(A)
eigva, eigv = np.linalg.eig(A)
print("eigen values")
print(eigva)
# eig returns unit-norm eigenvectors; rescale them to get integer entries
print("eigen vectors un-normalized")
print(eigv[:, 0] * np.sqrt(5))   # [2, 1] has norm sqrt(5)
print(eigv[:, 1] * np.sqrt(2))   # [-1, 1] has norm sqrt(2)

o/p
Matrix A
[[4 2]
 [1 3]]
eigen values
[5. 2.]
eigen vectors un-normalized
[2. 1.]
[-1.  1.]

sympy code
from sympy import Matrix

M = Matrix(3, 3, [2, 1, -2, 1, 0, 0, 1, 0, 0])
print(M.charpoly().as_expr())   # characteristic polynomial
print(M.eigenvals())            # {eigenvalue: algebraic multiplicity}
print(M.eigenvects())           # (eigenvalue, alg. multiplicity, eigenvectors)

output:
lambda**3 - 2*lambda**2 + lambda
{1: 2, 0: 1}
[(0, 1, [Matrix([[0], [2], [1]])]), (1, 2, [Matrix([[1], [1], [1]])])]

Geometric Multiplicity and Algebraic Multiplicity

Let $\lambda_i$ be an eigenvalue of a square matrix $A$. Then the geometric multiplicity of $\lambda_i$ is the number of linearly independent eigenvectors associated with $\lambda_i$, i.e., the dimension of the eigenspace $E_{\lambda_i}$.
The algebraic multiplicity specifies how many times the eigenvalue $\lambda_i$ occurs as a root of the characteristic polynomial.

Remark: An eigenvalue's geometric multiplicity must be at least one, because every eigenvalue has at least one associated eigenvector. An eigenvalue's geometric multiplicity cannot exceed its algebraic multiplicity, but it may be lower.
Example:
The matrix $A=\begin{bmatrix}
2 & 1\\
0& 2
\end{bmatrix}$ has the repeated eigenvalue $\lambda_1=\lambda_2=2$, which hence has algebraic multiplicity 2. The eigenvalue has, however, only one distinct eigenvector $x=\begin{bmatrix}
1\\
0
\end{bmatrix}$, and thus geometric multiplicity 1.
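We can confirm both multiplicities for this matrix with sympy (a quick check, not part of the original example):

from sympy import Matrix
A = Matrix([[2, 1], [0, 2]])
print(A.eigenvals())    # {2: 2} -> algebraic multiplicity 2
print(A.eigenvects())   # one eigenvector [1, 0] -> geometric multiplicity 1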

Eigenbasis

The eigenvectors $x_1,x_2,\ldots, x_n$ of a matrix $A \in \mathbb{R}^{n \times n}$ with $n$ distinct eigenvalues $\lambda_1,\lambda_2,\ldots,\lambda_n$ are linearly independent. Hence, the eigenvectors of a matrix with $n$ distinct eigenvalues form a basis of $\mathbb{R}^n$, called an eigenbasis.

A square matrix $A \in \mathbb{R}^{n \times n}$ is defective if it possesses fewer than $n$ linearly independent eigenvectors. A defective matrix cannot have $n$ distinct eigenvalues.
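A numerical sketch of the contrast (sample matrices): for distinct eigenvalues the eigenvector matrix has full rank, while for a defective matrix it is numerically rank-deficient.

import numpy as np
A = np.array([[4.0, 2.0], [1.0, 3.0]])   # distinct eigenvalues 5 and 2
vals, vecs = np.linalg.eig(A)
print(np.linalg.matrix_rank(vecs))       # 2 -> eigenvectors form an eigenbasis
B = np.array([[2.0, 1.0], [0.0, 2.0]])   # defective: repeated eigenvalue 2
valsB, vecsB = np.linalg.eig(B)
print(np.linalg.matrix_rank(vecsB))      # 1 -> fewer than n independent eigenvectors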

Spectral Theorem
Given a matrix $A \in \mathbb{R}^{n \times n}$, we can always obtain a symmetric, positive semidefinite matrix $S \in \mathbb{R}^{n \times n}$ by defining
$S=A^TA$
If $rk(A)=n$, then $S=A^TA$ is symmetric, positive definite.
Symmetry: symmetry requires $S=S^T$, and indeed
$S^T=(A^TA)^T=A^T(A^T)^T=A^TA=S$
Positive semidefiniteness: this requires $x^TSx \ge 0$ for all $x$, and indeed $x^TSx=x^TA^TAx=(x^TA^T)(Ax)=(Ax)^T(Ax) \ge 0$, since this dot product is a sum of squares, which is non-negative. If $rk(A)=n$, then $Ax=0$ only for $x=0$, so in that case $x^TSx>0$ for all $x \ne 0$ and $S$ is positive definite.

If $A \in \mathbb{R}^{n \times n}$ is symmetric, there exists an orthonormal basis of the corresponding vector space $V$ consisting of eigenvectors of $A$, and each eigenvalue is real.

The direct implication of the spectral theorem is that the eigendecomposition of a symmetric matrix $A$ exists and that we can find an orthonormal basis of eigenvectors, so that $A=PDP^T$, where $D$ is a diagonal matrix of eigenvalues and the columns of $P$ are the eigenvectors. Eigendecomposition will be discussed later.

Eigenvectors associated with the same eigenvalue (geometric multiplicity greater than 1) need not be orthogonal; in this case we can use the Gram-Schmidt algorithm to orthogonalize them.
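A NumPy sketch of the spectral theorem, using the symmetric matrix from a later example: np.linalg.eigh returns real eigenvalues and an orthonormal eigenvector matrix $P$, and $A=PDP^T$ holds.

import numpy as np
A = np.array([[-1.0,  2.0,  2.0],
              [ 2.0,  2.0, -1.0],
              [ 2.0, -1.0,  2.0]])          # symmetric
vals, P = np.linalg.eigh(A)                 # real eigenvalues, orthonormal P
D = np.diag(vals)
print(np.allclose(P @ P.T, np.eye(3)))      # True: columns of P are orthonormal
print(np.allclose(P @ D @ P.T, A))          # True: A = P D P^T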

Relationship Between Determinant and Trace

The determinant of a matrix $A \in \mathbb{R}^{n \times n}$ is the product of its eigenvalues, i.e.,

$det(A)=\prod_{i=1}^{n} \lambda_i$

The trace of a matrix $A \in \mathbb{R}^{n \times n}$ is the sum of its eigenvalues, i.e.,

$tr(A)=\sum_{i=1}^{n} \lambda_i$

where $\lambda_i$ are eigenvalues of $A$.
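Both identities are easy to verify numerically; a small sketch with the matrix from the first worked example:

import numpy as np
A = np.array([[6.0, 4.0], [-3.0, -1.0]])    # eigenvalues 2 and 3
vals = np.linalg.eigvals(A)
print(np.prod(vals), np.linalg.det(A))      # both ~6.0: det = product
print(np.sum(vals), np.trace(A))            # both ~5.0: trace = sum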

More Examples:
Compute the eigenvalues and eigenvectors of
$A=\begin{bmatrix}
1 & 0\\
1 & 1
\end{bmatrix}$

Characteristic polynomial
$p(\lambda)=|A-\lambda I|=\begin{vmatrix}
1-\lambda & 0\\
1 & 1-\lambda
\end{vmatrix}=(1-\lambda)^2=0$ 
So $\lambda=1$ is the only root of $p$, and therefore the only eigenvalue of $A$.
To compute the eigenspace, we need to compute the null space of $A-\lambda I$:
$(A-1.I)x=0 \implies \begin{bmatrix}
0 & 0\\
1 & 0
\end{bmatrix}x=0$

So the only eigenvector (up to scaling) is
$\begin{bmatrix}
0 \\
1
\end{bmatrix}$
and the eigenspace $E_1$ is its span.

Compute the eigenvalues and eigenvectors of
$A=\begin{bmatrix}
-2 & 2\\
2 & 1
\end{bmatrix}$
Characteristic polynomial
$p(\lambda)=|A-\lambda I|=\begin{vmatrix}
-2-\lambda & 2\\
2& 1-\lambda
\end{vmatrix}=(-2-\lambda)(1-\lambda)-4=0$ 

$\lambda^2 + \lambda -6 =0$

so the eigenvalues are $\lambda=-3$ and $\lambda=2$. Note that the product of the eigenvalues, $-6$, is the determinant of the matrix, and their sum, $-1$, is the trace (the sum of the diagonal elements).

When $\lambda=-3$, the null space of
$A-\lambda I=\begin{bmatrix}
1&2 \\
2&4
\end{bmatrix}$

is

$E_{-3}=\begin{bmatrix}
2 \\
-1
\end{bmatrix}$, and its span is the eigenspace corresponding to the eigenvalue $\lambda=-3$

When $\lambda=2$, the null space of
$A-\lambda I=\begin{bmatrix}
-4&2 \\
2&-1
\end{bmatrix}$

is

$E_{2}=\begin{bmatrix}
1/2 \\
1
\end{bmatrix}$, and its span is the eigenspace corresponding to the eigenvalue $\lambda=2$

Find the eigenvalues of the following matrix in terms of $k$. Can you find an eigenvector corresponding to each of the eigenvalues?

$\begin{bmatrix}
1 & k\\
2 & 1
\end{bmatrix}$
The characteristic polynomial can be found by setting $|A-\lambda I|=0$:
$p(\lambda)=|A-\lambda I|=\begin{vmatrix}
1-\lambda & k\\
2& 1-\lambda
\end{vmatrix}=0$ 
$(1-\lambda)^2-2k=0$
$\lambda^2-2\lambda+(1-2k)=0$
$\lambda= \frac{2 \pm \sqrt{4-4(1-2k)}}{2}$
$\lambda=\frac{2 \pm 2\sqrt{2k}}{2}$
$\lambda=1 \pm \sqrt{2k}$ are the eigenvalues (real when $k \ge 0$)

Let's find the eigenvectors.
When $\lambda=1 + \sqrt{2k}$, the null space of $(A-\lambda I)$ gives the eigenvector corresponding to the eigenvalue $1 + \sqrt{2k}$:
$\begin{bmatrix}
1-1-\sqrt{2k} & k\\
2 & 1-1-\sqrt{2k}
\end{bmatrix}
=\begin{bmatrix}
-\sqrt{2k} & k\\
2 & -\sqrt{2k}
\end{bmatrix}$

Dividing the first row by $-\sqrt{2k}$ and the second row by $2$ (note that $k/\sqrt{2k}=\sqrt{2k}/2=\sqrt{k/2}$):

$\begin{bmatrix}
1 & -\sqrt{k/2}\\
1 & -\sqrt{k/2}
\end{bmatrix}$

Subtracting the first row from the second:

$\begin{bmatrix}
1 & -\sqrt{k/2}\\
0 & 0
\end{bmatrix}$
So the null space is $\begin{bmatrix}
\sqrt{k/2}\\
1
\end{bmatrix}$, which is the eigenvector

Similarly, when $\lambda=1 - \sqrt{2k}$, the null space of $(A-\lambda I)$ gives the eigenvector corresponding to the eigenvalue $1 - \sqrt{2k}$,
and the eigenvector is
$\begin{bmatrix}
-\sqrt{k/2}\\
1
\end{bmatrix}$
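We can confirm the symbolic eigenvalues with sympy (a sketch; $k$ is declared positive so the square roots stay real):

from sympy import Matrix, symbols
k = symbols('k', positive=True)
A = Matrix([[1, k], [2, 1]])
print(A.eigenvals())   # {1 - sqrt(2)*sqrt(k): 1, 1 + sqrt(2)*sqrt(k): 1}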

Compute the eigenvalues of a $3 \times 3$ matrix
$A=\begin{bmatrix}
-1 & 2 & 2\\
2& 2 & -1\\
2 & -1 & 2
\end{bmatrix}$

The characteristic polynomial can be found by setting $|A-\lambda I|=0$:

$\begin{vmatrix}
-1-\lambda & 2 & 2\\
2& 2-\lambda & -1\\
2 & -1 & 2-\lambda
\end{vmatrix}=0$

$(-1-\lambda)(4-4\lambda+\lambda^2-1)-2(4-2\lambda+2)+2(-2-4+2\lambda)=0$

$(-1-\lambda)(3-4\lambda+\lambda^2)-2(6-2\lambda)+2(-6+2\lambda)=0$

$-\lambda^3+3\lambda^2+9\lambda-27=0$

$\lambda^3-3\lambda^2-9\lambda+27=0$

$(\lambda-3)(\lambda^2-9)=0$

$(\lambda-3)(\lambda+3)(\lambda-3)=0$

So the eigenvalues are $\lambda=3$ (with algebraic multiplicity 2) and $\lambda=-3$.
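A quick sympy check of this result (a sketch):

from sympy import Matrix
A = Matrix([[-1, 2, 2], [2, 2, -1], [2, -1, 2]])
print(A.charpoly().as_expr())   # lambda**3 - 3*lambda**2 - 9*lambda + 27
print(A.eigenvals())            # {3: 2, -3: 1}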

Find the eigen values and eigen vectors of the following matrix

$A=\begin{bmatrix}
5 & -2 & 1\\
-2& 1 & 0\\
1 & 0 & 1
\end{bmatrix}$

The characteristic polynomial can be found by setting $|A-\lambda I|=0$:

$\begin{vmatrix}
5-\lambda & -2 & 1\\
-2& 1-\lambda & 0\\
1 & 0 & 1-\lambda
\end{vmatrix}$
$(5-\lambda)(1-\lambda)^2+2(-2(1-\lambda))-(1-\lambda)=0$
$(5-\lambda)(\lambda^2-2\lambda+1)-4+4\lambda-1+\lambda=0$
$(5-\lambda)(\lambda^2-2\lambda+1)+5\lambda-5=0$
$5\lambda^2-10\lambda+5-\lambda^3+2\lambda^2-\lambda+5\lambda-5=0$
Multiplying through by $-1$:
$\lambda^3-7\lambda^2+6\lambda=0$
$\lambda(\lambda-1)(\lambda-6)=0$
So the eigenvalues are $\lambda=0$, $\lambda=1$, and $\lambda=6$
The eigenvector corresponding to $\lambda=0$ is found from the null space of $A-\lambda I=A$
$A=\begin{bmatrix}
5 & -2 & 1\\
-2& 1 & 0\\
1 & 0 & 1
\end{bmatrix}$
Exchanging $R_1$ and $R_3$:
$\begin{bmatrix}
1 & 0 & 1\\
-2& 1 & 0\\
5 & -2 & 1
\end{bmatrix}$
$R_2=R_2+2R_1$:
$\begin{bmatrix}
1 & 0 & 1\\
0& 1 & 2\\
5 & -2 & 1
\end{bmatrix}$
$R_3=R_3-5R_1$, then $R_3=R_3/(-2)$:
$\begin{bmatrix}
1 & 0 & 1\\
0& 1 & 2\\
0 & 1 &2
\end{bmatrix}$
$R_3=R_3-R_2$:
$\begin{bmatrix}
1 & 0 & 1\\
0& 1 & 2\\
0 & 0 &0
\end{bmatrix}$
So the eigenvector corresponding to the eigenvalue $\lambda=0$ is
$\begin{bmatrix} 1\\
2\\
-1
\end{bmatrix}$
Similarly, the eigenvector corresponding to the eigenvalue $\lambda=1$ is
$\begin{bmatrix} 0\\
1\\
2
\end{bmatrix}$
Similarly, the eigenvector corresponding to the eigenvalue $\lambda=6$ is
$\begin{bmatrix}5\\
-2\\
1
\end{bmatrix}$

Find the characteristic equation, eigenvalues, and eigenspaces corresponding to each eigenvalue of the following matrix
$A=\begin{bmatrix}
2 & 0 & 4\\
0 & 3 & 0\\
0 & 1 & 2
\end{bmatrix}$
The characteristic polynomial can be found by setting $|A-\lambda I|=0$:
$\begin{vmatrix}
2-\lambda & 0 & 4\\
0 & 3-\lambda & 0\\
0 & 1 & 2-\lambda
\end{vmatrix}=0$

$(2-\lambda)(3-\lambda)(2-\lambda)=0$
So
$\lambda=2$ (with algebraic multiplicity 2) or $\lambda=3$
The eigenvector corresponding to $\lambda=2$ can be found from the null space of $A-\lambda I$:
$\begin{bmatrix}
0 & 0 & 4\\
0 & 1 & 0\\
0 & 1 & 0
\end{bmatrix}$
the null space is 
$\begin{bmatrix}
1\\
0\\
0
\end{bmatrix}$

Similarly, the eigenvector corresponding to $\lambda=3$ can be found from the null space of $A-\lambda I$:
$\begin{bmatrix}
-1 & 0 & 4\\
0 &0 & 0\\
0 & 1 &-1
\end{bmatrix}$
the null space is 
$\begin{bmatrix}
4\\
1\\
1
\end{bmatrix}$

So the eigenspaces are spanned by the vectors
$E_2: \begin{bmatrix}
1\\
0\\
0
\end{bmatrix}$
and 
$E_3: \begin{bmatrix}
4\\
1\\
1
\end{bmatrix}$

Find the eigenvalues and eigenvectors of
$A=\begin{bmatrix}
4 & 2\\
1 & 3\\
\end{bmatrix}$
 ( university question)

The characteristic polynomial can be found by setting $|A-\lambda I|=0$:
$\begin{vmatrix}
4-\lambda & 2\\
1 & 3-\lambda \\
\end{vmatrix}=0$

$(4-\lambda)(3-\lambda)-2 =0$
$\lambda ^2 - 7 \lambda+10=0$
$(\lambda-5)(\lambda -2)=0$
So
$\lambda=2$ or $\lambda=5$
The eigenvector corresponding to $\lambda=2$ can be found from the null space of $A-\lambda I$:
$\begin{bmatrix}
2 & 2 \\
1 & 1 \\
\end{bmatrix}$
the null space is 
$\begin{bmatrix}
1\\
-1\\
\end{bmatrix}$
The eigenvector corresponding to $\lambda=5$ can be found from the null space of $A-\lambda I$:
$\begin{bmatrix}
-1 & 2 \\
1 & -2 \\
\end{bmatrix}$
the null space is 
$\begin{bmatrix}
2\\
1\\
\end{bmatrix}$

So the eigenvalues are
$\lambda_1=2$ and $\lambda_2=5$
and the corresponding eigenvectors are
$\begin{bmatrix}
1\\
-1\\
\end{bmatrix}$ and
$\begin{bmatrix}
2\\
1\\
\end{bmatrix}$

Find the characteristic equation, eigenvalues, and eigenspaces for the following matrix.
(university question)
$\begin{bmatrix}
2 & 1 & -2\\
1 & 0 & 0\\
1 & 0 & 0
\end{bmatrix}$

The characteristic equation can be found by evaluating $|A-\lambda I|=0$:
$\begin{vmatrix}
2-\lambda & 1 & -2\\
1 & -\lambda & 0\\
1 & 0 & -\lambda
\end{vmatrix}=0$

$(2-\lambda) \lambda^2-1(-\lambda)-2(\lambda)=0$
$2\lambda^2-\lambda^3+\lambda-2 \lambda=0$
$\lambda^3-2\lambda^2+\lambda=0$
$\lambda(\lambda^2-2\lambda+1)=0$    (characteristic equation)
$\lambda(\lambda-1)(\lambda-1)=0$
$\lambda=0$ or $\lambda=1$ are the eigenvalues ($\lambda=1$ has algebraic multiplicity 2)

The eigenvector corresponding to $\lambda=0$ can be found by finding the null space of $A-\lambda I=A$:

$\begin{bmatrix}2 & 1 & -2\\
1 & 0 & 0\\
1 & 0 & 0
\end{bmatrix}x=0$
So $x=\begin{bmatrix}
0\\
2\\
1
\end{bmatrix}$
Similarly, the eigenvector corresponding to $\lambda=1$ is found from
$\begin{bmatrix}
1 & 1 & -2\\
1 & -1 & 0 \\
1 & 0 & -1
\end{bmatrix}x=0$
$x=\begin{bmatrix}
1\\
1\\
1
\end{bmatrix}$

Find the eigenvalues and eigenvectors of (university question)

$A=\begin{bmatrix}
8 & 4\\
2 & 6\\
\end{bmatrix}$

The characteristic polynomial can be found by setting $|A-\lambda I|=0$:
$\begin{vmatrix}
8-\lambda & 4\\
2 & 6-\lambda \\
\end{vmatrix}=0$

$(8-\lambda)(6-\lambda)-8 =0$
$\lambda ^2 - 14 \lambda+40=0$
$(\lambda-4)(\lambda -10)=0$
So
$\lambda=4$ or $\lambda=10$
The eigenvector corresponding to $\lambda=4$ can be found from the null space of $A-\lambda I$:
$\begin{bmatrix}
4 & 4 \\
2 & 2 \\
\end{bmatrix}$
the null space is 
$\begin{bmatrix}
1\\
-1\\
\end{bmatrix}$
The eigenvector corresponding to $\lambda=10$ can be found from the null space of $A-\lambda I$:
$\begin{bmatrix}
-2 & 4 \\
2 & -4 \\
\end{bmatrix}$
the null space is 
$\begin{bmatrix}
2\\
1\\
\end{bmatrix}$

So the eigenvalues are
$\lambda_1=4$ and $\lambda_2=10$
and the corresponding eigenvectors are
$\begin{bmatrix}
1\\
-1\\
\end{bmatrix}$ and
$\begin{bmatrix}
2\\
1\\
\end{bmatrix}$


 We think of probability theory as an extension to logical reasoning Probabilistic modeling  provides a principled foundation for designing machine learning methods. Once we have defined probability distributions corresponding to the uncertainties of the data and our problem, it turns out that there are only two fundamental rules, the sum rule and the product rule. Let $p(x,y)$ is the joint distribution of the two random variables $x, y$. The distributions $p(x)$ and $p(y)$ are the corresponding marginal distributions, and $p(y |x)$ is the conditional distribution of $y$ given $x$. Sum Rule The addition rule states the probability of two events is the sum of the probability that either will happen minus the probability that both will happen. The addition rule is: $P(A∪B)=P(A)+P(B)−P(A∩B)$ Suppose $A$ and $B$ are disjoint, their intersection is empty. Then the probability of their intersection is zero. In symbols:  $P(A∩B)=0$  The addition law then simplifies to: $P(A∪B)=P(A)+P(B)$  wh