We found that in an $n$-dimensional vector space we need $n$ basis vectors, i.e., $n$ vectors that are linearly independent. We now discuss the special case where the basis vectors are orthogonal to each other and each basis vector has length 1. We then call this basis an orthonormal basis.
Consider an $n$-dimensional vector space $V$ and a basis $\{b_1,\ldots,b_n\}$ of $V$. If
$\langle b_i, b_j \rangle = 0$ for $i \ne j$  (1)
$\langle b_i, b_i \rangle = 1$  (2)
for all $i,j=1,\ldots,n$, then the basis is called an orthonormal basis (ONB). If only equation (1) is satisfied, then the basis is called an orthogonal basis. Equation (2) implies that every basis vector has length/norm 1.
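To make the definition concrete, here is a minimal NumPy sketch (the helper name `is_orthonormal` and the tolerance are my own choices, not from the text) that checks conditions (1) and (2) for the columns of a matrix all at once, using the fact that together they say $B^TB = I$:

```python
import numpy as np

def is_orthonormal(B, tol=1e-10):
    """Check whether the columns of B form an orthonormal set w.r.t.
    the dot product: B^T B must be the identity, which encodes
    <b_i, b_j> = 0 for i != j  and  <b_i, b_i> = 1."""
    B = np.asarray(B, dtype=float)
    gram = B.T @ B  # matrix of all pairwise inner products
    return np.allclose(gram, np.eye(B.shape[1]), atol=tol)
```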
Note: we can use Gaussian elimination to find a basis for a vector space spanned by a set of vectors. Assume we are given a set $\{\tilde{b}_1,\ldots,\tilde{b}_n\}$ of non-orthogonal and unnormalized basis vectors. We concatenate them into a matrix $\tilde{B} = [\tilde{b}_1,\ldots,\tilde{b}_n]$ and apply Gaussian elimination to the augmented matrix $[\tilde{B}\tilde{B}^T \mid \tilde{B}]$ to obtain an orthonormal basis. This constructive way to iteratively build an orthonormal basis $\{b_1,\ldots,b_n\}$ is called the Gram-Schmidt process.
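The note above describes the Gaussian-elimination route; as an illustration, here is a minimal sketch of the classical projection-based Gram-Schmidt iteration instead (the function name and error handling are my own):

```python
import numpy as np

def gram_schmidt(B_tilde):
    """Orthonormalize the columns of B_tilde (non-orthogonal,
    unnormalized basis vectors), one column at a time."""
    basis = []
    for v in np.asarray(B_tilde, dtype=float).T:
        # Remove the components along the already-built basis vectors.
        for b in basis:
            v = v - (b @ v) * b
        # Normalize to length 1; a (near-)zero residual means the
        # input vectors were linearly dependent.
        norm = np.linalg.norm(v)
        if norm < 1e-12:
            raise ValueError("input vectors are linearly dependent")
        basis.append(v / norm)
    return np.column_stack(basis)
```

For instance, feeding in the columns $(1,1)^T$ and $(1,0)^T$ yields exactly the basis $b_1, b_2$ of the example below.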
Example:
The canonical/standard basis for a Euclidean vector space $\mathbb{R}^n$ is an orthonormal basis, where the inner product is the dot product of vectors.
In $\mathbb{R}^2$, the vectors
$b_1=\frac{1}{\sqrt{2}}\begin{bmatrix} 1\\ 1 \end{bmatrix}, \qquad b_2=\frac{1}{\sqrt{2}}\begin{bmatrix} 1\\ -1 \end{bmatrix}$
form an orthonormal basis since $b_1^T b_2 = 0$ and $\left\| b_1 \right\| = 1 = \left\| b_2 \right\|$.
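A quick numerical check of this example (assuming NumPy; printed values may differ by floating-point rounding):

```python
import numpy as np

b1 = np.array([1.0, 1.0]) / np.sqrt(2)
b2 = np.array([1.0, -1.0]) / np.sqrt(2)

print(b1 @ b2)             # 0.0 -> b1 and b2 are orthogonal
print(np.linalg.norm(b1))  # 1.0 -> unit norm
print(np.linalg.norm(b2))  # 1.0 -> unit norm
```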