We said that two vectors $v$ and $w$ are orthogonal if their dot product, $v \cdot w$, is 0. In $R^2$ or $R^3$ this matches our geometric understanding of orthogonality, and in higher dimensions the idea still applies, even though we can't visualize it.
Consider the vectors $a=\begin{pmatrix}1 \\ 2 \\ 3 \\ 4\end{pmatrix}, b=\begin{pmatrix}2 \\ 1 \\ 0 \\ -1\end{pmatrix}$. These two vectors are orthogonal because their dot product is $1 \cdot 2 + 2 \cdot 1 + 3 \cdot 0 + 4 \cdot (-1) = 0$.
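As a quick numerical check, we can compute the same dot product in code; this is a minimal sketch using NumPy:

```python
import numpy as np

a = np.array([1, 2, 3, 4])
b = np.array([2, 1, 0, -1])

# The dot product is the sum of the elementwise products:
# 1*2 + 2*1 + 3*0 + 4*(-1) = 0
print(np.dot(a, b))  # prints 0, so a and b are orthogonal
```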
Now, we can extend these definitions to subspaces of a vector space.
Definition - Two subspaces $V$ and $W$ of a vector space are orthogonal if every vector $v \in V$ is perpendicular to every vector $w \in W$.
As a simple example, in $R^2$ the span of $\begin{pmatrix}1 \\ 0\end{pmatrix}$ is the set of all vectors of the form $\begin{pmatrix}c \\ 0\end{pmatrix}$, where $c$ is some real number, while the span of $\begin{pmatrix}0 \\ 1\end{pmatrix}$ is the set of all vectors of the form $\begin{pmatrix}0 \\ d\end{pmatrix}$, where $d$ is some real number. The dot product $v \cdot w = v^Tw$ of any vector $v$ in the span of $\begin{pmatrix}1 \\ 0\end{pmatrix}$ with any vector $w$ in the span of $\begin{pmatrix}0 \\ 1\end{pmatrix}$ is $(c \quad 0)\begin{pmatrix}0 \\ d\end{pmatrix}=0+0=0$. So the two spans are orthogonal subspaces of $R^2$. Stated a bit more familiarly, the x-axis and y-axis of the coordinate plane are perpendicular.
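Because the claim is about every vector in each span, a symbolic check makes the point nicely; the sketch below uses SymPy, with the symbols $c$ and $d$ standing in for arbitrary real numbers:

```python
from sympy import symbols, Matrix

c, d = symbols('c d')

v = Matrix([c, 0])   # an arbitrary vector on the x-axis
w = Matrix([0, d])   # an arbitrary vector on the y-axis

# The dot product simplifies to 0 regardless of c and d.
print(v.dot(w))      # prints 0
```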
For a slightly more complicated example, let's examine an $m \times n$ matrix $A$. The row space of $A$ is a subspace of $R^n$, as is the nullspace of $A$. These two subspaces are, in fact, orthogonal, and this follows quickly from the definitions of row space, nullspace, and matrix multiplication. Suppose $x$ is a vector in the nullspace of $A$. This means $Ax=0$. From the definition of matrix multiplication we know:
$Ax=\begin{pmatrix}\text{row } 1\\ \text{row } 2 \\ \vdots \\ \text{row } m\end{pmatrix}\begin{pmatrix}x_1\\ x_2 \\ \vdots \\ x_n\end{pmatrix}=0$
The dot product of $x$ with each of the rows must be 0. As the row space
is the set of linear combinations of the rows, the dot product of $x$ with any
vector in the row space must be 0. So, if $v \in C(A^T)$ and $w \in N(A)$ , we
must have $v^Tw = 0$. This means the row space and nullspace of $A$ are
orthogonal.
Similarly, every vector in the left nullspace of $A$, $N(A^T)$, is perpendicular to every vector in the column space of $A$, $C(A)$. So, the column space of $A$ and the left nullspace of $A$ are orthogonal.
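Both orthogonality statements are easy to spot-check numerically. The following sketch uses SymPy and an arbitrary rank-deficient matrix (chosen so that the left nullspace is nontrivial); it is an illustration, not part of the proof:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [7, 8, 9]])   # rank 2, so N(A) and N(A^T) are nontrivial

# Every nullspace basis vector is orthogonal to every row of A ...
for x in A.nullspace():
    print([A.row(i).dot(x) for i in range(A.rows)])   # [0, 0, 0]

# ... and every left-nullspace basis vector to every column of A.
for y in A.T.nullspace():
    print([A.col(j).dot(y) for j in range(A.cols)])   # [0, 0, 0]
```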
Example - Find a vector perpendicular to the row space of the matrix
$\begin{pmatrix}1 & 3 & 4\\ 5 & 2 & 7\end{pmatrix}$
Any vector perpendicular to the row space must lie in the nullspace of the matrix, so we solve
$\begin{pmatrix}1 & 3 & 4\\ 5 & 2 & 7\end{pmatrix}\begin{pmatrix}x_1\\ x_2\\ x_3\end{pmatrix}=\begin{pmatrix}0\\ 0\end{pmatrix}$
After row reduction:
$\begin{pmatrix}1 & 0 & 1\\ 0 & 1 & 1\end{pmatrix}\begin{pmatrix}x_1\\ x_2\\ x_3\end{pmatrix}=\begin{pmatrix}0\\ 0\end{pmatrix}$
So the vector $\begin{pmatrix}1\\ 1\\ -1\end{pmatrix}$, or any constant multiple of it, is orthogonal to the row space.
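This matches what SymPy's nullspace() returns, up to a scalar multiple (its special solution sets the free variable to 1); a quick check:

```python
from sympy import Matrix

A = Matrix([[1, 3, 4],
            [5, 2, 7]])

# The single nullspace basis vector is (-1, -1, 1), a constant
# multiple of (1, 1, -1), and it is orthogonal to both rows.
for x in A.nullspace():
    print(x.T)                                   # Matrix([[-1, -1, 1]])
    print([A.row(i).dot(x) for i in range(2)])   # [0, 0]
```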
Orthogonal Complement
If we're given a subspace $V$ of a vector space and another subspace $W$ orthogonal to it, a natural question to ask is whether $W$ is the largest subspace orthogonal to $V$. It turns out that the largest subspace orthogonal to $V$ is unique, and it is defined as the orthogonal complement of $V$.
Definition - The orthogonal complement of a subspace $V$ contains every vector that is perpendicular to $V$. This orthogonal subspace is denoted by $V^\perp$ (pronounced “V perp”).

We saw above that for a matrix $A$ the nullspace $N(A)$ is perpendicular to the row space $C(A^T)$. It turns out the nullspace is in fact the orthogonal complement of the row space. We can see this by noting that if $A$ is an $m \times n$ matrix, both the row space and the nullspace are subspaces of $R^n$. The dimension of the nullspace is $n - r$, where $r$ is the rank of $A$, and $r$ is also the dimension of the row space. If there were a vector $x$ orthogonal to the row space but not in the nullspace, then the dimension of $C(A^T)^\perp$ would be at least $n - r + 1$. But this would be too large for both $C(A^T)$ and $C(A^T)^\perp$ to fit in $R^n$. So the nullspace is the largest subspace perpendicular to the row space, and we have $C(A^T)^\perp = N(A)$. Similarly, $N(A)^\perp = C(A^T)$.
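The dimension count behind this argument is easy to check numerically. Here is a minimal sketch (using SymPy and the small matrix from the earlier example; any matrix would do):

```python
from sympy import Matrix

A = Matrix([[1, 3, 4],
            [5, 2, 7]])

n = A.cols                      # both C(A^T) and N(A) live in R^n
r = A.rank()                    # dim C(A^T)
nullity = len(A.nullspace())    # dim N(A)

# The dimensions add to n, so the row space and nullspace
# together exactly fill R^n as orthogonal complements.
print(r, nullity, r + nullity == n)   # 2 1 True
```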
Example - Calculate $V^\perp$ if $V$ is the subspace of $R^4$ spanned by the vectors
$\begin{pmatrix}1\\ 2\\ 5\\ 4\end{pmatrix}, \begin{pmatrix}3\\ 7\\ 3\\ 12\end{pmatrix}$
Consider the matrix $A$ with these vectors as its rows:
$A = \begin{pmatrix}1 & 2 & 5 & 4\\ 3 & 7 & 3 & 12\end{pmatrix}$
Then $V$ is the row space of $A$, so $V^\perp = C(A^T)^\perp = N(A)$.
After row reduction:
$\begin{pmatrix}1 & 0 & 29 & 4\\ 0 & 1 & -12 & 0\end{pmatrix}$
So the nullspace has dimension 2 and is spanned by the vectors
$\begin{pmatrix}29\\ -12\\ -1\\ 0\end{pmatrix}, \begin{pmatrix}4\\ 0\\ 0\\ -1\end{pmatrix}$
and $V^\perp$ is the span of these two vectors.
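As a final check, SymPy recovers the same plane (its basis vectors are scalar multiples of the ones found by hand), and each basis vector is orthogonal to both spanning vectors of $V$:

```python
from sympy import Matrix

A = Matrix([[1, 2, 5, 4],
            [3, 7, 3, 12]])

# A basis for V-perp = N(A); SymPy returns (-29, 12, 1, 0) and
# (-4, 0, 0, 1), scalar multiples of the hand-computed vectors.
for x in A.nullspace():
    print(x.T, [A.row(i).dot(x) for i in range(2)])   # dots are [0, 0]
```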