We said that two vectors $v$ and $w$ are orthogonal if their dot product, $v \cdot w$, is $0$. In $R^2$ or $R^3$ this matches our geometric understanding of orthogonality, and in higher dimensions the idea still applies, even though we can't visualize it. Consider the vectors

$a=\begin{pmatrix} 1 \\ 2 \\ 3 \\ 4 \end{pmatrix}, \quad b=\begin{pmatrix} 2 \\ 1 \\ 0 \\ -1 \end{pmatrix}$

These two vectors are orthogonal because their dot product is

$1 \cdot 2 + 2 \cdot 1 + 3 \cdot 0 + 4 \cdot (-1) = 0$

Now we can extend this definition to subspaces of a vector space.

Definition - Two subspaces $V$ and $W$ of a vector space are orthogonal if every vector $v \in V$ is perpendicular to every vector $w \in W$.

As a simple example, in $R^2$ the span of $\begin{pmatrix}1 \\ 0 \end{pmatrix}$ is the set of all vectors of the form $\begin{pmatrix}c \\ 0 \end{pmatrix}$, where $c$ is some real number, while the span of $\begin{pmatrix}0 \\ 1 \end{pmatrix}$ is the set of all vectors of the form $\begin{pmatrix}0 \\ d \end{pmatrix}$, where $d$ is some real number. These two subspaces are orthogonal, since the dot product of any pair of such vectors is $c \cdot 0 + 0 \cdot d = 0$.
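We can verify these orthogonality checks in Python. The sketch below assumes NumPy is available; it computes the dot product of $a$ and $b$ from the example above, and then checks a pair of vectors from the two spans in $R^2$ for an arbitrary choice of $c$ and $d$.

```python
import numpy as np

# the two 4-dimensional vectors from the example above
a = np.array([1, 2, 3, 4])
b = np.array([2, 1, 0, -1])

# dot product: 1*2 + 2*1 + 3*0 + 4*(-1) = 0, so a and b are orthogonal
print(np.dot(a, b))  # 0

# a vector from span{(1, 0)} and a vector from span{(0, 1)};
# c and d are arbitrary real numbers chosen for illustration
c, d = 3.0, -5.0
u = np.array([c, 0.0])
w = np.array([0.0, d])

# c*0 + 0*d = 0 for every choice of c and d,
# so the two subspaces are orthogonal
print(np.dot(u, w))  # 0.0
```

Changing $c$ and $d$ to any other values still gives a dot product of $0$, which is exactly what the definition of orthogonal subspaces requires.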
This blog is written for the following two KTU courses using Python: CST284 - Mathematics for Machine Learning (KTU Minor course) and CST294 - Computational Fundamentals for Machine Learning (KTU Honours course). Queries can be sent to Dr Binu V P, 9847390760.