
Posts

Showing posts from April, 2020

Linear Regression

Linear regression is a method for modeling the relationship between two scalar values: the input variable $x$ and the output variable $y$. The model assumes that $y$ is a linear function, or weighted sum, of the input variable: $y = f(x)$. Or, stated with the coefficients: $y = b_0 + b_1 x_1$. The model can also be used with multiple input variables, called multivariate linear regression: $y = b_0 + b_1 x_1 + b_2 x_2 + \dots + b_n x_n$. The objective of creating a linear regression model is to find the values of the coefficients ($b$) that minimize the error in the prediction of the output variable $y$.

Matrix Formulation of Linear Regression

Linear regression can be stated using matrix notation; for example: $y = X \cdot b$, where $X$ is the input data with one column per data feature, $b$ is a vector of coefficients, and $y$ is a vector of output variables, one for each row in $X$. Reformulated, the problem becomes a system of linea...
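As a minimal sketch of this matrix formulation (with made-up data and illustrative variable names), the coefficients $b$ can be estimated with NumPy's least-squares routine:

# sketch: solve y = X . b for b using least squares (illustrative data only)
from numpy import array, hstack, ones
from numpy.linalg import lstsq

x = array([[1.0], [2.0], [3.0], [4.0]])      # one input feature
X = hstack([ones((4, 1)), x])                # prepend a column of ones for b0
y = array([3.0, 5.0, 7.0, 9.0])              # output variable

b, residuals, rank, sv = lstsq(X, y, rcond=None)
print(b)              # approximately [1. 2.], i.e. y = 1 + 2x
print(X.dot(b))       # predictions for each row of X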

PCA - Principal Component Analysis

An important machine learning method for dimensionality reduction is called Principal Component Analysis. It is a method that uses simple matrix operations from linear algebra and statistics to calculate a projection of the original data into the same number of dimensions or fewer. In this tutorial, you will discover the Principal Component Analysis machine learning method for dimensionality reduction and how to implement it from scratch in Python. Principal Component Analysis, or PCA for short, is a method for reducing the dimensionality of data. It can be thought of as a projection method where data with $m$ columns (features) is projected into a subspace with $m$ or fewer columns, whilst retaining the essence of the original data. The PCA method can be described and implemented using the tools of linear algebra. PCA is an operation applied to a dataset, represented by an $n \times m$ matrix $A$, that results in a projection of $A$ which we will call $B$. Let's walk through the steps o...
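As a compact sketch of those steps on a tiny made-up matrix (center the columns, compute the covariance matrix, take its eigendecomposition, then project the data):

# sketch of PCA from scratch with NumPy on a small example matrix
from numpy import array, mean, cov
from numpy.linalg import eig

A = array([[1, 2], [3, 4], [5, 6]])    # n x m data matrix
M = mean(A, axis=0)                    # column means
C = A - M                              # center the columns
V = cov(C.T)                           # covariance matrix of the centered data
values, vectors = eig(V)               # eigenvalues (variances) and eigenvectors (directions)
P = vectors.T.dot(C.T)                 # project the centered data
print(P.T)                             # B, the projection of A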

Introduction to Statistics with Python

Fundamental statistics are useful tools in applied machine learning for better understanding your data. They are also the tools that provide the foundation for more advanced linear algebra operations and machine learning methods, such as the covariance matrix and principal component analysis respectively. As such, it is important to have a strong grip on fundamental statistics in the context of linear algebra notation. In this tutorial, you will discover how fundamental statistical operations work and how to implement them using NumPy.

Expected Value and Mean

In probability, the average value of some random variable $X$ is called the expected value or the expectation. The expected value uses the notation $E$ with square brackets around the name of the variable; for example: $E[X]$. It is calculated as the probability-weighted sum of the values that can be drawn: $E[X] = x_1 \cdot p_1 + x_2 \cdot p_2 + x_3 \cdot p_3 + \dots + x_n \cdot p_n$. In simple cases, such as the flipping of a coin or rolling a d...
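As a small illustration of that probability-weighted sum, the sketch below computes $E[X]$ for a fair six-sided die and the arithmetic mean of a sample; the numbers are made up for the example:

# sketch: expected value as a probability-weighted sum, and the sample mean
from numpy import array, mean

x = array([1, 2, 3, 4, 5, 6])     # values the variable can take
p = array([1/6] * 6)              # their probabilities
print((x * p).sum())              # E[X] = 3.5 for a fair die

sample = array([2, 4, 4, 4, 5, 5, 7, 9])
print(mean(sample))               # the sample mean estimates the expected value: 5.0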

Matrix Decomposition: LU, QR, Cholesky, Eigen, SVD

Matrix Decomposition

Many complex matrix operations cannot be solved efficiently or with stability using the limited precision of computers. Matrix decompositions are methods that reduce a matrix into constituent parts that make it easier to calculate more complex matrix operations. Matrix decomposition methods, also called matrix factorization methods, are a foundation of linear algebra in computers, even for basic operations such as solving systems of linear equations, calculating the inverse, and calculating the determinant of a matrix. In this tutorial, you will discover matrix decompositions and how to calculate them in Python. Decomposition is an approach that can simplify more complex matrix operations, which can be performed on the decomposed matrix rather than on the original matrix itself. A common analogy for matrix decomposition is the factoring of numbers, such as the factoring of 10 into $2 \times 5$. For this reason, matrix decomposition is also called matrix factorization. Like fac...
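As one concrete illustration, here is a short sketch of the LU decomposition using SciPy on a small example matrix, followed by reconstructing the original matrix from its factors:

# sketch: LU decomposition with scipy.linalg.lu
from numpy import array
from scipy.linalg import lu

A = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
P, L, U = lu(A)        # permutation, lower-triangular, upper-triangular factors
print(L)
print(U)
print(P.dot(L).dot(U)) # reconstructs the original matrix A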

Matrices and Matrix Arithmetic

Matrices and Matrix Arithmetic

Matrices are a foundational element of linear algebra. Matrices are used throughout the field of machine learning in the description of algorithms and processes, such as the input data variable (X) when training an algorithm. A matrix is a two-dimensional array of scalars with one or more columns and one or more rows.

Defining a Matrix

We can represent a matrix in Python using a two-dimensional NumPy array. A NumPy array can be constructed given a list of lists. For example, below is a 2-row, 3-column matrix.

# create matrix
from numpy import array
A = array([[1, 2, 3], [4, 5, 6]])
print(A)

o/p:
[[1 2 3]
 [4 5 6]]

Matrix Addition

# matrix addition
from numpy import array
# define first matrix
A = array([[1, 2, 3], [4, 5, 6]])
print("Matrix A")
print(A)
# define second matrix
B = array([[1, 2, 3], [4, 5, 6]])
print("Matrix B")
print(B)
# add matrices
C = A + B
print("Matrix C")
print(C)

o/p:
Matrix A
[[1...
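Beyond addition, here is a brief sketch of matrix-matrix multiplication with NumPy; the matrices below are made up for illustration:

# sketch: matrix multiplication with the dot product rule
from numpy import array

A = array([[1, 2], [3, 4], [5, 6]])   # 3 x 2
B = array([[1, 2], [3, 4]])           # 2 x 2; columns of A must match rows of B
C = A.dot(B)
print(C)
# [[ 7 10]
#  [15 22]
#  [23 34]]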

Vectors and Vector Arithmetic, Dot Product, Vector Norm

Vectors are a foundational element of linear algebra. Vectors are used throughout the field of machine learning in the description of algorithms and processes, such as the target variable (y) when training an algorithm. A vector is a tuple of one or more values called scalars. Vectors are often represented using a lowercase character such as $v$; for example: $v = (v_1, v_2, v_3)$, where $v_1$, $v_2$, $v_3$ are scalar values, often real values. Vectors are also shown using a vertical representation, or a column; for example: $v = \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix}$.

Defining a Vector

We can represent a vector in Python as a NumPy array. A NumPy array can be created from a list of numbers. For example, below we define a vector with a length of 3 and the integer values 1, 2 and 3.

from numpy import array
# define vector
v = array([1, 2, 3])
print(v)

[1 2 3]

Vector Arithmetic

Vector Addition
Two vectors of equal length can be added together to create a new third vect...
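To match the dot product and vector norm named in the title, a short illustrative sketch with NumPy; the example vectors are made up:

# sketch: dot product and L2 (Euclidean) norm of a vector
from numpy import array, dot
from numpy.linalg import norm

a = array([1, 2, 3])
b = array([1, 2, 3])
print(dot(a, b))   # 1*1 + 2*2 + 3*3 = 14
print(norm(a))     # sqrt(14), about 3.742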

Linear Algebra and Machine Learning: The Importance

Linear algebra is a field of mathematics that could be called the mathematics of data. It is undeniably a pillar in the field of machine learning, and many recommend it as a prerequisite subject to study prior to getting started in machine learning. I recommend a breadth-first approach to getting started in applied machine learning. I call this approach a results-first approach. It is where you start by learning and practicing the steps for working through a predictive modeling problem end-to-end (e.g. how to get results) with a tool (such as scikit-learn and Pandas in Python). This process then provides the skeleton and context for progressively deepening your knowledge, such as how algorithms work and eventually the math that underlies them. After you know how to work through a predictive modeling problem, let's look at why you should deepen your understanding of linear algebra. You should understand the following concepts thoroughly, as well as how the...