# Machine Learning - III. Linear Algebra Review (Week 1, Optional)

What are matrices (矩阵)?

A matrix is just another way of saying a 2D, or two-dimensional, array.

The dimension of a matrix is written as the number of rows times the number of columns.

A matrix with 4 rows and 2 columns is written out as R 4 by 2; concretely, people will sometimes say this matrix is an element of the set R^(4x2).

Matrix elements (entries of the matrix): the numbers inside the matrix.

A matrix gives you a way to quickly organize, index, and access lots of data.
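The points above can be sketched in code. This is a minimal illustration using plain Python nested lists (the course itself uses Octave; the specific numbers here are made up for demonstration):

```python
# A 4 x 2 matrix as a nested list: 4 rows, 2 columns (an element of R^(4x2))
A = [[1402, 191],
     [1371, 821],
     [949, 1437],
     [147, 1448]]

# The dimension "4 by 2" is rows x columns
rows, cols = len(A), len(A[0])
print(rows, cols)  # 4 2

# Course notation A_ij is 1-indexed; Python lists are 0-indexed,
# so the entry A_{1,1} is A[0][0] and A_{4,2} is A[3][1]
print(A[0][0])  # 1402
print(A[3][1])  # 1448
```

This shows how the matrix lets you index and access any entry directly by its row and column.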

What are vectors (向量)?

A vector turns out to be a special case of a matrix: a matrix that has only 1 column, so an n x 1 matrix. (All vectors in this course are column vectors.)

Dimension: if a vector has n = 4 elements, we also call it a four-dimensional vector, which just means it is a vector with four elements, four numbers in it.

We refer to this as a vector in the set R^4.

Notation (conventions for symbols):

Throughout the rest of these videos on linear algebra review, 1-indexed vectors will be used. (Most vector subscripts in this course start from 1.)

When talking about machine learning applications, we will sometimes explicitly say when we need to switch to 0-indexed vectors. (Discussions of machine learning applications sometimes use subscripts starting from 0.)

Finally, by convention, we use upper case to refer to matrices, so capital letters like A, B, C. And we'll usually use lowercase, like a, b, x, y, to refer to numbers (raw numbers or scalars) or to vectors.
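These vector conventions can be sketched the same way. A minimal illustration in plain Python (values are made up; the course uses Octave), contrasting the course's 1-indexing with Python's 0-indexing:

```python
# A four-dimensional column vector: a 4 x 1 matrix, an element of R^4
# (lowercase name y, following the course's matrix/vector naming convention)
y = [[460], [232], [315], [178]]

# n = 4 elements, so this is a "four-dimensional vector"
n = len(y)
print(n)  # 4

# Course notation is 1-indexed: y_1 is the first element, 460.
# In 0-indexed Python that is y[0][0].
print(y[0][0])  # 460
```

Keeping the vector as an n x 1 nested list makes the "a vector is a special case of a matrix" point explicit: the same row/column indexing used for matrices still applies.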
