Machine Learning Lecture 2 Notes

Lecture 2
  Linear regression, gradient descent, and the normal equations
regression problem: the target variable is continuous
m: the number of training examples
x: input variables (features)
y: target variable
(x, y) is called one training example
For linear regression, h_{\theta}(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots
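A minimal sketch of evaluating this hypothesis in NumPy (my own illustration, not part of the lecture), using the usual convention x_0 = 1 so that h_{\theta}(x) = \theta^T x:

```python
import numpy as np

def hypothesis(theta, x):
    # h_theta(x) = theta_0 + theta_1*x_1 + ... = theta^T x,
    # with x_0 = 1 prepended so the intercept theta_0 is handled uniformly.
    x = np.concatenate(([1.0], x))
    return theta @ x
```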
"h" is called parameter, aim: use the training set to get the proper parameter
minimize J(\theta) = {1\over2}\sum_{i=1}^m {(h_{\theta}(x^{(i)})-y^{(i)})^2}
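As an illustration (again my own sketch, not from the lecture), this cost and one batch gradient-descent step, assuming a design matrix X whose first column is all ones and an illustrative learning rate alpha:

```python
import numpy as np

def cost(theta, X, y):
    # J(theta) = (1/2) * sum_i (h_theta(x^(i)) - y^(i))^2
    residuals = X @ theta - y
    return 0.5 * residuals @ residuals

def gradient_step(theta, X, y, alpha):
    # Batch update: theta := theta - alpha * sum_i (h_theta(x^(i)) - y^(i)) * x^(i),
    # which follows from differentiating J(theta) with respect to each theta_j.
    grad = X.T @ (X @ theta - y)
    return theta - alpha * grad
```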
One problem with gradient descent is that a slight change in the starting point can make you end up at a completely different local optimum. (For the linear-regression cost J(\theta) this is not an issue: J is convex, so it has a single global minimum.)
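The normal equations from the title give the minimizer of J(\theta) in closed form, \theta = (X^T X)^{-1} X^T y; a sketch under the same assumed design-matrix setup:

```python
import numpy as np

def normal_equations(X, y):
    # Solve X^T X theta = X^T y directly; np.linalg.solve is preferred
    # over forming the explicit inverse for numerical stability.
    return np.linalg.solve(X.T @ X, X.T @ y)
```

Unlike gradient descent, this needs no learning rate or initialization, at the cost of solving a linear system that grows with the number of features.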
Original post: https://www.cnblogs.com/evelyn/p/4692241.html