Univariate Linear Regression Example: Restaurant Expansion

In this part of the exercise, you will implement linear regression with one variable to predict profits for a food truck. Suppose you are the CEO of a restaurant franchise and are considering different cities for opening a new outlet. The chain already has trucks in various cities, and you have data on profits and populations for those cities. You would like to use this data to help you select which city to expand to next.

1. Choosing a Model

The first step of any fitting task is to inspect the samples and choose a model that suits them.

First, load the training set and split it into X (inputs) and y (targets):

data = load('~/Desktop/ex1data1.txt');
X = data(:,1);
y = data(:,2);
m = length(y);   % number of training examples

Next, plot the training set to guide model selection. A small helper function draws the training data:

function plotData(x,y)
figure;   % open a new figure window
plot(x,y,'ks','MarkerSize',6);
title("Training set");
xlabel("Population of City in 10,000s");
ylabel("Profit in $10,000s");
end
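In the main script the helper is then called on the training data:

plotData(X, y);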

The resulting figure is shown below.

[Figure: scatter plot of profit vs. city population for the training set]

The scatter plot suggests a roughly linear relationship, so a univariate linear model is chosen for the fit.
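Concretely, the model is the straight-line hypothesis

\[ h_\theta(x) = \theta_0 + \theta_1 x \]

where \(x\) is a city's population and the parameters \(\theta_0\), \(\theta_1\) are learned from the data.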

2. Gradient Descent

Once the model is fixed, gradient descent is used to fit the parameters.
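Each iteration updates all parameters simultaneously with the rule

\[ \theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)\, x_j^{(i)} \]

where \(\alpha\) is the learning rate and \(m\) is the number of training examples.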

X = [ones(m,1), data(:,1)];   % prepend the intercept feature x0 = 1
theta = zeros(2,1);   % initialize the parameters to a 2x1 zero vector
iterations = 1500;
alpha = 0.01;

The gradient descent loop is wrapped in its own function:

function theta = gradientDescent(X,y,theta,alpha,num_iters)
m = length(y);
for iter = 1:num_iters
	theta = theta - alpha * 1/m * X' * (X * theta - y);   % vectorized update
end
end
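The single line inside the loop is the matrix form of the per-parameter rule above:

\[ \theta := \theta - \frac{\alpha}{m} X^{T} (X\theta - y) \]

so both components of \(\theta\) are updated in one vectorized expression.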

3. Plotting the Results

Call gradientDescent from the main script to fit the parameters, then plot the resulting hypothesis \(h_\theta(x)\):

theta = gradientDescent(X,y,theta,alpha,iterations);
[Figure: training data with the fitted regression line]

You can also render \(J(\theta)\) as a 3-D surface and as a contour plot to build intuition for what gradient descent is doing.
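The cost being visualized is the squared-error cost that the computeCost function below implements:

\[ J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)^2 \]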

function J = computeCost(X,y,theta)
	m = length(y);
	p = X * theta - y;            % residuals h_theta(x) - y
	J = 1/(2*m) * sum(p .* p);    % halved mean of squared residuals
end
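As a quick sanity check, the cost at the initial \(\theta = [0; 0]\) can be evaluated before running gradient descent; for this dataset the exercise's expected value is approximately 32.07:

computeCost(X, y, zeros(2,1))   % ans should be approximately 32.07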
% plot the fitted line h(x) on top of the training data
testx = 1:1:30;
testy = theta(1) + theta(2) * testx;
hold on;
plot(testx,testy);
hold off;
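With \(\theta\) fitted, here is a quick usage sketch (the population values below are illustrative, not from the original post): predict the profit for candidate cities, remembering that inputs are in units of 10,000 people and outputs in units of $10,000.

% Predict profits for populations of 35,000 and 70,000 (illustrative inputs)
predict1 = [1, 3.5] * theta;
predict2 = [1, 7.0] * theta;
fprintf('Population 35,000 -> predicted profit $%.2f\n', predict1 * 10000);
fprintf('Population 70,000 -> predicted profit $%.2f\n', predict2 * 10000);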

% Grid over which we will calculate J
theta0_vals = linspace(-10, 10, 100);
theta1_vals = linspace(-1, 4, 100);

% initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

% Fill out J_vals
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
	  t = [theta0_vals(i); theta1_vals(j)];
	  J_vals(i,j) = computeCost(X, y, t);
    end
end


% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';
% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 20 contours spaced logarithmically between 0.01 and 1000
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
xlabel('\theta_0'); ylabel('\theta_1');
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);
[Figures: surface plot of \(J(\theta)\) and contour plot with the fitted \(\theta\) marked]

4. A Supplementary Technique: Feature Normalization

When features differ greatly in scale, gradient descent converges much faster if each feature is first rescaled to zero mean and unit standard deviation. The helper below does that:

function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X 
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the 
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma. 
%
%               Note that X is a matrix where each column is a 
%               feature and each row is an example. You need 
%               to perform the normalization separately for 
%               each feature. 
mu = mean(X);      % 1 x n row vector of per-feature means
sigma = std(X);    % 1 x n row vector of per-feature standard deviations
for j = 1:size(X, 2)
    X_norm(:,j) = (X(:,j) - mu(j)) / sigma(j);   % normalize each feature column
end
% Hint: You might find the 'mean' and 'std' functions useful.
%       
% =========================================================

end

% Normalize the raw feature before adding the intercept column; the column
% of ones has zero standard deviation and must not be normalized.
[X_norm, mu, sigma] = featureNormalize(data(:,1));
X = [ones(m,1), X_norm];
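One caveat, sketched with an illustrative input value: after normalization, any new input must be scaled with the same mu and sigma computed from the training set before applying \(\theta\).

x_new = 6.5;                        % population in 10,000s (illustrative)
x_scaled = (x_new - mu) ./ sigma;   % reuse the training-set statistics
profit = [1, x_scaled] * theta;     % predict using the normalized input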
Original post: https://www.cnblogs.com/popodynasty/p/13684334.html