Coursera Andrew Ng Machine Learning: Assignment 3 Submission

Only the code added to each file is posted below.

lrCostFunction.m

hx = sigmoid(X*theta);                  % hypothesis h_theta(x) for all examples
J = -(y'*log(hx) + (1-y)'*log(1-hx))/m + lambda/(2*m)*sum(theta(2:end).^2);  % regularized cost, bias term excluded
theta(1) = 0;                           % do not regularize the bias term in the gradient
grad = 1/m*X'*(hx-y) + lambda/m*theta;  % regularized gradient
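As a quick sanity check, the function can be evaluated on a small hand-made test case. This is only a hedged sketch: theta_t, X_t, y_t, and lambda_t are illustrative toy values, not part of the assignment files.

% Hedged sketch: call lrCostFunction on toy data to check it runs end to end.
theta_t  = [-2; -1; 1; 2];                      % arbitrary parameters (bias plus 3 features)
X_t      = [ones(5,1) reshape(1:15, 5, 3)/10];  % 5 toy examples with a bias column
y_t      = [1; 0; 1; 0; 1];                     % binary labels
lambda_t = 3;                                   % regularization strength
[J_t, grad_t] = lrCostFunction(theta_t, X_t, y_t, lambda_t);
fprintf('Toy cost: %f\n', J_t);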

oneVsAll.m

options = optimset('GradObj', 'on', 'MaxIter', 50);
for k = 1:num_labels
    initial_theta = zeros(n+1, 1);
    % train the k-th binary classifier: (y == k) is the one-vs-all label vector
    theta = fmincg(@(t)(lrCostFunction(t, X, (y == k), lambda)), initial_theta, options);
    all_theta(k, :) = theta';
end
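To train all ten digit classifiers, oneVsAll can be invoked the same way the exercise script does. This is a hedged sketch assuming the course data file ex3data1.mat (variables X and y) is on the path:

% Hedged sketch: train the one-vs-all classifiers on the digit data.
load('ex3data1.mat');                  % X: image features, y: labels 1..10 ('0' is stored as 10)
num_labels = 10;
lambda = 0.1;
all_theta = oneVsAll(X, y, num_labels, lambda);
disp(size(all_theta));                 % one row of parameters per class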

predictOneVsAll.m

% pick, for each example, the class whose classifier outputs the highest probability
[~, p] = max(sigmoid(X*all_theta'), [], 2);
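The resulting p can be compared against the true labels to measure training accuracy; a hedged sketch continuing from the oneVsAll call above:

% Hedged sketch: training accuracy of the one-vs-all classifiers.
pred = predictOneVsAll(all_theta, X);
fprintf('Training set accuracy: %f\n', mean(double(pred == y)) * 100);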

predict.m

a1 = [ones(m,1) X];                    % input layer with bias unit
z2 = Theta1*a1';
a2 = [ones(1,m); sigmoid(z2)];         % hidden layer activations with a bias row
z3 = Theta2*a2;
output = z3';                          % sigmoid omitted: it is monotonic, so the argmax is unchanged
[~, p] = max(output, [], 2);           % predicted label = index of the strongest output unit
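For the neural-network part, predict is run with the pretrained weights. This is a hedged sketch assuming the course files ex3data1.mat (X, y) and ex3weights.mat (Theta1, Theta2) are on the path:

% Hedged sketch: forward-propagate the training set through the pretrained network.
load('ex3data1.mat');                  % X, y
load('ex3weights.mat');                % Theta1, Theta2 (pretrained weights)
pred = predict(Theta1, Theta2, X);
fprintf('Training set accuracy: %f\n', mean(double(pred == y)) * 100);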
Original post: https://www.cnblogs.com/lxb0478/p/8317309.html