Learning TFlearn - regression

import tflearn


# Training data for a one-dimensional linear regression
X = [3.3,4.4,5.5,6.71,6.93,4.168,9.779,6.182,7.59,2.167,7.042,10.791,5.313,7.997,5.654,9.27,3.1]
Y = [1.7,2.76,2.09,3.19,1.694,1.573,3.366,2.596,2.53,1.221,2.827,3.465,1.65,2.904,2.42,2.94,1.3]


input_ = tflearn.input_data(shape=[None])
linear = tflearn.single_unit(input_)
# 'R2' computes the coefficient of determination, used to evaluate linear regression
# available optimizers include sgd, rmsprop, adam, momentum, adagrad, ftrl, adadelta, etc.
regression = tflearn.regression(linear, optimizer='sgd', loss='mean_square', metric='R2',
                                learning_rate=0.01)


# Wrap the regression layer in a DNN model for training
model = tflearn.DNN(regression)
# n_epoch=1000 trains for 1000 epochs (the default is 10); show_metric=True prints the
# metric during training, and snapshot_epoch=False skips the end-of-epoch evaluation snapshot
model.fit(X, Y, n_epoch=1000, show_metric=True, snapshot_epoch=False)


print('\nRegression result:')
print('Y = ' + str(model.get_weights(linear.W)) + ' * X + ' + str(model.get_weights(linear.b)))


print('\nTest prediction for x = 3.2, 3.3, 3.4:')
print(model.predict([3.2, 3.3, 3.4]))
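The 'R2' metric passed to tflearn.regression is the coefficient of determination. As a minimal sketch of the standard definition (1 - SS_res / SS_tot; TFLearn's internal implementation may differ in details), the `r2_score` helper below is hypothetical, not a TFLearn API:

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)         # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# A perfect fit gives R2 = 1.0; predicting the mean everywhere gives R2 = 0.0
print(r2_score([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # -> 1.0
print(r2_score([1.0, 2.0, 3.0], [2.0, 2.0, 2.0]))  # -> 0.0
```

An R2 close to 1 means the fitted line explains most of the variance in Y, which is why it is a natural metric for this regression example.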

The training result is only approximate: the slope of the fitted line differs slightly on each run.
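For comparison, the exact line that SGD is converging toward can be computed in closed form with ordinary least squares; a sketch in plain NumPy on the same data:

```python
import numpy as np

X = [3.3, 4.4, 5.5, 6.71, 6.93, 4.168, 9.779, 6.182, 7.59, 2.167,
     7.042, 10.791, 5.313, 7.997, 5.654, 9.27, 3.1]
Y = [1.7, 2.76, 2.09, 3.19, 1.694, 1.573, 3.366, 2.596, 2.53, 1.221,
     2.827, 3.465, 1.65, 2.904, 2.42, 2.94, 1.3]

x = np.asarray(X)
y = np.asarray(Y)
# Closed-form least squares: slope = cov(x, y) / var(x)
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()
print('Y = %.4f * X + %.4f' % (slope, intercept))  # slope ~0.25, intercept ~0.80
```

Each SGD run should land near these values; the small run-to-run differences come from random weight initialization.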

For the model.fit function, the Keras documentation is a useful reference: http://keras-cn.readthedocs.io/en/latest/

Original post: https://www.cnblogs.com/AlexHaiY/p/9318239.html