TensorFlow 2 Knowledge Summary --- 4. Logistic Regression Example

1. Summary

One-sentence summary:

For logistic regression, just set the loss function to binary_crossentropy: model.compile(optimizer='adam',loss='binary_crossentropy',metrics=['acc'])

1. How do you plot the epochs and loss from a TensorFlow 2 History object?

plt.plot(history.epoch,history.history.get('loss'))

# epoch on the x-axis, loss on the y-axis

2. Logistic Regression Example

In [1]:
import tensorflow as tf
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
In [2]:
# header=None means the file has no header row; without it, the first data row would become the column names
data = pd.read_csv('dataset/credit-a.csv',header=None)
data
Out[2]:
     0     1      2    3   4   5   6     7    8   9   10  11  12   13    14   15
0 0 30.83 0.000 0 0 9 0 1.25 0 0 1 1 0 202 0.0 -1
1 1 58.67 4.460 0 0 8 1 3.04 0 0 6 1 0 43 560.0 -1
2 1 24.50 0.500 0 0 8 1 1.50 0 1 0 1 0 280 824.0 -1
3 0 27.83 1.540 0 0 9 0 3.75 0 0 5 0 0 100 3.0 -1
4 0 20.17 5.625 0 0 9 0 1.71 0 1 0 1 2 120 0.0 -1
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
648 0 21.08 10.085 1 1 11 1 1.25 1 1 0 1 0 260 0.0 1
649 1 22.67 0.750 0 0 0 0 2.00 1 0 2 0 0 200 394.0 1
650 1 25.25 13.500 1 1 13 7 2.00 1 0 1 0 0 200 1.0 1
651 0 17.92 0.205 0 0 12 0 0.04 1 1 0 1 0 280 750.0 1
652 0 35.00 3.375 0 0 0 1 8.29 1 1 0 0 0 0 0.0 1

653 rows × 16 columns
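
To see why header=None matters, here is a minimal sketch of the pitfall (same CSV path assumed, not part of the original notebook):

# Without header=None, pandas infers a header and promotes the first
# data row to column names, silently dropping one record:
bad = pd.read_csv('dataset/credit-a.csv')
print(len(bad))     # 652 rows instead of 653
print(bad.columns)  # values from the first record, not real column names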

Columns 0-14 are the features; column 15 is the target value.

In [3]:
# first 5 rows
data.head()
Out[3]:
     0     1      2    3   4   5   6     7    8   9   10  11  12   13    14   15
0 0 30.83 0.000 0 0 9 0 1.25 0 0 1 1 0 202 0.0 -1
1 1 58.67 4.460 0 0 8 1 3.04 0 0 6 1 0 43 560.0 -1
2 1 24.50 0.500 0 0 8 1 1.50 0 1 0 1 0 280 824.0 -1
3 0 27.83 1.540 0 0 9 0 3.75 0 0 5 0 0 100 3.0 -1
4 0 20.17 5.625 0 0 9 0 1.71 0 1 0 1 2 120 0.0 -1
In [4]:
data.iloc[:,-1].value_counts()
Out[4]:
 1    357
-1    296
Name: 15, dtype: int64
In [5]:
# before the comma: all rows; after the comma: every column except the last
x = data.iloc[:,:-1]
# relabel -1 as 0
y = data.iloc[:,-1].replace(-1,0)
# 1/-1 labels suit algorithms like SVM; a sigmoid output expects 0/1
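
An equivalent relabeling, shown as a sketch (the .replace call above produces the same result):

y_alt = data.iloc[:, -1].map({-1: 0, 1: 1})  # explicit mapping of both labels
# (y_alt == y).all() would be True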
In [6]:
x
Out[6]:
     0     1      2    3   4   5   6     7    8   9   10  11  12   13    14
0 0 30.83 0.000 0 0 9 0 1.25 0 0 1 1 0 202 0.0
1 1 58.67 4.460 0 0 8 1 3.04 0 0 6 1 0 43 560.0
2 1 24.50 0.500 0 0 8 1 1.50 0 1 0 1 0 280 824.0
3 0 27.83 1.540 0 0 9 0 3.75 0 0 5 0 0 100 3.0
4 0 20.17 5.625 0 0 9 0 1.71 0 1 0 1 2 120 0.0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
648 0 21.08 10.085 1 1 11 1 1.25 1 1 0 1 0 260 0.0
649 1 22.67 0.750 0 0 0 0 2.00 1 0 2 0 0 200 394.0
650 1 25.25 13.500 1 1 13 7 2.00 1 0 1 0 0 200 1.0
651 0 17.92 0.205 0 0 12 0 0.04 1 1 0 1 0 280 750.0
652 0 35.00 3.375 0 0 0 1 8.29 1 1 0 0 0 0 0.0

653 rows × 15 columns

In [7]:
y
Out[7]:
0      0
1      0
2      0
3      0
4      0
      ..
648    1
649    1
650    1
651    1
652    1
Name: 15, Length: 653, dtype: int64
In [8]:
model = tf.keras.Sequential()
# first hidden layer: 4 neurons
model.add(tf.keras.layers.Dense(4,input_shape=(15,),activation='relu'))
# later layers don't need input_shape; Keras infers it automatically
# second hidden layer
model.add(tf.keras.layers.Dense(4,activation='relu'))
# output layer: sigmoid for binary classification
model.add(tf.keras.layers.Dense(1,activation='sigmoid'))

model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 4)                 64        
_________________________________________________________________
dense_1 (Dense)              (None, 4)                 20        
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 5         
=================================================================
Total params: 89
Trainable params: 89
Non-trainable params: 0
_________________________________________________________________

The first layer has 64 parameters: 15*4 weights + 4 biases.

The second layer has 20: 4*4 + 4.
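
A quick check of that arithmetic (dense_params is a hypothetical helper, not a Keras API): every Dense layer has inputs*units weights plus units biases.

def dense_params(n_inputs, n_units):
    return n_inputs * n_units + n_units  # weights + biases

print(dense_params(15, 4))  # 64: first hidden layer
print(dense_params(4, 4))   # 20: second hidden layer
print(dense_params(4, 1))   # 5:  output layer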

In [9]:
# metrics=['acc']: report accuracy after each epoch
model.compile(optimizer='adam',loss='binary_crossentropy',metrics=['acc'])

history = model.fit(x,y,epochs=100) # epochs: number of full passes over the training data
Epoch 1/100
21/21 [==============================] - 0s 712us/step - loss: 125.1340 - acc: 0.5283
Epoch 2/100
21/21 [==============================] - 0s 665us/step - loss: 73.5050 - acc: 0.5345
Epoch 3/100
21/21 [==============================] - 0s 714us/step - loss: 23.0695 - acc: 0.5865
Epoch 4/100
21/21 [==============================] - 0s 760us/step - loss: 3.7062 - acc: 0.6799
Epoch 5/100
21/21 [==============================] - 0s 807us/step - loss: 3.6635 - acc: 0.6738
Epoch 6/100
21/21 [==============================] - 0s 950us/step - loss: 3.1660 - acc: 0.6662
Epoch 7/100
21/21 [==============================] - 0s 855us/step - loss: 2.6206 - acc: 0.6662
......
Epoch 95/100
21/21 [==============================] - 0s 760us/step - loss: 0.4830 - acc: 0.7948
Epoch 96/100
21/21 [==============================] - 0s 807us/step - loss: 0.6317 - acc: 0.7887
Epoch 97/100
21/21 [==============================] - 0s 712us/step - loss: 0.4358 - acc: 0.8086
Epoch 98/100
21/21 [==============================] - 0s 760us/step - loss: 0.4004 - acc: 0.8300
Epoch 99/100
21/21 [==============================] - 0s 712us/step - loss: 0.3916 - acc: 0.8423
Epoch 100/100
21/21 [==============================] - 0s 665us/step - loss: 0.3911 - acc: 0.8438
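Once training finishes, a quick sanity check (a sketch, not in the original notebook) is to evaluate the model; note this is accuracy on the training data itself, not a measure of generalization:

loss, acc = model.evaluate(x, y, verbose=0)
print(loss, acc)  # should roughly match the final epoch's numbers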
In [10]:
history.history.keys()
Out[10]:
dict_keys(['loss', 'acc'])
In [11]:
# epoch on the x-axis, loss on the y-axis
plt.plot(history.epoch,history.history.get('loss'))
Out[11]:
[<matplotlib.lines.Line2D at 0x210dcf57908>]
In [12]:
plt.plot(history.epoch,history.history.get('acc'))
Out[12]:
[<matplotlib.lines.Line2D at 0x210de790b48>]
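The two curves above can also be drawn together with axis labels, as a minimal sketch using the same history object:

plt.plot(history.epoch, history.history.get('loss'), label='loss')
plt.plot(history.epoch, history.history.get('acc'), label='acc')
plt.xlabel('epoch')
plt.legend()
plt.show()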

Original article: https://www.cnblogs.com/Renyi-Fan/p/13352713.html