Entropy: $ H(X) = -\sum_{x} p(x)\log p(x) $
Information gain: $ I(X,Y) = H(X) - H(X|Y) $
$ \pi $ = 3.1415926...
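As a quick worked illustration (added here, assuming a fair coin with two equally likely outcomes and logarithms taken in base 2), the entropy formula above evaluates to one bit:

$ H(X) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit} $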
References:
Cnblogs official documentation: https://www.cnblogs.com/cmt/p/3279312.html
Online LaTeX editor: https://www.codecogs.com/latex/eqneditor.php?lang=zh-cn