
[Python-ML] Neural Network Activation Functions: Softmax

2018-01-27 09:07
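For reference, softmax maps a vector of net inputs $z$ for $k$ output classes to a probability distribution, which is what the code below computes:

```latex
\mathrm{softmax}(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{k} e^{z_j}}, \qquad i = 1, \dots, k
```

Because the exponentials are normalized by their sum, the outputs are positive and sum to 1, so each one can be read as a class probability.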
# -*- coding: utf-8 -*-
'''
Created on 2018-01-27
@author: Jason.F
@summary: Feed-forward neural network activation function - softmax,
          which estimates class probabilities in multi-class classification
'''
import numpy as np
import time

def net_input(X, w):
    z = X.dot(w)
    return z

def softmax(z):
    return np.exp(z) / np.sum(np.exp(z))

def softmax_activation(X, w):
    z = net_input(X, w)
    return softmax(z)

if __name__ == "__main__":
    # time.clock() was deprecated in Python 3.3 and removed in 3.8;
    # time.perf_counter() is the portable replacement.
    start = time.perf_counter()

    # W: array, shape=[n_output_units, n_hidden_units+1], weight matrix
    # for the hidden layer --> output layer.
    # Note that the first column holds the bias weights.
    W = np.array([[1.1, 1.2, 1.3, 0.5],
                  [0.1, 0.2, 0.4, 0.1],
                  [0.2, 0.5, 2.1, 1.9]])
    # A: array, shape=[n_hidden+1, n_samples], activation of the hidden layer.
    # Note that the first element (A[0][0]=1) is the bias unit.
    A = np.array([[1.0], [0.1], [0.3], [0.7]])
    # Z: array, shape=[n_output_units, n_samples], net input of the output layer.
    Z = W.dot(A)
    y_probas = softmax(Z)
    print('Probabilities:\n', y_probas)
    print(y_probas.sum())
    # Softmax is monotonic, so argmax over Z gives the same
    # class label as argmax over y_probas.
    y_class = np.argmax(Z, axis=0)
    print('predicted class label:%d' % y_class[0])

    end = time.perf_counter()
    print('finish all in %s' % str(end - start))


Result:

Probabilities:
 [[0.40386493]
 [0.07756222]
 [0.51857284]]
1.0
predicted class label:2
finish all in 0.00170994801643
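One caveat the example above glosses over: the naive `np.exp(z) / np.sum(np.exp(z))` overflows for large net inputs (`np.exp(1000)` is `inf`), and it normalizes over the whole matrix rather than per sample when `Z` holds more than one column. A minimal sketch of the standard fix, using a hypothetical helper name `softmax_stable` (not from the original post), subtracts the per-sample maximum before exponentiating; softmax is invariant to that shift:

```python
import numpy as np

def softmax_stable(z, axis=0):
    # Subtract the max along the class axis before exponentiating;
    # the result is unchanged, but overflow is avoided.
    z_shift = z - np.max(z, axis=axis, keepdims=True)
    e = np.exp(z_shift)
    return e / np.sum(e, axis=axis, keepdims=True)

# Same single-sample net input as above: Z = W.dot(A)
Z = np.array([[1.96], [0.31], [2.21]])
print(softmax_stable(Z))

# A batch with one column per sample, including net inputs large
# enough to overflow the naive formula:
Z_batch = np.array([[1000.0, 2.0],
                    [1001.0, 1.0],
                    [ 999.0, 0.0]])
probs = softmax_stable(Z_batch, axis=0)
print(probs.sum(axis=0))  # each column sums to 1
```

With `keepdims=True` the max and sum broadcast back over the class axis, so the same function handles a single column or a whole batch.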