
Big Data Competition (3) - Model Selection II

2016-05-10 10:02

Overview of Common Models

A First Look at Neural Networks and Deep Learning

Basics

As usual, some recommended reading first:

Hands-on Introduction to Neural Networks, Part 1: A First Look at Neural Networks from the Perspective of Elementary Math

/article/2275669.html

An Overview of Deep Learning: From Perceptrons to Deep Networks

/article/7083980.html

Deep Learning Study Notes Series

http://blog.csdn.net/zouxy09

With deep learning as popular as it is today, most readers already have some familiarity with neural networks. Going beyond the usual textbook treatment, Han Xiaoyang's (寒小阳) blog explains the structure of a neural network from a more intuitive angle: "logistic regression of logistic regression of logistic regression of ..."
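That "stacked logistic regressions" view can be made concrete with a few lines of NumPy. This is a minimal sketch of my own (not from Han Xiaoyang's post): every layer is just an affine transform followed by a sigmoid, i.e. a multi-output logistic regression applied to the previous layer's output; the weights here are random and untrained.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)   # input dim 3 -> hidden dim 4
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden dim 4 -> output dim 1

def forward(x):
    h = sigmoid(x @ W1 + b1)      # first "logistic regression"
    return sigmoid(h @ W2 + b2)   # a logistic regression on top of it

x = rng.normal(size=(5, 3))       # 5 samples, 3 features
print(forward(x).shape)           # (5, 1)
```

Training such a stack end to end (backpropagation) is exactly what turns a pile of logistic regressions into a neural network.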

# Regressor, ActiveLayer and LinerLayer come from the author's own framework
# (not shown in this post); T is theano.tensor.
import theano.tensor as T


class NeuralNetworkForTianchi(Regressor):
    def __init__(self, nodes, active_function='sigmoid', lr=.1, decay=1., off=1e-5,
                 batch=128, regular='l2', penalty=.5):
        self.nodes, self.active_function = nodes, active_function
        Regressor.__init__(self, None, lr, decay, off, batch, regular, penalty)

    def _define_train_inputs(self):
        # x: features; a/b: per-sample penalty weights; y: targets
        return [T.matrix('x'), T.matrix('a'), T.matrix('b'), T.matrix('y')]

    def _predict_outputs(self, *inputs):
        y = inputs[0]
        # hidden layers: linear transform followed by the chosen activation
        for i in range(len(self.nodes) - 2):
            y = self.layers['active_layer'].outs(self.layers['liner_layer' + str(i)].outs(y))
        # output layer: linear transform followed by ReLU, so predictions are non-negative
        i = len(self.nodes) - 2
        y = self.layers['ReLU_layer'].outs(self.layers['liner_layer' + str(i)].outs(y))
        return y

    def _define_layers(self):
        layers = {'active_layer': ActiveLayer(self.active_function),
                  'ReLU_layer': ActiveLayer('ReLU')}
        for i in range(len(self.nodes) - 1):
            liner_layer = LinerLayer(self.nodes[i], self.nodes[i + 1])
            layers['liner_layer' + str(i)] = liner_layer
        return layers

    def _define_predict_inputs(self):
        return [T.matrix('x')]

    def _loss_function(self):
        """
        Asymmetric weighted L1 loss: under-prediction (y > prediction) is
        weighted by a, over-prediction by b, plus the regularization penalty.
        :return: loss expression
        """
        y_regular = self._predict_outputs(self.train_inputs[0])
        a, b, y = self.train_inputs[1:]
        penalty = self._regular()
        diff = y - y_regular
        return T.mean((diff > 0) * diff * a - (diff < 0) * diff * b + penalty)
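To see what that loss expression computes, here is the same asymmetric loss in plain NumPy (the names `a`, `b`, `y` follow the Theano code above; the sample numbers are made up, and the regularization term is omitted):

```python
import numpy as np

def asymmetric_loss(y_true, y_pred, a, b):
    """Weighted L1 loss: under-prediction (diff > 0) is weighted by a,
    over-prediction (diff < 0) by b, then everything is averaged."""
    diff = y_true - y_pred
    return np.mean((diff > 0) * diff * a - (diff < 0) * diff * b)

y_true = np.array([3.0, 1.0, 2.0])
y_pred = np.array([2.0, 2.0, 2.0])
a = np.array([2.0, 2.0, 2.0])   # penalty weight for predicting too low
b = np.array([1.0, 1.0, 1.0])   # penalty weight for predicting too high
print(asymmetric_loss(y_true, y_pred, a, b))  # (1*2 + 1*1 + 0) / 3 = 1.0
```

With a > b, as in this example, the model is pushed toward over-predicting rather than under-predicting, which is the point of carrying per-sample weights `a` and `b` through the training inputs.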