
Gradient Descent and Stochastic Gradient Descent in Scala

2016-07-04 20:33
Gradient descent and stochastic gradient descent are among the most commonly used algorithms in machine learning. Their underlying theory is not covered here, since it is easy to find online; see, for example, this post: http://blog.csdn.net/woxincd/article/details/7040944
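For reference, the update rule that the code below implements is the least-mean-squares form for linear regression (the conventional 1/2 factor in the cost is absorbed into the learning rate α):

```latex
J(\theta) = \frac{1}{2}\sum_{i=1}^{m}\left(y^{(i)} - \theta^{T}x^{(i)}\right)^{2}

\theta_j := \theta_j + \alpha\sum_{i=1}^{m}\left(y^{(i)} - \theta^{T}x^{(i)}\right)x_j^{(i)}
\quad\text{(batch: one update per pass over all samples)}

\theta_j := \theta_j + \alpha\left(y^{(i)} - \theta^{T}x^{(i)}\right)x_j^{(i)}
\quad\text{(stochastic: one update per sample } i\text{)}
```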

A Scala implementation follows:

object SGD {

  /** Batch gradient descent for linear regression.
    *
    * @param X         input features, one row per sample
    * @param y         target values
    * @param learnRate learning rate (step size)
    * @param iterNum   maximum number of iterations
    * @param thres     stop once the squared-error loss falls below this threshold
    * @return the fitted parameter vector theta
    */
  def gradientDescent(X: Array[Array[Int]], y: Array[Int],
                      learnRate: Double = 0.001, iterNum: Int = 1000,
                      thres: Double = 0.0001): Array[Double] = {
    val theta = new Array[Double](X(0).length)
    var loss = 10000.0
    for (i <- 0 until iterNum if loss > thres) {
      // Accumulate the gradient over all samples, then update theta once per pass.
      val gradient = new Array[Double](theta.length)
      for (row <- X.indices) {
        var rowSum = 0.0
        for (col <- X(0).indices)
          rowSum += X(row)(col) * theta(col)
        val error = y(row) - rowSum
        for (col <- X(0).indices)
          gradient(col) += error * X(row)(col)
      }
      for (col <- theta.indices)
        theta(col) += learnRate * gradient(col)
      // Recompute the squared-error loss over the whole data set.
      loss = 0.0
      for (row <- X.indices) {
        var rowSum = 0.0
        for (col <- X(0).indices)
          rowSum += X(row)(col) * theta(col)
        loss += (rowSum - y(row)) * (rowSum - y(row))
      }
    }
    theta
  }

  /** Stochastic gradient descent: update theta from a single sample per iteration. */
  def stochasticGradientDescent(X: Array[Array[Int]], y: Array[Int],
                                learnRate: Double = 0.001, iterNum: Int = 1000,
                                thres: Double = 0.0001): Array[Double] = {
    val theta = new Array[Double](X(0).length)
    var loss = 10000.0
    for (i <- 0 until iterNum if loss > thres) {
      // Cycle through the samples; each iteration uses one row only.
      val row = i % X.length
      var rowSum = 0.0
      for (col <- X(0).indices)
        rowSum += X(row)(col) * theta(col)
      val error = y(row) - rowSum
      for (col <- X(0).indices)
        theta(col) += learnRate * error * X(row)(col)
      // Recompute the squared-error loss over the whole data set.
      loss = 0.0
      for (r <- X.indices) {
        var pred = 0.0
        for (col <- X(0).indices)
          pred += X(r)(col) * theta(col)
        loss += (pred - y(r)) * (pred - y(r))
      }
    }
    theta
  }
}
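As a quick sanity check, the single-sample update at the heart of stochasticGradientDescent can be exercised on a one-feature toy problem. The data set and learning rate below are illustrative choices, not from the functions above; with targets generated by y = 2x, the fitted parameter should converge to 2:

```scala
// Toy data set: y = 2 * x, so theta should converge to 2.0.
val X: Array[Array[Int]] = Array(Array(1), Array(2), Array(3), Array(4))
val y: Array[Int] = Array(2, 4, 6, 8)

var theta = 0.0
for (i <- 0 until 1000) {
  val row = i % X.length                // cycle through the samples
  val error = y(row) - X(row)(0) * theta
  theta += 0.01 * error * X(row)(0)     // single-sample LMS update
}
println(f"theta = $theta%.4f")
```

With this learning rate (0.01) each step contracts the error, since 0.01 * x * x < 2 for every sample; a larger step size could make the updates diverge instead.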