
Machine Learning Week One

2016-11-29 20:57
Keep updating

This article summarizes material from the course at

https://www.coursera.org/learn/machine-learning/

Supervised learning

In supervised learning, we are given a data set and already know what our correct output should look like, having the idea that there is a relationship between the input and the output.

Supervised learning problems are categorized into “regression” and “classification” problems.

regression problem

map input variables to some continuous function

classification problem

map input variables into discrete categories

Unsupervised learning

Unsupervised learning allows us to approach problems with little or no idea what our results should look like. We can derive structure from data where we don’t necessarily know the effect of the variables.

Notation

x(i) denotes the i-th input variable

y(i) denotes the i-th output variable

A pair (x(i), y(i)) denotes a training example

m denotes the number of training examples

X denotes the space of input values

Y denotes the space of output values

h(x) denotes a function (the predictor, or hypothesis) that maps x to y

Our goal is to find the θ0 and θ1 that solve

min(θ0,θ1) (1/2m) ∑_{i=1}^{m} (hθ(x(i)) − y(i))²
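As a rough illustration (a minimal Python sketch, not code from the course; all names are illustrative), the cost being minimized can be computed like this:

```python
# Minimal sketch: squared-error cost J(theta0, theta1) for
# univariate linear regression. All names are illustrative.

def hypothesis(theta0, theta1, x):
    # h_theta(x) = theta0 + theta1 * x
    return theta0 + theta1 * x

def cost(theta0, theta1, xs, ys):
    # J = (1 / 2m) * sum over the m training examples
    # of (h_theta(x_i) - y_i)^2
    m = len(xs)
    return sum((hypothesis(theta0, theta1, x) - y) ** 2
               for x, y in zip(xs, ys)) / (2 * m)
```

A perfect fit drives the cost to zero; for example, cost(0, 1, [1, 2, 3], [1, 2, 3]) is 0 because h(x) = x matches every example exactly.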

Gradient Descent Algorithm

θj := θj − α ∂/∂θj J(θ0,θ1)

where α is the learning rate, which controls the size of each step,

and ∂/∂θj denotes the partial derivative with respect to θj

We should update θ0 and θ1 simultaneously at each iteration: compute both new values from the old ones before assigning either.

Typically h(x) takes the form hθ(x1,...,xn) = θ0 + θ1x1 + ... + θnxn
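Sketched in Python (names are illustrative, not from the course), this general linear hypothesis is just θ0 plus a dot product of the remaining parameters with the features:

```python
# Minimal sketch of the linear hypothesis
# h_theta(x1, ..., xn) = theta0 + theta1*x1 + ... + thetan*xn.
# theta holds the n + 1 parameters; x holds the n feature values.

def hypothesis(theta, x):
    return theta[0] + sum(t * xi for t, xi in zip(theta[1:], x))
```

For instance, hypothesis([1, 2, 3], [4, 5]) evaluates 1 + 2·4 + 3·5 = 24.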

Batch Gradient Descent

This method looks at every example in the entire training set at each step.
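Putting the pieces together, here is a minimal sketch (plain Python; the names and the fixed iteration count are my own assumptions, not the course's) of batch gradient descent for the univariate hypothesis hθ(x) = θ0 + θ1x, including the simultaneous update:

```python
# Sketch of batch gradient descent for univariate linear regression.
# alpha is the learning rate; the fixed iteration count is an
# illustrative stopping rule.

def gradient_descent(xs, ys, alpha=0.01, iterations=1000):
    m = len(xs)
    theta0, theta1 = 0.0, 0.0
    for _ in range(iterations):
        # "Batch": each partial derivative sums over the entire
        # training set on every step.
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        grad0 = sum(errors) / m
        grad1 = sum(e * x for e, x in zip(errors, xs)) / m
        # Simultaneous update: both gradients come from the old
        # (theta0, theta1) before either parameter is overwritten.
        theta0, theta1 = theta0 - alpha * grad0, theta1 - alpha * grad1
    return theta0, theta1
```

On data generated by y = 2x, the parameters converge toward θ0 ≈ 0 and θ1 ≈ 2, and the learning rate α trades off step size against the risk of overshooting the minimum.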