Machine Learning Week One
2016-11-29 20:57
This post is a work in progress; it summarizes material from the Coursera course
https://www.coursera.org/learn/machine-learning/
Supervised learning
In supervised learning, we are given a data set and already know what our correct output should look like, having the idea that there is a relationship between the input and the output.
Supervised learning problems are categorized into "regression" and "classification" problems.
Regression problem: map input variables to some continuous function.
Classification problem: map input variables into discrete categories.
Unsupervised learning
Unsupervised learning allows us to approach problems with little or no idea what our results should look like. We can derive structure from data where we don’t necessarily know the effect of the variables.
Notation
x(i) denotes the i-th input
y(i) denotes the i-th output
A pair (x(i), y(i)) denotes a training example
m denotes the number of training examples
X denotes the space of input values
Y denotes the space of output values
h(x) denotes a function (the predictor, or hypothesis) that maps x to y
Our goal is to find the θ0 and θ1 that minimize the squared-error cost:
min_{θ0, θ1} J(θ0, θ1), where J(θ0, θ1) = (1/(2m)) ∑_{i=1}^{m} (hθ(x(i)) − y(i))²
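The cost J above is straightforward to evaluate; here is a minimal NumPy sketch (the course's own exercises use Octave/MATLAB, so this Python translation and the toy data are my own):

```python
import numpy as np

def compute_cost(theta0, theta1, x, y):
    """Squared-error cost J(theta0, theta1) for univariate linear regression."""
    m = len(y)                            # number of training examples
    predictions = theta0 + theta1 * x     # h_theta(x^(i)) for every example
    return np.sum((predictions - y) ** 2) / (2 * m)

# Toy data lying exactly on y = 1 + 2x, so the cost at (theta0, theta1) = (1, 2) is zero.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x
print(compute_cost(1.0, 2.0, x, y))   # 0.0
print(compute_cost(0.0, 0.0, x, y))   # nonzero: these parameters fit the data poorly
```

Note the division by 2m rather than m: the factor of 1/2 does not change the minimizer, but it cancels the 2 that appears when differentiating the square, which keeps the gradient formulas tidy.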
Gradient Descent Algorithm
θj := θj − α · ∂/∂θj J(θ0, θ1)
where α is the learning rate, which controls the size of each step,
and ∂/∂θj denotes the partial derivative with respect to θj.
We must update θ0 and θ1 simultaneously at each iteration: compute both new values from the old values before assigning either.
More generally, with n features, h(x) can take the form hθ(x1, ..., xn) = θ0 + θ1x1 + ... + θnxn.
Batch Gradient Descent
Each step of "batch" gradient descent uses every example in the entire training set.
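Putting the pieces together, the following NumPy sketch (again my own translation; the course uses Octave/MATLAB, and the data and learning rate here are illustrative choices) runs batch gradient descent on the univariate hypothesis, with the simultaneous update done via tuple assignment:

```python
import numpy as np

def gradient_descent(x, y, alpha=0.1, iters=1000):
    """Batch gradient descent for h(x) = theta0 + theta1 * x.

    Each iteration sums over all m training examples (hence "batch"),
    and both parameters are updated simultaneously from the same old values.
    """
    m = len(y)
    theta0, theta1 = 0.0, 0.0
    for _ in range(iters):
        error = (theta0 + theta1 * x) - y     # h_theta(x^(i)) - y^(i), for all i
        grad0 = np.sum(error) / m             # dJ/dtheta0
        grad1 = np.sum(error * x) / m         # dJ/dtheta1
        # Tuple assignment evaluates both right-hand sides before assigning,
        # giving the required simultaneous update.
        theta0, theta1 = theta0 - alpha * grad0, theta1 - alpha * grad1
    return theta0, theta1

# Data generated from y = 1 + 2x; gradient descent should recover
# theta0 close to 1 and theta1 close to 2.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x
t0, t1 = gradient_descent(x, y)
print(t0, t1)
```

If α is too large the updates can overshoot and diverge; if it is too small, convergence is slow. Trying a few values (e.g. 0.01, 0.1, 0.3) on this toy data makes the trade-off easy to see.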