Machine Learning Foundations Notes: Lecture 3 - Types of Learning
2015-12-19 20:46
Learning with Different Output Space Y

- Binary classification: a core and important problem, with many tools that serve as building blocks of other tools.
- Multiclass classification: many applications in practice, especially for 'recognition'.
- Regression: also core and important, with many 'statistical' tools as building blocks of other tools.
- Structured learning: a fancy but complicated learning problem; it can be viewed as large-scale multiclass classification, but without explicitly defined classes.
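A minimal sketch (my own toy illustration, not from the lecture) contrasting what the output space Y looks like in each of the four problem types; the label lists are hypothetical:

```python
binary_y = [-1, +1, +1, -1]           # binary classification: Y = {-1, +1}
multiclass_y = [0, 2, 1, 3]           # multiclass: Y = {0, 1, ..., K-1}
regression_y = [3.2, -0.5, 7.1, 0.0]  # regression: Y = R
structured_y = [["N", "V", "N"],      # structured learning: Y = structures,
                ["N", "V"]]           # e.g. part-of-speech tag sequences

def output_type(y):
    """Crudely classify a label list by which output space it resembles."""
    if all(isinstance(v, list) for v in y):
        return "structured"
    if all(v in (-1, +1) for v in y):
        return "binary"
    if all(isinstance(v, int) for v in y):
        return "multiclass"
    return "regression"

print(output_type(binary_y))      # binary
print(output_type(regression_y))  # regression
```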
Learning with Different Data Label yn

- Supervised: every xn comes with a corresponding yn.
- Unsupervised: learning without yn; unsupervised multiclass classification ⟺ 'clustering'.
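To make "unsupervised multiclass classification ⟺ clustering" concrete, here is a minimal 1-D k-means sketch (my own pure-Python toy, not from the lecture): the points carry no yn, yet the algorithm still groups them into classes.

```python
def kmeans_1d(xs, k, iters=20):
    # Initialize centers with the first k distinct points (a naive choice).
    centers = sorted(set(xs))[:k]
    for _ in range(iters):
        # Assignment step: each point goes to its nearest center.
        clusters = [[] for _ in range(k)]
        for x in xs:
            nearest = min(range(k), key=lambda i: abs(x - centers[i]))
            clusters[nearest].append(x)
        # Update step: each center moves to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers, clusters

xs = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
centers, clusters = kmeans_1d(xs, k=2)
print(sorted(round(c, 1) for c in centers))  # → [1.0, 10.0]
```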
- Semi-supervised: leverage unlabeled data to avoid 'expensive' labeling; labeling is costly, or there simply are not that many labels to begin with.
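One common way to leverage unlabeled data is self-training (pseudo-labeling); this is a general strategy I am sketching here, not the lecture's specific method. A 1-D nearest-centroid classifier is fit on the few hand-labeled points, then confident predictions on the unlabeled pool are promoted to pseudo-labels; the centroids, margin, and data are all hypothetical:

```python
labeled = [(0.5, -1), (1.0, -1), (9.0, +1)]  # expensive hand labels
unlabeled = [0.7, 1.3, 8.5, 9.5, 5.0]        # cheap unlabeled data

def centroids(pairs):
    pos = [x for x, y in pairs if y == +1]
    neg = [x for x, y in pairs if y == -1]
    return sum(pos) / len(pos), sum(neg) / len(neg)

def self_train(labeled, unlabeled, margin=2.0):
    data = list(labeled)
    for x in unlabeled:
        c_pos, c_neg = centroids(data)
        # Only pseudo-label points far from the decision boundary.
        if abs(abs(x - c_pos) - abs(x - c_neg)) >= margin:
            y = +1 if abs(x - c_pos) < abs(x - c_neg) else -1
            data.append((x, y))
    return data

# The labeled set grows without any new hand labels; the ambiguous
# point 5.0 sits near the boundary and is left unlabeled.
print(len(self_train(labeled, unlabeled)))  # → 7
```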
- Reinforcement learning: a 'very different' but natural way of learning; reinforcement means learning with 'partial/implicit information' (often sequentially). Training the machine this way is like training a dog, which is quite fun.
  (I still know too little about reinforcement learning; how exactly is the feedback given?)
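A tiny epsilon-greedy bandit loop (my own minimal sketch, not from the lecture) shows the 'partial/implicit information' flavor: the learner only observes the reward of the action it actually took, never the label of the best action; the reward probabilities are made up.

```python
import random

random.seed(0)
true_reward = [0.2, 0.8, 0.5]  # hypothetical expected reward per action
counts = [0, 0, 0]
estimates = [0.0, 0.0, 0.0]

for t in range(2000):
    # Explore with probability 0.1, otherwise exploit the best estimate.
    if random.random() < 0.1:
        a = random.randrange(3)
    else:
        a = max(range(3), key=lambda i: estimates[i])
    # Partial feedback: a noisy reward for the chosen action only.
    reward = 1.0 if random.random() < true_reward[a] else 0.0
    counts[a] += 1
    estimates[a] += (reward - estimates[a]) / counts[a]  # running mean

# After enough rounds the estimates typically single out the best action.
print(estimates)
```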
Learning with Different Protocol f ⇒ (xn, yn)

- Batch learning: a very common protocol; learn from all the known data at once.
- Online learning: needs no data at the very start; the hypothesis improves sequentially as examples arrive one by one.
- Active learning: learning by 'asking', much like raising questions to the teacher during self-study in high school; improve the hypothesis with fewer labels (hopefully) by asking questions strategically.
A photographer has 100,000 pictures, each containing one baseball player. He wants to automatically categorize the pictures by the player inside. He starts by categorizing 1,000 pictures by himself, and then writes an algorithm that tries to categorize the other pictures if it is 'confident' about the category, while pausing for (and learning from) human input if not. What protocol best describes the nature of the algorithm?
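The photographer's protocol can be sketched as an uncertainty-based active-learning loop: auto-label when confident, query the human otherwise. The nearest-centroid model, the 1-D features, and the threshold below are my own illustrative assumptions, not part of the quiz:

```python
def classify_with_confidence(x, centers):
    """Return (best_label, confidence) for a 1-D feature x, given one
    feature centroid per player. Confidence is the gap between the two
    nearest centroids."""
    dists = sorted((abs(x - c), label) for label, c in centers.items())
    (d0, best), (d1, _) = dists[0], dists[1]
    return best, d1 - d0

centers = {"player_a": 1.0, "player_b": 5.0, "player_c": 9.0}
pictures = [0.9, 2.8, 5.2, 7.1, 9.3]  # hypothetical 1-D picture features

for x in pictures:
    label, conf = classify_with_confidence(x, centers)
    if conf >= 1.5:
        print(f"auto-label {x} as {label}")
    else:
        # Strategic query: ask only about the ambiguous pictures,
        # so fewer human labels are needed overall.
        print(f"ask the human about {x}")
```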
Learning with Different Input Space X

The more abstract a feature is, the harder it is for a person to understand, and likewise the harder it is for a machine to learn from.

- Concrete features: each dimension of X ⊆ R^d carries 'sophisticated physical meaning'; these are the 'easy' ones for ML.
- Raw features: image pixels, speech signals, etc.; they often need humans or machines to convert them into concrete ones.
- Abstract features: again need 'feature conversion/extraction/construction'.
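A small sketch of turning raw features into concrete ones: average intensity and left-right asymmetry are classic hand-crafted features for digit images; the tiny 4x4 'image' here is a made-up example.

```python
def concrete_features(pixels):
    """pixels: list of rows of grayscale values in [0, 1].
    Returns (intensity, asymmetry); lower asymmetry = more symmetric."""
    flat = [v for row in pixels for v in row]
    intensity = sum(flat) / len(flat)        # average darkness
    asymmetry = sum(abs(v - row[-1 - j])     # gap to the mirrored pixel
                    for row in pixels
                    for j, v in enumerate(row)) / len(flat)
    return intensity, asymmetry

image = [[0.0, 1.0, 1.0, 0.0],   # a raw 4x4 pixel grid: a dark
         [0.0, 1.0, 1.0, 0.0],   # vertical bar in the middle
         [0.0, 1.0, 1.0, 0.0],
         [0.0, 1.0, 1.0, 0.0]]
print(concrete_features(image))  # → (0.5, 0.0): half-dark, perfectly symmetric
```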