Great tutorials for learning NLP, AI, and Deep Learning
2016-01-18 14:16
316 views
1. Andrew Moore, Dean of Carnegie Mellon's School of Computer Science. His tutorial slides cover most of the core data mining topics:
Decision Trees
Information Gain
Probability for Data Miners
Probability Density Functions
Gaussians
Maximum Likelihood Estimation
Gaussian Bayes Classifiers
Cross-Validation
Neural Networks
Instance-based learning (aka Case-based or Memory-based or non-parametric)
Eight Regression Algorithms
Predicting Real-valued Outputs: An introduction to regression
Bayesian Networks
Inference in Bayesian Networks (by Scott Davies and Andrew Moore)
Learning Bayesian Networks
A Short Intro to Naive Bayesian Classifiers
Short Overview of Bayes Nets
Gaussian Mixture Models
K-means and Hierarchical Clustering
Hidden Markov Models
VC dimension
Support Vector Machines
PAC Learning
Markov Decision Processes
Reinforcement Learning
Biosurveillance: An example
Elementary probability and Naive Bayes classifiers
Spatial Surveillance
Time Series Methods
Game Tree Search Algorithms, including Alpha-Beta Search
Zero-Sum Game Theory
Non-zero-sum Game Theory
Introductory overview of time-series-based anomaly detection algorithms
AI Class introduction
Search Algorithms
A-star Heuristic Search
Constraint Satisfaction Algorithms, with applications in Computer Vision and Scheduling
Robot Motion Planning
Hill Climbing, Simulated Annealing and Genetic Algorithms
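As a taste of the topics in the list above, here is a minimal k-means sketch (see the "K-means and Hierarchical Clustering" entry) in plain Python. It assumes 2-D points and a fixed iteration budget; the data and function name are illustrative, not from Moore's slides:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: returns k centroids for a list of 2-D points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from random data points
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda i: (p[0] - centroids[i][0]) ** 2
                                            + (p[1] - centroids[i][1]) ** 2)
            clusters[i].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return centroids

# Two well-separated blobs: centroids should land near (1/3, 1/3) and (31/3, 31/3).
data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(sorted(kmeans(data, 2)))
```

The real lecture covers initialization and convergence in more depth; this sketch only shows the two alternating steps.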
2. In March, Stanford launched a course on deep learning and natural language processing: CS224d: Deep Learning for Natural Language Processing, taught by the young and talented Richard Socher. Socher is German; he worked on natural language processing as an undergraduate, specialized in computer vision during graduate studies in Germany, and then pursued a PhD at Stanford under Chris Manning, a leading figure in NLP, and Andrew Ng, a leading figure in deep learning. His dissertation, "Recursive Deep Learning for Natural Language Processing and Computer Vision", is a fitting capstone to those years of study. After graduating he co-founded MetaMind as CTO; as a rising AI startup, MetaMind raised $8 million in venture funding at launch and is worth watching.
Back to the course itself: CS224d is an on-campus course for Stanford students, but all of the materials are posted online, including lecture videos, slides, background reading, prerequisites, and assignments. The syllabus is well organized and goes deep, starting from the basics and building up to concrete applications of deep learning in NLP: named entity recognition, machine translation, parsing, sentiment analysis, and more. Richard Socher previously gave a tutorial at ACL 2012 and NAACL 2013, "Deep Learning for NLP (without Magic)"; interested readers may want to start there (Deep Learning for NLP (without Magic) – ACL 2012 Tutorial – videos and slides). Since the course videos are hosted on YouTube, @爱可可-爱生活 maintains a mirror at http://pan.baidu.com/s/1pJyrXaF that is kept in sync with the course materials.
Course homepage: http://cs224d.stanford.edu/syllabus.html
3. Andrew Ng's Machine Learning on Coursera (https://www.coursera.org/learn/machine-learning), as well as Geoffrey Hinton's Neural Networks for Machine Learning (https://www.coursera.org/course/neuralnets).
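The canonical first exercise in Ng's course is fitting a line by batch gradient descent. A self-contained sketch in plain Python (the toy data and function name are illustrative, not taken from the course):

```python
def fit_line(xs, ys, lr=0.01, steps=5000):
    """Fit y = w*x + b by batch gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b

# Points on the line y = 3x + 1; gradient descent should recover w ≈ 3, b ≈ 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 4.0, 7.0, 10.0]
w, b = fit_line(xs, ys)
print(round(w, 3), round(b, 3))
```

The course then generalizes the same update rule to multiple features, logistic regression, and neural networks.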
The CS224d schedule (from the course syllabus page):

| Event | Date | Description | Course Materials |
| --- | --- | --- | --- |
| Lecture | Mar 30 | Intro to NLP and Deep Learning | Suggested Readings: [Linear Algebra Review] [Probability Review] [Convex Optimization Review] [More Optimization (SGD) Review] [From Frequency to Meaning: Vector Space Models of Semantics] [Lecture Notes 1] [python tutorial] [slides] [video] |
| Lecture | Apr 1 | Simple Word Vector representations: word2vec, GloVe | Suggested Readings: [Distributed Representations of Words and Phrases and their Compositionality] [Efficient Estimation of Word Representations in Vector Space] [slides] [video] |
| Lecture | Apr 6 | Advanced word vector representations: language models, softmax, single layer networks | Suggested Readings: [GloVe: Global Vectors for Word Representation] [Improving Word Representations via Global Context and Multiple Word Prototypes] [Lecture Notes 2] [slides] [video] |
| Lecture | Apr 8 | Neural Networks and backpropagation -- for named entity recognition | Suggested Readings: [UFLDL tutorial] [Learning Representations by Backpropagating Errors] [slides] [video] |
| Lecture | Apr 13 | Project Advice, Neural Networks and Back-Prop (in full gory detail) | Suggested Readings: [Natural Language Processing (almost) from Scratch] [A Neural Network for Factoid Question Answering over Paragraphs] [Grounded Compositional Semantics for Finding and Describing Images with Sentences] [Deep Visual-Semantic Alignments for Generating Image Descriptions] [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank] [(NEW) Lecture Notes 3] [slides] [video] |
| Lecture | Apr 15 | Practical tips: gradient checks, overfitting, regularization, activation functions, details | Suggested Readings: [Practical recommendations for gradient-based training of deep architectures] [UFLDL page on gradient checking] [slides] [video] |
| A1 Due | Apr 16 | Assignment #1 due | [Pset 1] |
| Lecture | Apr 20 | Recurrent neural networks -- for language modeling and other tasks | Suggested Readings: [Recurrent neural network based language model] [Extensions of recurrent neural network language model] [Opinion Mining with Deep Recurrent Neural Networks] [(NEW) Lecture Notes 4] [slides] [video] [minimal net example (karpathy)] [vanishing grad example] [vanishing grad notebook] |
| Proposal due | Apr 21 | Course Project Proposal due | [proposal description] |
| Lecture | Apr 22 | GRUs and LSTMs -- for machine translation | Suggested Readings: [Long Short-Term Memory] [Gated Feedback Recurrent Neural Networks] [Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling] [slides] [video] |
| Lecture | Apr 27 | Recursive neural networks -- for parsing | Suggested Readings: [Parsing with Compositional Vector Grammars] [Subgradient Methods for Structured Prediction] [Parsing Natural Scenes and Natural Language with Recursive Neural Networks] [slides] [video] |
| Lecture | Apr 29 | Recursive neural networks -- for different tasks (e.g. sentiment analysis) | Suggested Readings: [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank] [Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection] [Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks] [slides] [video] |
| A2 Due | Apr 30 | Pset #2 due | [Pset #2] |
| Lecture | May 4 | Review Session for Midterm | Suggested Readings: N/A [slides] [video - see Piazza] |
| Midterm | May 6 | In-class midterm | |
| Lecture | May 11 | Guest Lecture with Jason Weston from Facebook: Neural Models with Memory -- for question answering | Suggested Readings: [Memory Networks] [Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks] [slides] [video] |
| Milestone | May 13 | Course Project Milestone | [milestone description] |
| Lecture | May 13 | Convolutional neural networks -- for sentence classification | Suggested Readings: [A Convolutional Neural Network for Modelling Sentences] [slides] [video] |
| Lecture | May 18 | Guest Lecture with Andrew Maas: Speech recognition | Suggested Readings: [Deep Neural Networks for Acoustic Modeling in Speech Recognition] [slides] [video] |
| Lecture | May 20 | Guest Lecture with Elliot English: Efficient implementations and GPUs | Suggested Readings: [slides] [video] |
| A3 Due | May 21 | Pset #3 due | [Pset #3] |
| Lecture | May 27 | Applications of Deep Learning to Natural Language Processing | Suggested Readings: [slides] [video] |
| Lecture | Jun 1 | The future of Deep Learning for NLP: Dynamic Memory Networks | Suggested Readings: [Ask Me Anything: Dynamic Memory Networks for NLP] [slides] [no video] |
| Poster Presentation | Jun 3 | Final project poster presentations: 2-5 pm, Gates patio | |
| Final Project Due | Jun 8 | Final course project due date | [project description] |
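The first few lectures in the schedule above revolve around word vectors (word2vec, GloVe); the basic operation on trained vectors is cosine similarity. A minimal sketch in plain Python — the 3-d vectors and vocabulary here are made up for illustration, not real embeddings (word2vec/GloVe vectors typically have 50-300 dimensions):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Toy "embeddings": similar words get similar directions by construction.
vecs = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

# Nearest neighbor of "king" among the other words, by cosine similarity.
best = max((w for w in vecs if w != "king"),
           key=lambda w: cosine(vecs["king"], vecs[w]))
print(best)
```

With real embeddings the same nearest-neighbor query is what surfaces the famous word2vec analogies covered in the Apr 1 and Apr 6 lectures.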