
Awesome Tutorials for Learning NLP, AI, and Deep Learning

2016-01-18 14:16
1. Andrew Moore, Dean of the School of Computer Science at Carnegie Mellon. His tutorials cover most of the major data-mining topics:

Decision Trees
Information Gain
Probability for Data Miners
Probability Density Functions
Gaussians
Maximum Likelihood Estimation
Gaussian Bayes Classifiers
Cross-Validation
Neural Networks
Instance-based learning (aka Case-based or Memory-based or non-parametric)
Eight Regression Algorithms
Predicting Real-valued Outputs: An introduction to regression
Bayesian Networks
Inference in Bayesian Networks (by Scott Davies and Andrew Moore)
Learning Bayesian Networks
A Short Intro to Naive Bayesian Classifiers
Short Overview of Bayes Nets
Gaussian Mixture Models
K-means and Hierarchical Clustering
Hidden Markov Models
VC dimension
Support Vector Machines
PAC Learning
Markov Decision Processes
Reinforcement Learning
Biosurveillance: An example
Elementary probability and Naive Bayes classifiers
Spatial Surveillance
Time Series Methods
Game Tree Search Algorithms, including Alpha-Beta Search
Zero-Sum Game Theory
Non-zero-sum Game Theory
Introductory overview of time-series-based anomaly detection algorithms
AI Class introduction
Search Algorithms
A-star Heuristic Search
Constraint Satisfaction Algorithms, with applications in Computer Vision and Scheduling
Robot Motion Planning
Hill Climbing, Simulated Annealing and Genetic Algorithms
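As a concrete taste of one topic from the list above, here is a minimal sketch of information gain, the decision-tree splitting criterion covered in the Decision Trees and Information Gain tutorials. The toy weather dataset and attribute values are invented for illustration, not taken from the tutorials themselves.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, attribute_index):
    """IG(Y; A) = H(Y) - sum_v p(A=v) * H(Y | A=v)."""
    n = len(labels)
    by_value = {}
    for x, y in zip(examples, labels):
        by_value.setdefault(x[attribute_index], []).append(y)
    remainder = sum(len(ys) / n * entropy(ys) for ys in by_value.values())
    return entropy(labels) - remainder

# Toy data: attribute 0 perfectly predicts the label, attribute 1 is noise.
X = [("sunny", "hot"), ("sunny", "cold"), ("rainy", "hot"), ("rainy", "cold")]
y = ["play", "play", "stay", "stay"]
print(information_gain(X, y, 0))  # 1.0 bit: a perfect split
print(information_gain(X, y, 1))  # 0.0 bits: no information
```

A decision-tree learner simply picks the attribute with the highest gain at each node and recurses on the resulting subsets.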

2. Stanford CS224d: Deep Learning for Natural Language Processing

In March, Stanford launched a course on deep learning and natural language processing, CS224d: Deep Learning for Natural Language Processing, taught by the young and talented Richard Socher. Socher, a German native, worked on natural language processing as an undergraduate and specialized in computer vision during his graduate studies in Germany, then pursued a PhD at Stanford under Chris Manning, a leading figure in NLP, and Andrew Ng, a leading figure in deep learning. His doctoral thesis, "Recursive Deep Learning for Natural Language Processing and Computer Vision," is a fitting capstone to those years of study. After graduating he founded MetaMind as co-founder and CTO; as a rising AI startup, MetaMind raised $8 million in venture funding at its founding and is worth watching.

Returning to CS224d itself, "Deep Learning for Natural Language Processing": it is an on-campus Stanford course, but all the course materials are posted online, including lecture videos, slides, background material, prerequisites, and assignments. The syllabus is well structured and deep, starting from fundamentals and moving on to concrete NLP applications of deep learning, including named entity recognition, machine translation, parsing, and sentiment analysis. Richard Socher previously gave a tutorial, "Deep Learning for NLP (without Magic)," at ACL 2012 and NAACL 2013; interested readers may want to look at that first: Deep Learning for NLP (without Magic) – ACL 2012 Tutorial – videos and slides. Since the lecture videos are hosted on YouTube, @爱可可-爱生活 maintains a mirror at http://pan.baidu.com/s/1pJyrXaF with the materials kept in sync.
Course page: http://cs224d.stanford.edu/syllabus.html

Event / Date / Description / Course Materials

Lecture (Mar 30): Intro to NLP and Deep Learning
  Suggested Readings: [Linear Algebra Review] [Probability Review] [Convex Optimization Review] [More Optimization (SGD) Review] [From Frequency to Meaning: Vector Space Models of Semantics] [Lecture Notes 1]
  Materials: [python tutorial] [slides] [video]

Lecture (Apr 1): Simple word vector representations: word2vec, GloVe
  Suggested Readings: [Distributed Representations of Words and Phrases and their Compositionality] [Efficient Estimation of Word Representations in Vector Space]
  Materials: [slides] [video]

Lecture (Apr 6): Advanced word vector representations: language models, softmax, single layer networks
  Suggested Readings: [GloVe: Global Vectors for Word Representation] [Improving Word Representations via Global Context and Multiple Word Prototypes] [Lecture Notes 2]
  Materials: [slides] [video]

Lecture (Apr 8): Neural networks and backpropagation -- for named entity recognition
  Suggested Readings: [UFLDL tutorial] [Learning Representations by Back-propagating Errors]
  Materials: [slides] [video]

Lecture (Apr 13): Project advice; neural networks and back-prop (in full gory detail)
  Suggested Readings: [Natural Language Processing (almost) from Scratch] [A Neural Network for Factoid Question Answering over Paragraphs] [Grounded Compositional Semantics for Finding and Describing Images with Sentences] [Deep Visual-Semantic Alignments for Generating Image Descriptions] [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank] [Lecture Notes 3]
  Materials: [slides] [video]

Lecture (Apr 15): Practical tips: gradient checks, overfitting, regularization, activation functions, details
  Suggested Readings: [Practical recommendations for gradient-based training of deep architectures] [UFLDL page on gradient checking]
  Materials: [slides] [video]

A1 Due (Apr 16): Assignment #1 due
  Materials: [Pset 1]

Lecture (Apr 20): Recurrent neural networks -- for language modeling and other tasks
  Suggested Readings: [Recurrent neural network based language model] [Extensions of recurrent neural network language model] [Opinion Mining with Deep Recurrent Neural Networks] [Lecture Notes 4]
  Materials: [slides] [video] [minimal net example (karpathy)] [vanishing grad example] [vanishing grad notebook]

Proposal Due (Apr 21): Course project proposal due
  Materials: [proposal description]

Lecture (Apr 22): GRUs and LSTMs -- for machine translation
  Suggested Readings: [Long Short-Term Memory] [Gated Feedback Recurrent Neural Networks] [Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling]
  Materials: [slides] [video]

Lecture (Apr 27): Recursive neural networks -- for parsing
  Suggested Readings: [Parsing with Compositional Vector Grammars] [Subgradient Methods for Structured Prediction] [Parsing Natural Scenes and Natural Language with Recursive Neural Networks]
  Materials: [slides] [video]

Lecture (Apr 29): Recursive neural networks -- for different tasks (e.g. sentiment analysis)
  Suggested Readings: [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank] [Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection] [Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks]
  Materials: [slides] [video]

A2 Due (Apr 30): Pset #2 due
  Materials: [Pset #2]

Lecture (May 4): Review session for midterm
  Materials: [slides] [video - see Piazza]

Midterm (May 6): In-class midterm

Lecture (May 11): Guest lecture with Jason Weston from Facebook: Neural models with memory -- for question answering
  Suggested Readings: [Memory Networks] [Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks]
  Materials: [slides] [video]

Milestone (May 13): Course project milestone
  Materials: [milestone description]

Lecture (May 13): Convolutional neural networks -- for sentence classification
  Suggested Readings: [A Convolutional Neural Network for Modelling Sentences]
  Materials: [slides] [video]

Lecture (May 18): Guest lecture with Andrew Maas: Speech recognition
  Suggested Readings: [Deep Neural Networks for Acoustic Modeling in Speech Recognition]
  Materials: [slides] [video]

Lecture (May 20): Guest lecture with Elliot English: Efficient implementations and GPUs
  Materials: [slides] [video]

A3 Due (May 21): Pset #3 due
  Materials: [Pset #3]

Lecture (May 27): Applications of Deep Learning to Natural Language Processing
  Materials: [slides] [video]

Lecture (Jun 1): The future of Deep Learning for NLP: Dynamic Memory Networks
  Suggested Readings: [Ask Me Anything: Dynamic Memory Networks for NLP]
  Materials: [slides] [no video]

Poster Presentation (Jun 3): Final project poster presentations, 2-5 pm, Gates patio

Final Project Due (Jun 8): Final course project due
  Materials: [project description]
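One of the most practically useful topics in the syllabus above is the Apr 15 lecture's gradient check: verify an analytic gradient by comparing it against a centered finite-difference estimate. Here is a minimal sketch; the quadratic objective is invented for illustration, not taken from the course materials.

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-5):
    """Centered finite-difference estimate of df/dx for scalar-valued f."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step.flat[i] = eps
        grad.flat[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

# Toy objective: f(x) = sum(x**2), whose analytic gradient is 2x.
f = lambda x: np.sum(x ** 2)
analytic = lambda x: 2 * x

x = np.array([1.0, -2.0, 3.0])
num = numerical_gradient(f, x)
rel_err = np.max(np.abs(num - analytic(x)) / (np.abs(num) + np.abs(analytic(x))))
print(rel_err)  # should be tiny for a correct gradient
```

In practice you run exactly this comparison against your backprop implementation: a small relative error means the analytic gradient is almost certainly correct, while a large one points to a bug.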
3. Andrew Ng's Machine Learning on Coursera (https://www.coursera.org/learn/machine-learning), and Geoffrey Hinton's Neural Networks for Machine Learning (https://www.coursera.org/course/neuralnets).
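Both Coursera courses build on the same workhorse: fitting a model by gradient descent on a loss. As a minimal sketch in that spirit, here is logistic regression trained with batch gradient descent on the cross-entropy loss; the toy dataset is invented for illustration, and the courses' own notation and exercises differ.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.5, steps=2000):
    """Batch gradient descent on the mean cross-entropy loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)           # predicted probabilities
        grad_w = X.T @ (p - y) / len(y)  # dL/dw for cross-entropy loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Linearly separable toy data: label is 1 when x0 + x1 > 1.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
y = np.array([0, 0, 0, 1, 1])
w, b = train_logistic_regression(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(int)
print(preds)  # matches y on this separable toy set
```

The same update rule, generalized through backpropagation to many layers, is the core training loop of the neural-network material in Hinton's course.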
Tags: tutorials