
Probabilistic Language Modeling (Part 3) --- A Roundup of Training Tools

2016-03-31 15:28
Traditional Algorithms

1) BerkeleyLM is written in Java; it is reportedly on par with KenLM and uses less memory than SRILM
https://github.com/adampauls/berkeleylm
2) MITLM (The MIT Language Modeling Toolkit) does parameter optimization particularly well
https://code.google.com/p/mitlm/ or https://github.com/mitlm/mitlm
3) SRILM (The SRI Language Modeling Toolkit) is the veteran language modeling toolkit: developed by SRI (Stanford Research Institute), widely used, written in C++; a usage sketch follows below
http://www.speech.sri.com/projects/srilm/
In addition, maximum entropy (MaxEnt) language models are supported in the SRILM toolkit:
https://phon.ioc.ee/dokuwiki/doku.php?id=people:tanel:srilm-me.en
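A minimal sketch of a typical SRILM workflow, driven from Python. It assumes the SRILM binaries ngram-count and ngram are on PATH; the file names train.txt, test.txt, and model.arpa are placeholders:

    import subprocess

    # Train a trigram LM with modified Kneser-Ney smoothing.
    subprocess.run([
        "ngram-count",
        "-text", "train.txt",   # training corpus, one sentence per line
        "-order", "3",          # trigram model
        "-kndiscount",          # modified Kneser-Ney discounting
        "-interpolate",         # interpolate with lower-order estimates
        "-lm", "model.arpa",    # write the LM in ARPA format
    ], check=True)

    # Report the model's perplexity on held-out text.
    subprocess.run(["ngram", "-lm", "model.arpa", "-ppl", "test.txt"],
                   check=True)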
4) IRSTLM (IRST Language Modeling Toolkit), developed by the FBK-IRST lab in Trento, Italy, to handle larger-scale training data; integrated into Moses (a popular open source Statistical Machine Translation decoder). When training a language model, IRSTLM partitions the dictionary, trains in blocks, and merges them quickly, which gives it excellent performance on large corpora. Training proceeds in five steps: a) build a vocabulary with word frequencies from the training corpus; b) split the vocabulary into several sub-vocabularies, balanced by word frequency; c) count n-grams for each sub-vocabulary, where every n-gram must begin with a word from that sub-vocabulary; d) build a sub language model from each set of counts produced in step c); e) merge all the sub language models into the final language model (a sketch of this scheme follows below)
http://hlt-mt.fbk.eu/technologies/irstlm or https://github.com/irstlm-team/irstlm
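To make the split-and-merge scheme above concrete, here is a minimal self-contained Python sketch of steps a) through e). It illustrates the idea only and is not IRSTLM's actual implementation; all names are hypothetical:

    from collections import Counter

    def split_vocab(corpus, n_parts):
        # Steps a)-b): count word frequencies, then greedily assign each word
        # (most frequent first) to the currently lightest sub-vocabulary so
        # that total frequency is roughly balanced across parts.
        freq = Counter(w for line in corpus for w in line.split())
        parts = [set() for _ in range(n_parts)]
        loads = [0] * n_parts
        for word, count in freq.most_common():
            i = loads.index(min(loads))
            parts[i].add(word)
            loads[i] += count
        return parts

    def count_ngrams(corpus, sub_vocab, order=3):
        # Step c): count only the n-grams whose first word is in sub_vocab,
        # so each shard's counts are disjoint from every other shard's.
        counts = Counter()
        for line in corpus:
            words = line.split()
            for i in range(len(words) - order + 1):
                gram = tuple(words[i:i + order])
                if gram[0] in sub_vocab:
                    counts[gram] += 1
        return counts

    corpus = ["the cat sat on the mat", "the dog sat on the log"]
    shards = [count_ngrams(corpus, part) for part in split_vocab(corpus, 2)]
    # Steps d)-e): each shard would become a sub language model; because the
    # shards are disjoint, merging is a cheap union (here, summing counts).
    merged = sum(shards, Counter())
    print(merged.most_common(3))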
5) KenLM (Kenneth Heafield's language model toolkit): its standout features are speed and low memory use. It claims to be somewhat better than SRILM, supports large-data training on a single machine, and provides both C++ and Python interfaces (a Python usage sketch follows below)
http://kheafield.com/code/kenlm/ or https://github.com/kpu/kenlm
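A small sketch of KenLM's Python interface, assuming the kenlm module is installed (pip install kenlm) and that an ARPA file was built beforehand with KenLM's lmplz tool; lm.arpa and the example sentence are placeholders:

    import kenlm

    # lm.arpa is typically produced on the command line first, e.g.
    #   lmplz -o 5 < corpus.txt > lm.arpa
    model = kenlm.Model("lm.arpa")

    sentence = "language modeling is fun"
    print(model.score(sentence, bos=True, eos=True))  # total log10 probability
    print(model.perplexity(sentence))                 # per-sentence perplexity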
6) Bigfatlm provides Hadoop training of Kneser-Ney language models; written in Java
https://github.com/jhclark/bigfatlm
7) Kylm (Kyoto Language Modeling Toolkit), written in Java; can output models in WFST format for use with WFST decoders
http://www.phontron.com/kylm/  or  https://github.com/neubig/kylm
8) OpenGrm, a language modeling toolkit for use with OpenFst; makes and modifies n-gram language models encoded as weighted finite-state transducers (FSTs). A pipeline sketch follows below
http://opengrm.org/
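As a hedged sketch, the canonical OpenGrm NGram pipeline (per the project's quick tour) compiles text into an FST archive, counts n-grams, and smooths them into an FST language model. File names are placeholders, and flag spellings may differ across versions:

    import subprocess

    def sh(cmd):
        # Helper: run a shell pipeline and fail loudly on error.
        subprocess.run(cmd, shell=True, check=True)

    sh("ngramsymbols corpus.txt > corpus.syms")         # build a symbol table
    sh("farcompilestrings -symbols=corpus.syms -keep_symbols=1 "
       "corpus.txt > corpus.far")                       # compile text to a FAR
    sh("ngramcount -order=3 corpus.far > corpus.cnts")  # count trigrams
    sh("ngrammake corpus.cnts > corpus.mod")            # smooth into an FST LM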
Deep Learning

1) RNNLM (Recurrent Neural Network Language Model toolkit); a training sketch follows below
http://rnnlm.org/ or http://www.fit.vutbr.cz/~imikolov/rnnlm/
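A minimal training-and-evaluation sketch using Mikolov's rnnlm command-line tool, assuming the binary is on PATH; file names and hyperparameters are placeholders modeled on the toolkit's example scripts:

    import subprocess

    # Train a small RNN LM; word classes factorize the output layer for speed.
    subprocess.run([
        "rnnlm",
        "-train", "train.txt",   # training corpus
        "-valid", "valid.txt",   # validation set for early stopping
        "-rnnlm", "model.rnn",   # output model file
        "-hidden", "100",        # hidden layer size
        "-class", "100",         # number of output word classes
        "-bptt", "4",            # backpropagation-through-time steps
    ], check=True)

    # Evaluate: reports log-probability and perplexity on the test set.
    subprocess.run(["rnnlm", "-rnnlm", "model.rnn", "-test", "test.txt"],
                   check=True)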
2) BRNNLM (Bayesian Recurrent Neural Network for Language Modeling)
http://chien.cm.nctu.edu.tw/bayesian-recurrent-neural-network-for-language-modeling
3) RWTHLM (RWTH Aachen University Neural Network Language Modeling Toolkit; includes feedforward, recurrent, and long short-term memory neural networks)
http://www-i6.informatik.rwth-aachen.de/web/Software/rwthlm.php
4) Character-Aware Neural Language Models: employs a convolutional neural network (CNN) over characters, whose output is used as the input to a long short-term memory (LSTM) recurrent neural network language model (RNN-LM)
https://github.com/yoonkim/lstm-char-cnn