[ACL2015] A Dependency-Based Neural Network for Relation Classification
2017-12-12 08:55
Wow... it's been a long time since I last wrote paper notes. Starting today I should post one every day — writing daily feels steady and good.
The paper highlights one key property of each of the two network types; I'll put them up front since they are the crux:
CNN is suitable for capturing the most useful features in a flat structure.
RNN (recursive neural network) is good at modeling hierarchical structures.
Task of this paper: relation classification.
Experimental data: the SemEval-2010 dataset.
Previous studies suggest that the two most useful kinds of dependency information for relation classification are (1) the shortest dependency path and (2) dependency subtrees.
The goal of this paper is to combine these two kinds of information effectively for relation classification.
![](https://img-blog.csdn.net/20171212165540838?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvYXBwbGVtbA==/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast)
The figure above shows two entity pairs that, in different sentences, share the same shortest dependency path yet hold completely different relations. Judging the relation from the shortest dependency path alone is therefore unreliable, so the dependency subtrees attached to the words on the path are used as additional features. This raises a new question: how to fuse the shortest dependency path and the dependency subtrees so that they jointly determine the relation between the entity pair.
![](https://img-blog.csdn.net/20171212170140331?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvYXBwbGVtbA==/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast)
Let's walk through the recursive process for one subtree. Assume every word on the shortest dependency path has a subtree (if it has none, this is represented by a shared vector c_LEAF); the embedding of the word "the" is written x_the.
![](https://img-blog.csdn.net/20171212171021687?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvYXBwbGVtbA==/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast)
In the branch shown above, the word "the" has no subtree, so c_LEAF is concatenated to it. The concatenation of a word with its subtree information is shown below.
![](https://img-blog.csdn.net/20171212171155509?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvYXBwbGVtbA==/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast)
The word "the" belongs to the subtree of "Sabbath", whose subtree vector is computed as follows:
![](https://img-blog.csdn.net/20171212171749923?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvYXBwbGVtbA==/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast)
The word vector x_Sabbath of "Sabbath" is concatenated with its subtree vector c_Sabbath as follows:
![](https://img-blog.csdn.net/20171212171918289?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvYXBwbGVtbA==/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast)
Continuing upward in this way yields the embedding emb1 for one subtree of "broken"; the other branch gives emb2 in the same way. The two branch vectors are summed and then concatenated with x_broken to obtain the final vector representing "broken".
![](https://img-blog.csdn.net/20171212171217448?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvYXBwbGVtbA==/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast)
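The bottom-up composition above can be sketched in a few lines of NumPy. This is only a minimal sketch of the idea: the dimensions, the tanh nonlinearity, the relation set, and the per-relation matrices `W_rel` are illustrative assumptions, not the paper's exact hyperparameters.

```python
import numpy as np

d_word, d_sub = 4, 3          # word-embedding size, subtree-embedding size (assumed)

rng = np.random.default_rng(0)
W_rel = {r: rng.normal(scale=0.1, size=(d_sub, d_word + d_sub))
         for r in ["det", "nmod", "case"]}   # one matrix per dependency relation (assumed set)
b = np.zeros(d_sub)
c_LEAF = np.zeros(d_sub)       # shared vector for words with no subtree

def subtree_embedding(children):
    """children: list of (relation, child_word_vec, child_subtree_vec)."""
    if not children:                                  # leaf word -> c_LEAF
        return c_LEAF
    total = np.zeros(d_sub)
    for rel, x_child, c_child in children:
        p_child = np.concatenate([x_child, c_child])  # word ; subtree concatenation
        total += W_rel[rel] @ p_child
    return np.tanh(total + b)

# e.g. "the" is a leaf; "Sabbath" has the child "the" via the "det" relation
x_the, x_sabbath = rng.normal(size=d_word), rng.normal(size=d_word)
c_the = subtree_embedding([])                                  # = c_LEAF
c_sabbath = subtree_embedding([("det", x_the, c_the)])
p_sabbath = np.concatenate([x_sabbath, c_sabbath])             # vector fed upward
```

The same call is then applied at "broken": each branch's result is summed inside `subtree_embedding`, and the output is concatenated with x_broken.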
Finally, a CNN convolves over the flat structure — the shortest dependency path augmented with the subtree embeddings — and the pooled features are classified to obtain the relation between the entities.
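The final step can likewise be sketched as a 1-D convolution over the sequence of augmented word vectors along the path, followed by max pooling and a softmax. The filter width, dimensions, and the 19 classes (SemEval-2010 Task 8 has 9 directed relations plus Other) are illustrative; this is not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
d_p, win, n_filters, n_classes = 7, 3, 8, 19    # assumed sizes; 19 = SemEval-2010 Task 8 labels

W_conv = rng.normal(scale=0.1, size=(n_filters, win * d_p))
W_out  = rng.normal(scale=0.1, size=(n_classes, n_filters))

def classify_path(P):
    """P: (path_len, d_p) matrix of augmented word vectors along the SDP."""
    feats = []
    for i in range(P.shape[0] - win + 1):        # slide a window of `win` words
        window = P[i:i + win].reshape(-1)        # concatenate the window's vectors
        feats.append(np.tanh(W_conv @ window))
    pooled = np.max(feats, axis=0)               # max pooling over path positions
    logits = W_out @ pooled
    e = np.exp(logits - logits.max())
    return e / e.sum()                           # softmax over relation classes

probs = classify_path(rng.normal(size=(5, d_p)))  # a path of 5 augmented word vectors
```

In practice the `argmax` of `probs` would be taken as the predicted relation for the entity pair.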