Paper Reading: "Progressive Self-Supervised Attention Learning for Aspect-Level Sentiment Analysis"
2020-04-20 21:50
This paper also targets aspect-level sentiment analysis, though strictly speaking it is more a paper about the attention mechanism itself. It points out a flaw in the attention mechanisms of current SA models: they over-attend to high-frequency words while neglecting low-frequency ones. The paper uses two techniques: first, iteratively masking the word with the largest attention weight; second, adding a regularization term.
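The iterative-masking idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: `attention_fn` is a hypothetical stand-in for the trained model's attention layer, and the toy scoring function exists only to make the loop observable. At each pass the most-attended visible token is recorded (as a candidate supervision signal) and masked out, so the next pass is forced to attend elsewhere:

```python
import numpy as np

def progressive_mask(attention_fn, tokens, n_iters=3):
    """Iteratively extract and mask the token with the highest attention weight."""
    active = np.ones(len(tokens), dtype=bool)   # True = still visible to the model
    extracted = []                              # high-attention words found so far
    for _ in range(n_iters):
        weights = np.asarray(attention_fn(tokens, active), dtype=float)
        weights[~active] = -np.inf              # masked positions can't win again
        top = int(np.argmax(weights))           # most-attended remaining token
        extracted.append(tokens[top])
        active[top] = False                     # mask it for the next iteration
    return extracted

# Toy attention: prefers longer words, just to demonstrate the masking loop.
def toy_attention(tokens, active):
    w = np.array([len(t) for t in tokens], dtype=float)
    return w / w.sum()

print(progressive_mask(toy_attention, ["the", "service", "was", "terrible"], n_iters=2))
# → ['terrible', 'service']
```

Each extracted word can then serve as a pseudo label for attention supervision, which is where the paper's regularization term comes in.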