Agreement on Target-Bidirectional LSTMs for Sequence-to-Sequence Learning
File size: 972k
Description: Recurrent neural networks, particularly long short-term memory (LSTM) networks, are extremely appealing for sequence-to-sequence learning tasks. Despite their great success, they typically suffer from a fundamental shortcoming: they are prone to generating unbalanced targets with good prefixes but bad suffixes, so performance degrades on long sequences. We propose a simple yet effective approach to overcome this shortcoming. Our approach relies on the agreement between a pair of target-directional LSTMs, which generates more balanced targets. In addition, we develop two efficient approximate search methods for agreement that are empirically shown to be almost optimal in terms of sequence-level losses. Extensive experiments were performed on two standard sequence-to-sequence transduction tasks: machine transliteration and grapheme-to-phoneme transformation. The results show that the proposed approach achieves consistent and substantial improvements compared to six state-of-the-art systems. In particular, our approach outperforms the best reported error rates by a clear margin (up to 9% relative gains) on the grapheme-to-phoneme task. Our toolkit is publicly available at https://github.com/lemaoliu/Agtarbidir.
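The core idea in the abstract can be illustrated with a toy sketch (this is not the authors' implementation; the probability tables, names, and reranking-by-sum formulation below are illustrative assumptions): each candidate target is scored by a left-to-right model and a right-to-left model, and the agreement objective prefers the candidate whose joint (summed) log-probability is highest, penalizing outputs that only one direction likes.

```python
import math

def joint_score(candidate, p_l2r, p_r2l):
    """Agreement score: sum of the two directional log-probabilities.
    In the paper these would come from two target-directional LSTM
    decoders; here they are toy lookup tables (an assumption)."""
    return math.log(p_l2r[candidate]) + math.log(p_r2l[candidate])

# Hypothetical candidates: the left-to-right model favors a target with a
# good prefix but bad suffix, the right-to-left model the reverse, and the
# balanced target is acceptable to both.
p_l2r = {"good-prefix": 0.50, "balanced": 0.35, "good-suffix": 0.15}
p_r2l = {"good-prefix": 0.15, "balanced": 0.35, "good-suffix": 0.50}

best = max(p_l2r, key=lambda y: joint_score(y, p_l2r, p_r2l))
print(best)  # the balanced candidate wins under the joint score
```

Here 0.35 * 0.35 = 0.1225 exceeds 0.50 * 0.15 = 0.075 for either one-sided candidate, so agreement reranking selects the balanced output even though neither directional model ranks it first on its own.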
Note: this source package contains no source files that can be viewed directly online; please download the package.