Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP

Hao Fei, Yafeng Ren, Donghong Ji


Abstract
Syntax has been shown to be useful for various NLP tasks, while existing work mostly encodes a single syntactic tree using one hierarchical neural network. In this paper, we investigate a simple and effective method, Knowledge Distillation, to integrate heterogeneous structure knowledge into a unified sequential LSTM encoder. Experimental results on four typical syntax-dependent tasks show that our method outperforms tree encoders by effectively integrating rich heterogeneous structural syntax while reducing error propagation, and also outperforms ensemble methods in terms of both efficiency and accuracy.
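To make the core idea concrete, below is a minimal, hypothetical sketch of tree-to-sequence knowledge distillation as the abstract describes it: a sequential LSTM student is trained both on gold labels and to mimic the output distributions of heterogeneous tree-structured teachers. All module names, hyperparameters, and the stand-in teacher logits are illustrative assumptions, not the paper's actual architecture or training recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMStudent(nn.Module):
    """Sequential encoder + classifier (the 'student'); hypothetical sizes."""
    def __init__(self, vocab_size=1000, emb_dim=64, hidden=128, num_labels=5):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.clf = nn.Linear(2 * hidden, num_labels)

    def forward(self, tokens):
        h, _ = self.lstm(self.emb(tokens))  # (batch, time, 2 * hidden)
        return self.clf(h.mean(dim=1))      # mean-pooled sentence logits

def distillation_loss(student_logits, teacher_logits_list, gold, alpha=0.5, T=2.0):
    """Cross-entropy on gold labels plus averaged KL divergence to each
    (tree-structured) teacher's softened output distribution."""
    ce = F.cross_entropy(student_logits, gold)
    kd = sum(
        F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(t / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        for t in teacher_logits_list
    ) / len(teacher_logits_list)
    return alpha * ce + (1 - alpha) * kd

# Toy usage: two random tensors stand in for precomputed logits from
# heterogeneous tree encoders (e.g., constituency- and dependency-based).
student = LSTMStudent()
tokens = torch.randint(0, 1000, (8, 12))   # batch of token-id sequences
gold = torch.randint(0, 5, (8,))
teacher_logits = [torch.randn(8, 5), torch.randn(8, 5)]

loss = distillation_loss(student(tokens), teacher_logits, gold)
loss.backward()
```

Because the student only mimics the teachers' distributions at training time, inference needs just the single sequential LSTM, which is how this setup can beat ensembles on efficiency.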
Anthology ID: 2020.findings-emnlp.18
Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
Month: November
Year: 2020
Address: Online
Editors: Trevor Cohn, Yulan He, Yang Liu
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 183–193
URL: https://aclanthology.org/2020.findings-emnlp.18
DOI: 10.18653/v1/2020.findings-emnlp.18
Cite (ACL):
Hao Fei, Yafeng Ren, and Donghong Ji. 2020. Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 183–193, Online. Association for Computational Linguistics.
Cite (Informal):
Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP (Fei et al., Findings 2020)
PDF: https://aclanthology.org/2020.findings-emnlp.18.pdf
Data: OntoNotes 5.0, SST