
ZQM at SemEval-2019 Task9: A Single Layer CNN Based on Pre-trained Model for Suggestion Mining

Qimin Zhou, Zhengxin Zhang, Hao Wu, Linmao Wang


Abstract
This paper describes our system that competed in SemEval-2019 Task 9, SubTask A: "Suggestion Mining from Online Reviews and Forums". Our system combines a convolutional neural network with the BERT model to perform suggestion mining. The input to the convolutional neural network is the embedding vectors extracted from the pre-trained BERT model, and to improve the effectiveness of the whole system, the pre-trained BERT model is fine-tuned on the provided datasets before the embeddings are extracted. Empirical results show the effectiveness of our model, which obtained 9th position out of 34 teams with an F1 score of 0.715.
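The pipeline described in the abstract can be illustrated with a minimal sketch (not the authors' released code): contextual token embeddings are taken from a pre-trained BERT model and fed into a single convolutional layer with max-over-time pooling and a linear classifier. The checkpoint name, filter count, kernel size, and the use of PyTorch with the Hugging Face transformers library are assumptions for illustration only; per the abstract, BERT would first be fine-tuned on the Task 9 data before the embeddings are extracted.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

# Assumed checkpoint; the BERT model fine-tuned on the Task 9 data would be loaded here instead.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

class SingleLayerCNN(nn.Module):
    # One convolutional layer over BERT token embeddings, max-over-time pooling,
    # and a linear layer producing suggestion / non-suggestion logits.
    def __init__(self, emb_dim=768, n_filters=128, kernel_size=3, n_classes=2):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size)
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, embeddings):                 # (batch, seq_len, emb_dim)
        x = embeddings.transpose(1, 2)             # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))               # (batch, n_filters, seq_len - k + 1)
        x = x.max(dim=2).values                    # max-over-time pooling
        return self.fc(x)

sentence = "The app should let users customize notification sounds."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    embeddings = bert(**inputs).last_hidden_state  # contextual token embeddings from BERT
logits = SingleLayerCNN()(embeddings)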
Anthology ID: S19-2226
Volume: Proceedings of the 13th International Workshop on Semantic Evaluation
Month: June
Year: 2019
Address: Minneapolis, Minnesota, USA
Editors: Jonathan May, Ekaterina Shutova, Aurelie Herbelot, Xiaodan Zhu, Marianna Apidianaki, Saif M. Mohammad
Venue: SemEval
SIG: SIGLEX
Publisher: Association for Computational Linguistics
Pages: 1287–1291
URL: https://aclanthology.org/S19-2226
DOI: 10.18653/v1/S19-2226
Cite (ACL): Qimin Zhou, Zhengxin Zhang, Hao Wu, and Linmao Wang. 2019. ZQM at SemEval-2019 Task9: A Single Layer CNN Based on Pre-trained Model for Suggestion Mining. In Proceedings of the 13th International Workshop on Semantic Evaluation, pages 1287–1291, Minneapolis, Minnesota, USA. Association for Computational Linguistics.
Cite (Informal): ZQM at SemEval-2019 Task9: A Single Layer CNN Based on Pre-trained Model for Suggestion Mining (Zhou et al., SemEval 2019)
PDF: https://aclanthology.org/S19-2226.pdf