Exploiting Labeled and Unlabeled Data via Transformer Fine-tuning for Peer-Review Score Prediction

Panitan Muangkammuen, Fumiyo Fukumoto, Jiyi Li, Yoshimi Suzuki


Abstract
Automatic Peer-review Aspect Score Prediction (PASP) for academic papers can be a helpful assistant tool for both reviewers and authors. Most existing work on PASP uses supervised learning techniques. However, the limited amount of peer-review data degrades PASP performance. This paper presents a novel semi-supervised learning (SSL) method that incorporates Transformer fine-tuning into the Γ-model, a variant of the Ladder network, to leverage contextual features from unlabeled data. Backpropagation simultaneously minimizes the sum of the supervised and unsupervised cost functions, avoiding the need for layer-wise pre-training. The experimental results show that our model outperforms both supervised and naive semi-supervised learning baselines. Our source code is available online.
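
The released package (see Software below) is the authoritative implementation. Purely as an illustration of the objective the abstract describes, here is a minimal sketch of a Γ-model-style training step, assuming PyTorch and a Hugging Face-style Transformer encoder. The names (GammaPASP, score_head, denoise, gamma_loss) are hypothetical, and the corruption is simplified to additive Gaussian noise on the top-layer feature rather than the paper's exact setup.

import torch
import torch.nn as nn

class GammaPASP(nn.Module):
    # Sketch only: the Gamma-model keeps the Ladder network's denoising
    # cost at the top layer, so one encoder pass feeds both paths here.
    def __init__(self, encoder, hidden_size=768):
        super().__init__()
        self.encoder = encoder                        # pretrained Transformer
        self.score_head = nn.Linear(hidden_size, 1)   # aspect-score regressor
        self.denoise = nn.Linear(hidden_size, hidden_size)  # top-layer denoiser

    def forward(self, input_ids, attention_mask, noise_std=0.1):
        out = self.encoder(input_ids, attention_mask=attention_mask)
        z_clean = out.last_hidden_state[:, 0]         # [CLS] feature, clean path
        z_noisy = z_clean + noise_std * torch.randn_like(z_clean)  # corrupted path
        return self.score_head(z_noisy), self.denoise(z_noisy), z_clean

def gamma_loss(model, labeled_batch, unlabeled_batch, lam=1.0):
    mse = nn.MSELoss()
    # Supervised cost: predict aspect scores for labeled papers.
    pred, _, _ = model(labeled_batch["input_ids"],
                       labeled_batch["attention_mask"])
    sup = mse(pred.squeeze(-1), labeled_batch["score"])
    # Unsupervised cost: the denoised corrupted feature should
    # reconstruct the clean feature, computed on unlabeled papers.
    _, z_hat, z_clean = model(unlabeled_batch["input_ids"],
                              unlabeled_batch["attention_mask"])
    unsup = mse(z_hat, z_clean.detach())
    # A single backward pass minimizes the sum of both costs,
    # avoiding layer-wise pre-training.
    return sup + lam * unsup

A training loop would simply call gamma_loss(model, labeled, unlabeled).backward() on mixed labeled/unlabeled batches and step an optimizer.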
Anthology ID:
2022.findings-emnlp.164
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2233–2240
URL:
https://aclanthology.org/2022.findings-emnlp.164
DOI:
10.18653/v1/2022.findings-emnlp.164
Cite (ACL):
Panitan Muangkammuen, Fumiyo Fukumoto, Jiyi Li, and Yoshimi Suzuki. 2022. Exploiting Labeled and Unlabeled Data via Transformer Fine-tuning for Peer-Review Score Prediction. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2233–2240, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Exploiting Labeled and Unlabeled Data via Transformer Fine-tuning for Peer-Review Score Prediction (Muangkammuen et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.164.pdf
Software:
2022.findings-emnlp.164.software.zip
Video:
https://aclanthology.org/2022.findings-emnlp.164.mp4