GenSF: Simultaneous Adaptation of Generative Pre-trained Models and Slot Filling

Shikib Mehri, Maxine Eskenazi


Abstract
In transfer learning, it is imperative to achieve strong alignment between a pre-trained model and a downstream task. Prior work has done this by proposing task-specific pre-training objectives, which sacrifices the inherent scalability of the transfer learning paradigm. We instead achieve strong alignment by simultaneously modifying both the pre-trained model and the formulation of the downstream task, which is more efficient and preserves the scalability of transfer learning. We present GenSF (Generative Slot Filling), which leverages a generative pre-trained open-domain dialog model for slot filling. GenSF (1) adapts the pre-trained model by incorporating inductive biases about the task and (2) adapts the downstream task by reformulating slot filling to better leverage the pre-trained model’s capabilities. GenSF achieves state-of-the-art results on two slot filling datasets with strong gains in few-shot and zero-shot settings. We achieve a 9 F1 score improvement in zero-shot slot filling. This highlights the value of strong alignment between the pre-trained model and the downstream task.
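The central idea of the abstract can be illustrated with a short sketch: slot filling is recast as response generation, so a pre-trained open-domain dialog model can produce slot values directly instead of serving only as an encoder for token tagging. The sketch below is not the authors' implementation (that is in the shikib/generative_slot_filling repository linked under Code); the model choice (DialoGPT), the prompt wording, and the example slots are illustrative assumptions.

# Minimal sketch, assuming a HuggingFace causal dialog model and
# hand-written natural-language questions per slot; not the GenSF code.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "microsoft/DialoGPT-small"  # assumed stand-in dialog model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

def fill_slot(utterance: str, slot_question: str, max_new_tokens: int = 10) -> str:
    """Reformulate slot filling as dialog response generation.

    The user utterance plus a question about the slot form the dialog
    context; the model's generated response is read off as the slot value.
    """
    prompt = utterance + tokenizer.eos_token + slot_question + tokenizer.eos_token
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only the newly generated tokens (the model's "answer").
    generated = output_ids[0, inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(generated, skip_special_tokens=True).strip()

if __name__ == "__main__":
    utterance = "Book me a table for four at Sakura Garden tomorrow at 7pm."
    # Hypothetical slot questions, one per slot of interest.
    questions = {
        "restaurant_name": "What is the name of the restaurant?",
        "time": "What time is the booking for?",
    }
    for slot, question in questions.items():
        print(slot, "->", fill_slot(utterance, question))

Because the task is expressed in natural language, this formulation works with no task-specific head, which is why gains are largest in the zero-shot and few-shot settings described above.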
Anthology ID:
2021.sigdial-1.51
Volume:
Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
July
Year:
2021
Address:
Singapore and Online
Editors:
Haizhou Li, Gina-Anne Levow, Zhou Yu, Chitralekha Gupta, Berrak Sisman, Siqi Cai, David Vandyke, Nina Dethlefs, Yan Wu, Junyi Jessy Li
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
489–498
URL:
https://aclanthology.org/2021.sigdial-1.51
DOI:
10.18653/v1/2021.sigdial-1.51
Cite (ACL):
Shikib Mehri and Maxine Eskenazi. 2021. GenSF: Simultaneous Adaptation of Generative Pre-trained Models and Slot Filling. In Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 489–498, Singapore and Online. Association for Computational Linguistics.
Cite (Informal):
GenSF: Simultaneous Adaptation of Generative Pre-trained Models and Slot Filling (Mehri & Eskenazi, SIGDIAL 2021)
PDF:
https://aclanthology.org/2021.sigdial-1.51.pdf
Video:
https://www.youtube.com/watch?v=PNCr4am-1Gc
Code:
shikib/generative_slot_filling