BioNART: A Biomedical Non-AutoRegressive Transformer for Natural Language Generation

Masaki Asada, Makoto Miwa


Abstract
We propose a novel Biomedical domain-specific Non-AutoRegressive Transformer model for natural language generation: BioNART. Our BioNART is based on an encoder-decoder model, and both the encoder and the decoder are compatible with the widely used BERT architecture, which allows the model to benefit from publicly available pre-trained biomedical language model checkpoints. We performed additional pre-training and then fine-tuned BioNART on biomedical summarization and doctor-patient dialogue tasks. Experimental results show that our BioNART achieves about 94% of the ROUGE score of the pre-trained autoregressive model while realizing an 18-times faster inference speed on the iCliniq dataset.
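
The abstract's key idea, a BERT-compatible encoder-decoder that predicts all output tokens in one parallel pass rather than left-to-right, can be illustrated with a short sketch. The sketch below is a hedged illustration, not the authors' implementation: the mask-placeholder decoding scheme, the fixed target length, and the BioBERT checkpoint name (dmis-lab/biobert-base-cased-v1.1) are assumptions made for demonstration; BioNART's actual pre-training and decoding procedures are described in the paper.

import torch
from transformers import BertConfig, BertModel

class NonAutoregressiveSeq2Seq(torch.nn.Module):
    """Hypothetical sketch of a BERT-compatible non-autoregressive
    encoder-decoder, in the spirit of the BioNART description."""

    def __init__(self, checkpoint="dmis-lab/biobert-base-cased-v1.1"):
        super().__init__()
        # Encoder: initialized from a public biomedical BERT checkpoint,
        # as the BERT-compatible design in the abstract allows.
        self.encoder = BertModel.from_pretrained(checkpoint)
        # Decoder: the same BERT architecture plus cross-attention layers.
        # The cross-attention weights are newly initialized, which is one
        # reason additional pre-training (as the abstract notes) is needed.
        cfg = BertConfig.from_pretrained(checkpoint)
        cfg.is_decoder = True           # required to enable cross-attention;
        cfg.add_cross_attention = True  # note is_decoder also applies a causal
        # self-attention mask, which a fully bidirectional NAR decoder would
        # lift (omitted here for brevity).
        self.decoder = BertModel.from_pretrained(checkpoint, config=cfg)
        self.lm_head = torch.nn.Linear(cfg.hidden_size, cfg.vocab_size)

    @torch.no_grad()
    def generate(self, input_ids, attention_mask, target_len, mask_token_id):
        # One encoder pass over the source text.
        enc = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Non-autoregressive decoding: fill every target position with a
        # [MASK] placeholder and predict all tokens in a single parallel
        # forward pass, instead of one token per decoder step.
        placeholders = torch.full(
            (input_ids.size(0), target_len), mask_token_id,
            dtype=torch.long, device=input_ids.device,
        )
        out = self.decoder(
            input_ids=placeholders,
            encoder_hidden_states=enc.last_hidden_state,
            encoder_attention_mask=attention_mask,
        )
        return self.lm_head(out.last_hidden_state).argmax(dim=-1)

Because the decoder runs once regardless of output length, rather than once per generated token, inference cost drops sharply; this parallelism is what underlies the roughly 18-times speedup over the autoregressive baseline reported in the abstract.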
Anthology ID:
2023.bionlp-1.34
Volume:
The 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Tasks
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Dina Demner-Fushman, Sophia Ananiadou, Kevin Cohen
Venue:
BioNLP
Publisher:
Association for Computational Linguistics
Pages:
369–376
URL:
https://aclanthology.org/2023.bionlp-1.34
DOI:
10.18653/v1/2023.bionlp-1.34
Cite (ACL):
Masaki Asada and Makoto Miwa. 2023. BioNART: A Biomedical Non-AutoRegressive Transformer for Natural Language Generation. In The 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Tasks, pages 369–376, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
BioNART: A Biomedical Non-AutoRegressive Transformer for Natural Language Generation (Asada & Miwa, BioNLP 2023)
PDF:
https://aclanthology.org/2023.bionlp-1.34.pdf