Title: | A Review on Question and Answer System for COVID-19 Literature on Pre-Trained Models |
Authors: | Bavishi, Hilloni Nandy, Debalina |
Keywords: | BERT; CORD-19; COVID-19; NLP (Natural Language Processing); Question Answering System; SPECTER |
Issue Date: | May-2021 |
Publisher: | International Journal of Advanced Research (IJAR) |
Citation: | Bavishi, H., Nandy, D. (2021). A Review on Question and Answer System for COVID-19 Literature on Pre-Trained Models. International Journal of Advanced Research (IJAR), 9(05), ISSN: 2320-5407, Article DOI: 10.21474/IJAR01/12836 DOI URL: http://dx.doi.org/10.21474/IJAR01/12836 |
Abstract: | The COVID-19 literature has grown at a rapid pace, and the Artificial Intelligence community, along with researchers all over the globe, has a responsibility to help the medical community. The CORD-19 dataset contains numerous articles about COVID-19, SARS-CoV-2, and related coronaviruses. Due to the massive size of this literature, it is difficult to find relevant and accurate pieces of information. Question answering systems have been built by taking pre-trained models and fine-tuning them using BERT Transformers. BERT is a language model that learns effectively from both token-level and sentence-level training. Variants of BERT such as ALBERT, DistilBERT, RoBERTa, and SciBERT, along with BioSentVec, can be effective in training the model, as they help improve accuracy and increase training speed. This review also discusses using SPECTER document-level relatedness embeddings, such as the CORD-19 embeddings, for pre-training a Transformer language model. This article will help in building a question answering model to facilitate research and save lives in the fight against COVID-19. |
URI: | http://10.9.150.37:8080/dspace//handle/atmiyauni/1098 |
ISSN: | 2320-5407 |
Appears in Collections: | 01. Journal Articles |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
721) 62041_Debalina Nandy.pdf | | 602.84 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
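The abstract above describes extractive question answering with BERT-style pre-trained models. The core mechanism of such a QA head is span selection: the model emits a start logit and an end logit for each passage token, and the answer is the highest-scoring valid (start, end) pair. The sketch below illustrates only that selection step with toy logits and tokens; it is not from the reviewed article, and no real model is involved.

```python
# Minimal sketch of extractive QA span selection, as performed on the
# start/end logits produced by a BERT-style question answering head.
# All tokens and logit values here are illustrative, not model output.

def best_span(start_logits, end_logits, max_len=15):
    """Pick the (start, end) pair maximizing start + end score,
    requiring end >= start and capping span length at max_len."""
    best = (0, 0)
    best_score = float("-inf")
    for s, s_score in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_score + end_logits[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best

# Toy passage and logits (illustrative values).
tokens = ["CORD-19", "contains", "articles", "about", "COVID-19",
          "and", "related", "coronaviruses"]
start_logits = [0.1, 0.2, 0.1, 0.0, 2.5, 0.1, 0.3, 0.4]
end_logits   = [0.0, 0.1, 0.2, 0.1, 1.0, 0.2, 0.5, 2.0]

s, e = best_span(start_logits, end_logits)
answer = " ".join(tokens[s:e + 1])  # → "COVID-19 and related coronaviruses"
```

In a real fine-tuned model the two logit vectors come from a linear layer over BERT's token representations; the length cap and the end >= start constraint are the standard validity checks applied at decoding time.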