Retraining a BERT Model for Transfer Learning in Requirements Engineering: A Preliminary Study
In recent years, advanced deep learning language models such as BERT, ELMo, ULMFiT and GPT have demonstrated strong performance on many general natural language processing (NLP) tasks. BERT, in particular, has also achieved promising results on some domain-specific tasks, including requirements classification. However, in spite of its great potential, BERT is reported to underperform on domain-specific tasks when applied without domain adaptation, as it is pretrained on general-domain text. In this paper, we present BERT4RE, a BERT-based model retrained on requirements texts and aimed at supporting a wide range of requirements engineering (RE) tasks, including classifying requirements, detecting language issues, identifying key domain concepts, and establishing requirements traceability links. We demonstrate the transferability of BERT4RE by fine-tuning it for the task of identifying key domain concepts. Our preliminary study shows that BERT4RE achieved better results than the BERT base model on the demonstrated RE task.
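The recipe the abstract describes, continuing BERT's pretraining on requirements texts and then fine-tuning the retrained model for a specific RE task, can be sketched with the Hugging Face transformers library. The following is a minimal illustration, not the authors' released implementation: the two-sentence corpus, the BIO-style label scheme, the "bert4re" output path, and all hyperparameters are placeholder assumptions.

```python
# Minimal sketch (assumed setup, not the authors' code) of the two-step recipe:
# (1) retrain BERT on requirements texts via masked language modelling,
# (2) fine-tune the retrained encoder for key domain-concept identification.
from datasets import Dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Step 1: continue masked-language-model pretraining on requirements text.
requirements = [  # placeholder corpus; the paper uses a real requirements corpus
    "The system shall encrypt all user data at rest.",
    "The operator must be able to override the autopilot within 2 seconds.",
]
corpus = Dataset.from_dict({"text": requirements}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)
mlm_model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
trainer = Trainer(
    model=mlm_model,
    args=TrainingArguments(output_dir="bert4re", num_train_epochs=1),
    train_dataset=corpus,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15),
)
trainer.train()
trainer.save_model("bert4re")  # retrained encoder, analogous in role to BERT4RE

# Step 2: fine-tune the retrained encoder for a downstream RE task, e.g.
# tagging tokens that realise key domain concepts.
concept_tagger = AutoModelForTokenClassification.from_pretrained(
    "bert4re", num_labels=3  # O / B-CONCEPT / I-CONCEPT (assumed label scheme)
)
# ...then train concept_tagger with another Trainer on token-labelled data.
```

Transfer learning enters in step 2: the task head is new, but the encoder weights start from the requirements-adapted checkpoint rather than from generic BERT.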
Fri 19 Aug (displayed time zone: Hobart)

19:00 - 20:10 | Session: Artificial Intelligence for RE (RE@Next! Papers / Research Papers) at Dibbler
Chair(s): Rifat Ara Shams (CSIRO's Data61)

19:00 (30m) | Talk: Automatic Terminology Extraction and Ranking for Feature Modeling (Research Papers)
Jianzhang Zhang (Alibaba Business School, Hangzhou Normal University), Sisi Chen (Alibaba Business School, Hangzhou Normal University, Hangzhou, China), Jinping Hua (Alibaba Business School, Hangzhou Normal University, Hangzhou, China), Nan Niu (University of Cincinnati), Chuang Liu (Alibaba Business School, Hangzhou Normal University, Hangzhou, China)

19:30 (20m) | Talk: Done is better than perfect: Iterative Adaptation via Multi-grained Requirement Relaxation (RE@Next! Papers)

19:50 (20m) | Talk: Retraining a BERT Model for Transfer Learning in Requirements Engineering: A Preliminary Study (RE@Next! Papers)