Fri 19 Aug 2022 19:50 - 20:10 at Dibbler - Artificial Intelligence for RE Chair(s): Rifat Ara Shams

In recent years, advanced deep learning language models such as BERT, ELMo, ULMFiT and GPT have demonstrated strong performance on many general natural language processing (NLP) tasks. BERT, in particular, has also achieved promising results on some domain-specific tasks, including requirements classification. However, despite its great potential, BERT has been reported to underperform on domain-specific tasks. In this paper, we present BERT4RE, a BERT-based model retrained on requirements texts, aiming to support a wide range of requirements engineering (RE) tasks, including classifying requirements, detecting language issues, identifying key domain concepts, and establishing requirements traceability links. We demonstrate the transferability of BERT4RE by fine-tuning it for the task of identifying key domain concepts. Our preliminary study shows that BERT4RE achieved better results than the BERT-base model on the demonstrated RE task.
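Retraining a BERT model on domain text, as the abstract describes, is typically done with the masked-language-model (MLM) objective. As a rough, self-contained illustration of that objective (not the authors' actual pipeline; the token list and tiny vocabulary below are invented), the standard BERT corruption rule selects about 15% of tokens and, of those, replaces 80% with [MASK], 10% with a random vocabulary token, and leaves 10% unchanged:

```python
import random

MASK = "[MASK]"
# Toy vocabulary for the "random replacement" branch (invented for illustration).
VOCAB = ["requirement", "system", "shall", "user", "data"]

def mask_for_mlm(tokens, mask_prob=0.15, rng=None):
    """BERT-style MLM corruption of a token sequence.

    Returns (corrupted, labels): labels holds the original token at
    corrupted positions (what the model must predict) and None elsewhere
    (positions ignored by the training loss).
    """
    rng = rng or random.Random()
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)              # model must recover the original
            roll = rng.random()
            if roll < 0.8:
                corrupted.append(MASK)      # 80%: replace with [MASK]
            elif roll < 0.9:
                corrupted.append(rng.choice(VOCAB))  # 10%: random token
            else:
                corrupted.append(tok)       # 10%: keep unchanged
        else:
            labels.append(None)             # not selected: no loss here
            corrupted.append(tok)
    return corrupted, labels

# Example requirements-style sentence (invented).
tokens = "the system shall encrypt all user data at rest".split()
masked, labels = mask_for_mlm(tokens, rng=random.Random(0))
```

During retraining, the model is trained to predict the original token at every position where `labels` is not `None`; in practice this bookkeeping is handled by the training framework rather than written by hand.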

Fri 19 Aug

Displayed time zone: Hobart

19:00 - 20:10
Artificial Intelligence for RE (RE@Next! Papers / Research Papers) at Dibbler
Chair(s): Rifat Ara Shams CSIRO's Data61
Automatic Terminology Extraction and Ranking for Feature Modeling
Research Papers
Jianzhang Zhang (Alibaba Business School, Hangzhou Normal University), Sisi Chen (Alibaba Business School, Hangzhou Normal University, Hangzhou, China), Jinping Hua (Alibaba Business School, Hangzhou Normal University, Hangzhou, China), Nan Niu (University of Cincinnati), Chuang Liu (Alibaba Business School, Hangzhou Normal University, Hangzhou, China)
Done is better than perfect: Iterative Adaptation via Multi-grained Requirement Relaxation
RE@Next! Papers
Jialong Li (Waseda University, Japan), Kenji Tei (Waseda University)
Retraining a BERT Model for Transfer Learning in Requirements Engineering: A Preliminary Study
RE@Next! Papers
Muideen Ajagbe (The University of Manchester), Liping Zhao (University of Manchester)