On Effectiveness of Further Pre-training on BERT models for Story Point Estimation
CONTEXT: Recent studies on story point estimation have used deep learning-based language models. These language models were pre-trained on general corpora. However, language models further pre-trained on specific corpora might be more effective. OBJECTIVE: To examine the effectiveness of further pre-trained language models for the predictive performance of story point estimation. METHOD: Two types of further pre-trained language models, namely domain-specific and repository-specific models, were compared with off-the-shelf models and Deep-SE. The estimation performance was evaluated on data from 16 projects. RESULTS: The effectiveness of the domain-specific and repository-specific models was limited, though they outperformed the base models from which they were further pre-trained. CONCLUSION: The effect of further pre-training was small. Choosing large off-the-shelf models might be a better option.
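For readers unfamiliar with the technique the abstract refers to, the following is a minimal sketch of "further pre-training" (continued masked language modeling on a specific corpus) using the Hugging Face Transformers library. The checkpoint name, the corpus file issue_corpus.txt, and all hyperparameters are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch: further pre-training a BERT model with the masked language
# modeling (MLM) objective on a project-specific text corpus.
# All names and hyperparameters below are assumptions for illustration.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

checkpoint = "bert-base-uncased"  # assumed base model, not the paper's choice
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Hypothetical corpus: one issue title/description per line.
dataset = load_dataset("text", data_files={"train": "issue_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of input tokens for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-further-pretrained",
    per_device_train_batch_size=16,
    num_train_epochs=3,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

The resulting checkpoint can then be fine-tuned on the downstream story point estimation task, in the same way an off-the-shelf model would be.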
Fri 8 Dec (displayed time zone: Pacific Time, US & Canada)
14:00 - 15:30
14:00 (30m, Paper): The FormAI Dataset: Generative AI in Software Security Through the Lens of Formal Verification (PROMISE 2023)
Norbert Tihanyi (Technology Innovation Institute), Tamas Bisztray (University of Oslo), Ridhi Jain (Technology Innovation Institute (TII), Abu Dhabi, UAE), Mohamed Amine Ferrag (Technology Innovation Institute), Lucas C. Cordeiro (The University of Manchester, UK), Vasileios Mavroeidis (University of Oslo)
14:30 (30m, Paper): Comparing Word-based and AST-based Models for Design Pattern Recognition (PROMISE 2023)
Sivajeet Chand (Dept. of CSE, Chalmers | University of Gothenburg, Sweden), Sushant Kumar Pandey (Chalmers and University of Gothenburg), Jennifer Horkoff (Chalmers and the University of Gothenburg), Miroslaw Staron (University of Gothenburg), Miroslaw Ochodek (Poznan University of Technology), Darko Durisic (R&D, Volvo Cars, Gothenburg, Sweden)
15:00 (30m, Paper): On Effectiveness of Further Pre-training on BERT models for Story Point Estimation (PROMISE 2023)
Sousuke Amasaki (Okayama Prefectural University)