ICSE 2023
Sun 14 - Sat 20 May 2023 Melbourne, Australia
Sun 14 May 2023 11:30 - 12:00 at Meeting Room 110 - Session 2 Chair(s): Reyhaneh Jabbarvand

Although deep neural models substantially reduce the overhead of feature engineering, the features readily available in the inputs can significantly affect the models' training cost and performance. In this paper, we explore the impact of an unsupervised feature enrichment approach based on variable roles on the performance of neural models of code. The notion of variable roles (introduced in the works of Sajaniemi et al. [Refs. 1,2]) has been found to improve students' programming skills. In this paper, we investigate whether this notion can also improve the performance of neural models of code. To the best of our knowledge, this is the first work to investigate how Sajaniemi et al.'s concept of variable roles affects neural models of code. In particular, we enrich a source code dataset by annotating the role of each variable in the dataset's programs, and conduct a study on the impact of variable-role enrichment on training the Code2Seq model. In addition, we shed light on challenges and opportunities in feature enrichment for neural code intelligence models.
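To make the idea of variable-role enrichment concrete, here is a minimal, hypothetical sketch (not the paper's actual pipeline): it uses Python's `ast` module and simple heuristics to assign each variable a coarse Sajaniemi-style role, which could then be appended to the code as extra tokens before training a model such as Code2Seq. The role names and heuristics below are illustrative assumptions only.

```python
import ast

def infer_roles(source):
    """Hypothetical heuristic role tagger (illustrative, not from the paper).
    Maps variable names to coarse Sajaniemi-style roles:
      - 'fixed-value': assigned exactly once and never updated
      - 'stepper':     updated by a constant amount (e.g. i += 1)
      - 'gatherer':    accumulates non-constant values (e.g. total += x)
    """
    tree = ast.parse(source)
    assign_counts, roles = {}, {}
    for node in ast.walk(tree):
        if isinstance(node, ast.Assign):
            for target in node.targets:
                if isinstance(target, ast.Name):
                    assign_counts[target.id] = assign_counts.get(target.id, 0) + 1
        elif isinstance(node, ast.AugAssign) and isinstance(node.target, ast.Name):
            name = node.target.id
            assign_counts[name] = assign_counts.get(name, 0) + 1
            if isinstance(node.value, ast.Constant):
                roles[name] = "stepper"    # updated by a literal constant
            else:
                roles[name] = "gatherer"   # accumulates computed values
    for name, count in assign_counts.items():
        if name not in roles:
            roles[name] = "fixed-value" if count == 1 else "unknown"
    return roles

snippet = "n = 10\ni = 0\ntotal = 0\nwhile i < n:\n    total += i\n    i += 1\n"
print(infer_roles(snippet))
```

A real enrichment step would then serialize these role labels alongside the variable names (or rename variables to include their roles) so the model sees the role information during training.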

Sun 14 May

Displayed time zone: Hobart

11:00 - 12:30
Session 2 (InteNSE) at Meeting Room 110
Chair(s): Reyhaneh Jabbarvand University of Illinois at Urbana-Champaign
11:00
30m
Research paper
Study of Distractors in Neural Models of Code
InteNSE
Md Rafiqul Islam Rabin University of Houston, Aftab Hussain University of Houston, Sahil Suneja IBM Research, Amin Alipour University of Houston
Pre-print
11:30
30m
Research paper
A Study of Variable-Role-based Feature Enrichment in Neural Models of Code
InteNSE
Aftab Hussain University of Houston, Md Rafiqul Islam Rabin University of Houston, Bowen Xu North Carolina State University, David Lo Singapore Management University, Amin Alipour University of Houston
Pre-print
12:00
30m
Other
Half Day Wrap Up
InteNSE