A Self-Attentional Neural Architecture for Code Completion with Multi-Task Learning
Code completion, one of the most useful features in Integrated Development Environments (IDEs), can accelerate software development by suggesting libraries, APIs, and method names in real time. Recent studies have shown that statistical language models can improve the performance of code completion tools by learning from large-scale software repositories. However, these models suffer from three major drawbacks: a) the hierarchical structural information of programs is not fully utilized in the program representation; b) semantic relationships in programs can span long distances, and existing language models based on recurrent neural networks are insufficient to model such long-term dependencies; c) existing approaches perform a single task per model, which underuses information from related tasks. To address these challenges, in this paper we propose a self-attentional neural architecture for code completion with multi-task learning. To utilize the hierarchical structural information of programs, we present a novel method that incorporates the path from the predicting node to the root node. To capture long-term dependencies in the input programs, we adopt a network based on a self-attentional architecture as the base language model. To enable knowledge sharing between related tasks, we propose a Multi-Task Learning (MTL) framework that jointly learns two related code completion tasks. Experiments on three real-world datasets demonstrate the effectiveness of our model compared with state-of-the-art methods.
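To make the described architecture concrete, below is a minimal PyTorch sketch, not the authors' implementation, of the three ingredients the abstract names: a self-attentional encoder over the program's AST node sequence, an encoding of the path from the predicting node to the root, and two jointly trained task heads. The two assumed tasks (next node *type* and next node *value* prediction), and all module names, dimensions, and the use of a vanilla Transformer encoder (rather than whatever self-attentional variant the paper adopts) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MultiTaskCompletionModel(nn.Module):
    """Shared self-attentional encoder with two task-specific heads
    (hard parameter sharing), plus a GRU summary of the path from the
    predicting node up to the AST root. All names are hypothetical."""

    def __init__(self, n_types, n_values, d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.type_emb = nn.Embedding(n_types, d_model)
        self.value_emb = nn.Embedding(n_values, d_model)
        # Encodes the sequence of node types on the root path.
        self.path_gru = nn.GRU(d_model, d_model, batch_first=True)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.type_head = nn.Linear(d_model, n_types)    # next node type
        self.value_head = nn.Linear(d_model, n_values)  # next node value

    def forward(self, node_types, node_values, path_types):
        # node_types, node_values: (batch, seq); path_types: (batch, path_len)
        # (A real model would also add positional encodings; omitted here.)
        x = self.type_emb(node_types) + self.value_emb(node_values)
        seq_len = node_types.size(1)
        # Causal mask: each position may attend only to earlier nodes.
        mask = torch.triu(
            torch.ones(seq_len, seq_len, device=x.device), diagonal=1
        ).bool()
        h = self.encoder(x, mask=mask)
        # Summarize the root path and fuse it with the last hidden state.
        _, path_h = self.path_gru(self.type_emb(path_types))
        fused = h[:, -1, :] + path_h.squeeze(0)
        return self.type_head(fused), self.value_head(fused)

# Joint objective: sum (or weight) the per-task cross-entropy losses.
model = MultiTaskCompletionModel(n_types=200, n_values=50000)
type_logits, value_logits = model(
    torch.randint(200, (8, 64)),    # node type ids
    torch.randint(50000, (8, 64)),  # node value ids
    torch.randint(200, (8, 10)),    # type ids on the path to the root
)
```

Sharing one encoder across both prediction tasks is what lets signal from one task (e.g., type prediction) regularize the other, which is the motivation the abstract gives for the MTL framework.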
Tue 14 Jul (displayed time zone: UTC, Coordinated Universal Time)

Session: 08:30 - 09:30

08:30 (12m) Paper | A Self-Attentional Neural Architecture for Code Completion with Multi-Task Learning (Research) | Fang Liu (Peking University), Ge Li (Peking University), Bolin Wei (Peking University), Xin Xia (Monash University), Zhiyi Fu (Peking University), Zhi Jin (Peking University) | Pre-print | Media Attached

08:42 (12m) Paper | Knowledge Transfer in Modern Code Review (Research) | Maria Caulo (University of Basilicata), Bin Lin (Università della Svizzera italiana (USI)), Gabriele Bavota (Università della Svizzera italiana), Giuseppe Scanniello (University of Basilicata), Michele Lanza (Università della Svizzera italiana (USI)) | Pre-print | Media Attached

08:54 (12m) Paper | How are Deep Learning Models Similar? An Empirical Study on Clone Analysis of Deep Learning Software (Research) | Xiongfei Wu (University of Science and Technology of China), Liangyu Qin (University of Science and Technology of China), Bing Yu (Kyushu University), Xiaofei Xie (Nanyang Technological University), Lei Ma (Kyushu University), Yinxing Xue, Yang Liu (Nanyang Technological University, Singapore), Jianjun Zhao (Kyushu University) | Media Attached

09:06 (12m) Paper | Unified Configuration Setting Access in Configuration Management Systems (Research) | Markus Raab (Vienna University of Technology, Austria), Bernhard Denner (Thales), Stefan Hanenberg (University of Duisburg-Essen), Jürgen Cito (MIT) | Media Attached

09:18 (12m) Paper | Inheritance software metrics on smart contracts (ERA) | Ashish Rajendra Sai (University of Limerick), Conor Holmes (University of Limerick), Jim Buckley (Lero - The Irish Software Research Centre and University of Limerick), Andrew LeGear (Horizon Globex) | Media Attached