ICPC 2022
Mon 16 - Tue 17 May 2022
co-located with ICSE 2022
Sun 15 May 2022 21:37 - 21:44 at ICPC room - Session 1: Summarization Chair(s): Haipeng Cai

Code summarization with deep learning has been widely studied in recent years. Current deep learning models for code summarization generally follow the principle of neural machine translation and adopt the encoder-decoder framework, where the encoder learns semantic representations from source code and the decoder transforms the learned representations into human-readable text that describes the functionality of code snippets. Although they achieve new state-of-the-art performance, we notice that current models often either generate less fluent summaries or fail to capture the core functionality, since they usually focus on a single type of code representation. As such, we propose GypSum, a new deep learning model that learns hybrid representations using graph attention neural networks and a pre-trained programming and natural language model. We introduce particular edges related to the control flow of a code snippet into the abstract syntax tree for graph construction, and design two encoders to learn from the graph and the token sequence of source code, respectively. We modify the encoder-decoder sublayer in the Transformer's decoder to fuse the representations and propose a dual-copy mechanism to facilitate summary generation. Experimental results demonstrate the superior performance of GypSum over existing code summarization models.
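The core idea of fusing two encoder outputs in the decoder's cross-attention can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the function names, dimensions, and the fixed blending gate are our own assumptions; GypSum's actual fusion happens inside a modified Transformer encoder-decoder sublayer with learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, memory):
    # scaled dot-product attention of one decoder state over an encoder memory
    # query: (d,), memory: (n, d) -> context (d,), weights (n,)
    scores = memory @ query / np.sqrt(len(query))
    weights = softmax(scores)
    return weights @ memory, weights

rng = np.random.default_rng(0)
d = 8
graph_repr = rng.normal(size=(5, d))   # hypothetical graph-encoder outputs (AST + control-flow edges)
token_repr = rng.normal(size=(7, d))   # hypothetical token-sequence-encoder outputs
dec_state = rng.normal(size=d)         # one decoder hidden state

# attend separately over each encoder's memory
ctx_g, w_g = cross_attention(dec_state, graph_repr)
ctx_t, w_t = cross_attention(dec_state, token_repr)

# blend the two contexts; in the real model the gate would be learned,
# here it is fixed purely for illustration
gate = 0.5
fused = gate * ctx_g + (1 - gate) * ctx_t
```

The two attention weight vectors (`w_g`, `w_t`) are also what a dual-copy mechanism would draw on, letting the decoder copy tokens from either the graph or the token sequence when generating the summary.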

Sun 15 May

Displayed time zone: Eastern Time (US & Canada)

21:30 - 22:20
Session 1: Summarization (Research) at ICPC room
Chair(s): Haipeng Cai Washington State University, USA
21:30
7m
Talk
PTM4Tag: Sharpening Tag Recommendation of Stack Overflow with Pre-trained Models
Research
Junda He Singapore Management University, Bowen Xu Singapore Management University, Zhou Yang Singapore Management University, DongGyun Han Singapore Management University, Chengran Yang Singapore Management University, David Lo Singapore Management University
Media Attached
21:37
7m
Talk
GypSum: Learning Hybrid Representations for Code Summarization
Research
Yu Wang School of Data Science and Engineering, East China Normal University, Yu Dong School of Data Science and Engineering, East China Normal University, Xuesong Lu School of Data Science and Engineering, East China Normal University, Aoying Zhou East China Normal University
DOI Pre-print Media Attached
21:44
7m
Talk
M2TS: Multi-Scale Multi-Modal Approach Based on Transformer for Source Code Summarization
Research
Yuexiu Gao Shandong Normal University, Chen Lyu Shandong Normal University
Media Attached
21:51
7m
Talk
Semantic Similarity Metrics for Evaluating Source Code Summarization
Research
Sakib Haque University of Notre Dame, Zachary Eberhart University of Notre Dame, Aakash Bansal University of Notre Dame, Collin McMillan University of Notre Dame
Media Attached
21:58
7m
Talk
LAMNER: Code Comment Generation Using Character Language Model and Named Entity Recognition
Research
Rishab Sharma University of British Columbia, Fuxiang Chen University of British Columbia, Fatemeh Hendijani Fard University of British Columbia
Pre-print Media Attached
22:05
15m
Live Q&A
Q&A-Paper Session 1
Research

