What You Need is What You Get: Theory of Mind for an LLM-Based Code Understanding Assistant
NIER Paper
A growing number of tools use Large Language Models (LLMs) to support developers' code understanding. However, developers still face several barriers to using such tools, including challenges in describing their intent in natural language, interpreting the tool's output, and refining an effective prompt to obtain useful information. In this study, we designed an LLM-based conversational assistant that personalizes its interaction based on the user's inferred mental state (e.g., background knowledge and experience). We evaluated the approach in a within-subject study with fourteen novices to capture their perceptions and preferences. Our results provide insights for researchers and tool builders who want to create or improve LLM-based conversational assistants that support novices in code understanding.
Wed 9 Oct (displayed time zone: Arizona)
10:30 - 12:00 | Session 1: Code Understanding and Optimization
Research Track / New Ideas and Emerging Results Track, at Abineau
Chair(s): Rosalia Tufano (Università della Svizzera Italiana)

10:30 (15m) | Optimizing Decompiler Output by Eliminating Redundant Data Flow in Self-Recursive Inlining
Research Track Paper
Runze Zhang; Ying Cao (Institute of Information Engineering at Chinese Academy of Sciences; University of Chinese Academy of Sciences); Ruigang Liang (Institute of Information Engineering at Chinese Academy of Sciences; University of Chinese Academy of Sciences); Peiwei Hu; Kai Chen (Institute of Information Engineering at Chinese Academy of Sciences; University of Chinese Academy of Sciences)

10:45 (15m) | Compilation of Commit Changes within Java Source Code Repositories
Research Track Paper · Open Research Object · Pre-print
Stefan Schott (Heinz Nixdorf Institut, Paderborn University); Wolfram Fischer (SAP Security Research); Serena Elisa Ponta (SAP Security Research); Jonas Klauke (Heinz Nixdorf Institut, Paderborn University); Eric Bodden

11:00 (15m) | Understanding Code Change with Micro-Changes
Research Track Paper · DOI · Pre-print · Media Attached
Lei Chen (Tokyo Institute of Technology); Michele Lanza (Software Institute - USI, Lugano); Shinpei Hayashi (Tokyo Institute of Technology)

11:15 (10m) | What You Need is What You Get: Theory of Mind for an LLM-Based Code Understanding Assistant
NIER Paper, New Ideas and Emerging Results Track · Pre-print

11:25 (15m) | Decomposing God Header File via Multi-View Graph Clustering
Research Track Paper · Pre-print

11:40 (10m) | How Far Have We Gone in Binary Code Understanding Using Large Language Models
Research Track Paper · Pre-print
Xiuwei Shang (University of Science and Technology of China); Shaoyin Cheng (University of Science and Technology of China); Guoqiang Chen (University of Science and Technology of China); Yanming Zhang; Li Hu; Xiao Yu; Gangyang Li; Weiming Zhang (University of Science and Technology of China); Nenghai Yu