ICSE 2024
Fri 12 - Sun 21 April 2024, Lisbon, Portugal
Wed 17 Apr 2024 16:30 - 16:45 at Eugénio de Andrade - Testing: various bug types 1 Chair(s): June Sallou

Testing database management systems (DBMSs) is a complex task. The vagueness and intricacy of the SQL specification make it difficult to model the semantics of queries, which in turn makes it hard to test the correctness and performance of DBMSs. Traditional approaches, such as metamorphic testing, require precise modeling of the SQL specification to create different inputs with equivalent semantics, a process that is labor-intensive and error-prone. To address this, we propose Mozi, a framework that finds DBMS bugs via configuration-based equivalent transformation. The key idea behind Mozi is to compare the results of equivalent DBMSs under different configurations, rather than the results of semantically equivalent queries. The framework analyzes the query plan, changes configurations to transform the DBMS into an equivalent one, and re-executes the query, comparing the outcomes with various test oracles. For example, a difference in query results indicates a correctness bug, while a faster execution time on the DBMS with optimizations turned off suggests a performance bug.
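The workflow above can be illustrated with a small differential-testing sketch. This is not the authors' implementation: it uses SQLite and its automatic_index pragma as stand-ins for the configuration knobs Mozi actually flips in MySQL, MariaDB, ClickHouse, and PostgreSQL, and it omits the query-plan analysis that guides which knobs to change. The query, schema, and timing threshold are illustrative assumptions.

```python
# Minimal sketch of configuration-based equivalent transformation:
# run the same query on a baseline instance and on an "equivalent" instance
# with one optimization turned off, then compare results and timings.
import sqlite3
import time

QUERY = "SELECT a.x, b.y FROM a JOIN b ON a.x = b.x ORDER BY a.x, b.y"

def make_instance(disable_optimization: bool) -> sqlite3.Connection:
    """Build a fresh in-memory instance; optionally toggle one optimization."""
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE a(x INTEGER);
        CREATE TABLE b(x INTEGER, y INTEGER);
    """)
    conn.executemany("INSERT INTO a VALUES (?)", [(i,) for i in range(5000)])
    conn.executemany("INSERT INTO b VALUES (?, ?)", [(i, 2 * i) for i in range(5000)])
    if disable_optimization:
        # Disabling automatic indexing must not change query results,
        # only (possibly) the query plan and execution time.
        conn.execute("PRAGMA automatic_index = OFF")
    return conn

def run(conn: sqlite3.Connection):
    start = time.perf_counter()
    rows = conn.execute(QUERY).fetchall()
    return rows, time.perf_counter() - start

baseline_rows, baseline_time = run(make_instance(disable_optimization=False))
variant_rows, variant_time = run(make_instance(disable_optimization=True))

# Correctness oracle: equivalent configurations must return identical results.
assert baseline_rows == variant_rows, "potential correctness bug"

# Performance oracle (simplified): the optimized instance should not be
# dramatically slower than the one with the optimization turned off.
if baseline_time > 5 * variant_time:
    print("potential performance bug: optimization slows the query down")
print(f"baseline {baseline_time:.4f}s vs optimization-off {variant_time:.4f}s")
```

Any row mismatch between the two equivalent instances points to a correctness bug, while a large timing gap in the wrong direction points to a missed or harmful optimization.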

We demonstrate the effectiveness of Mozi by evaluating it on four widely used DBMSs: MySQL, MariaDB, ClickHouse, and PostgreSQL. In continuous testing, Mozi found a total of 101 previously unknown bugs across the four DBMSs, including 49 correctness bugs and 52 performance bugs. Among them, 90 have been confirmed and 57 have been fixed. In addition, Mozi can be extended to other DBMS fuzzers to test for various types of bugs. With Mozi, testing DBMSs becomes simpler and more effective, potentially saving the time and effort that would otherwise be spent precisely modeling SQL specifications for testing.

Wed 17 Apr

Displayed time zone: Lisbon

16:00 - 17:30
Testing: various bug types 1 (Research Track / Software Engineering in Practice) at Eugénio de Andrade
Chair(s): June Sallou Delft University of Technology
16:00
15m
Talk
CERT: Finding Performance Issues in Database Systems Through the Lens of Cardinality Estimation
Research Track
Jinsheng Ba National University of Singapore, Manuel Rigger National University of Singapore
Pre-print
16:15
15m
Talk
Optimistic Prediction of Synchronization-Reversal Data Races
Research Track
Zheng Shi National University of Singapore, Umang Mathur National University of Singapore, Andreas Pavlogiannis Aarhus University
16:30
15m
Talk
Mozi: Discovering DBMS Bugs via Configuration-Based Equivalent Transformation
Research Track
Jie Liang, Zhiyong Wu Tsinghua University, China, Jingzhou Fu School of Software, Tsinghua University, Mingzhe Wang Tsinghua University, Chengnian Sun University of Waterloo, Yu Jiang Tsinghua University
16:45
15m
Talk
FlakeSync: Automatically Repairing Async Flaky Tests
Research Track
Shanto Rahman University of Texas at Austin, August Shi The University of Texas at Austin
17:00
15m
Talk
Testing the Limits: Unusual Text Inputs Generation for Mobile App Crash Detection with Large Language Model
Research Track
Zhe Liu Institute of Software, Chinese Academy of Sciences, Chunyang Chen Technical University of Munich (TUM), Junjie Wang Institute of Software, Chinese Academy of Sciences, Mengzhuo Chen Institute of Software, Chinese Academy of Sciences, Boyu Wu University of Chinese Academy of Sciences, Beijing, China, Zhilin Tian Pennsylvania State University, Yuekai Huang Institute of Software, Chinese Academy of Sciences, Jun Hu Institute of Software, Chinese Academy of Sciences, Qing Wang Institute of Software, Chinese Academy of Sciences
17:15
15m
Talk
AutoConsis: Automatic GUI-driven Data Inconsistency Detection of Mobile Apps
Software Engineering in Practice
Yongxiang Hu Fudan University, Hailiang Jin Meituan Inc., Xuan Wang Fudan University, Jiazhen Gu The Chinese University of Hong Kong, Shiyu Guo Meituan, Chaoyi Chen Meituan, Xin Wang Fudan University, Yangfan Zhou Fudan University