Automating Performance Testing in CI/CD - Tools Evaluation
Performance testing is a critical component of the modern software development lifecycle: it ensures application stability under typical workloads and resilience under anomalous conditions. Because it is time-consuming, performance testing is increasingly automated within DevOps CI/CD pipelines to accelerate software delivery and reduce testing delays. While prior research has examined the integration of performance testing into CI/CD, the effect of the testing tools themselves on pipeline performance remains largely unexplored.
In this study, we evaluate the impact of five widely used open-source performance testing tools (Apache JMeter, Grafana k6, Gatling, Locust, and Artillery) when integrated into a Jenkins-based CI/CD pipeline for a Spring Boot Java application. We designed and executed automated in-pipeline tests using three common load testing methodologies: representative load, maximum load, and spike load. During these tests, we measured both system-level metrics (CPU and RAM usage) and business-level metrics (response time).
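The abstract does not specify the exact traffic shapes used, but the three methodologies it names are conventionally distinguished by how the number of virtual users evolves over time. A minimal, illustrative Python sketch (function names and parameters are hypothetical, not taken from the study):

```python
def representative_load(duration_s, users):
    """Representative load: a constant level of typical traffic."""
    return [users] * duration_s

def maximum_load(duration_s, peak_users):
    """Maximum load: ramp linearly from zero up to the expected peak."""
    return [round(peak_users * t / (duration_s - 1)) for t in range(duration_s)]

def spike_load(duration_s, base_users, spike_users, spike_start, spike_len):
    """Spike load: baseline traffic with a sudden short burst, then recovery."""
    return [spike_users if spike_start <= t < spike_start + spike_len else base_users
            for t in range(duration_s)]
```

Each function returns the target virtual-user count per second; a load tool such as k6 or Locust would be configured to follow a schedule of this shape.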
To assess the significance of observed differences, we applied two non-parametric statistical methods: the Kruskal-Wallis test and the Mann-Whitney U test with Bonferroni correction. Our analysis confirms that the differences in performance metrics across tools are statistically significant (p < 0.05). Among the tested tools, Grafana k6 consistently demonstrated the highest resource efficiency across all testing scenarios.
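The statistical procedure can be sketched with stdlib-only Python. This is a simplified illustration, not the study's actual analysis (which presumably used a statistics package): the Mann-Whitney p-value uses the normal approximation without tie correction of the variance, and the Kruskal-Wallis result is compared against the tabulated chi-square critical value for df = 4 (five tools) rather than an exact p-value.

```python
import math
from collections import Counter

def ranks(values):
    """Assign 1-based average ranks, resolving ties by averaging."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic with tie correction."""
    data = [x for g in groups for x in g]
    n = len(data)
    r = ranks(data)
    h, idx = 0.0, 0
    for g in groups:
        h += sum(r[idx:idx + len(g)]) ** 2 / len(g)
        idx += len(g)
    h = 12.0 / (n * (n + 1)) * h - 3 * (n + 1)
    t = sum(c ** 3 - c for c in Counter(data).values())
    correction = 1 - t / (n ** 3 - n)
    return h / correction if correction else h

def mann_whitney_u_p(a, b):
    """Two-sided Mann-Whitney U p-value, normal approximation."""
    n1, n2 = len(a), len(b)
    r = ranks(list(a) + list(b))
    u1 = sum(r[:n1]) - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu + 0.5) / sigma  # continuity correction toward the mean
    return min(1.0, 2 * 0.5 * (1 + math.erf(z / math.sqrt(2))))

# With 5 tools there are 10 pairwise comparisons; under Bonferroni each
# is tested at alpha = 0.05 / 10. Reject the omnibus null if H exceeds
# the chi-square critical value for df = 4 at alpha = 0.05 (~9.488).
BONFERRONI_ALPHA = 0.05 / 10
KW_CRITICAL_DF4 = 9.488
```

In the pipeline context, each group would hold one tool's samples of a given metric (e.g., CPU usage under spike load); the omnibus Kruskal-Wallis test is run first, then pairwise U tests at the Bonferroni-adjusted threshold.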
This work contributes (i) a comparative analysis of performance testing tools in a CI/CD context, (ii) an automated pipeline framework for load testing with real-time monitoring, and (iii) practical recommendations for selecting the most appropriate tool based on efficiency and usability.
Conference program: Thu 18 Sep (time zone: Athens)
11:00 - 12:30 | Session: Frameworks and Test Automation (General Track), Atrium C. Chair: Petra van den Bos (University of Twente, The Netherlands)
12:00 | 30-minute talk: Automating Performance Testing in CI/CD - Tools Evaluation (General Track)