AST 2023
Mon 15 - Tue 16 May 2023 Melbourne, Australia
co-located with ICSE 2023
Mon 15 May 2023 14:30 - 14:52 at Meeting Room 107 - Metrics and Benchmarks

The rising popularity of source-code management systems, in combination with Continuous Integration and Continuous Delivery processes, has contributed to the adoption of agile software development with short release and feedback cycles between software producers and their customers. DevOps platforms streamline and enhance automation around source-code management systems by providing a uniform interface for managing all aspects of the software development lifecycle, from development to deployment, and by integrating and orchestrating tools that automate software development processes such as bug detection, security testing, and dependency scanning. Applying changes to the DevOps platform or to one of the integrated tools without data regarding their real-world impact increases the risk of having to remove or revert the change, which could lead to service disruption or loss of confidence in the platform if it does not perform as expected. In addition, integrating alpha or beta features, which may not meet the robustness of a finalised feature, may pose security or stability risks to the entire platform. Hence, short release cycles require testing and benchmarking approaches that make it possible to prototype, test, and benchmark ideas quickly and at scale, supporting Data-Driven Decision Making with respect to the features that are about to be integrated into the platform. In this paper, we present a scalable testing and benchmarking approach called SourceWarp that is targeted towards DevOps platforms and supports both testing and benchmarking in a cost-effective and reproducible manner. We present a real-world industrial case study in which we successfully applied SourceWarp to test and benchmark a newly developed feature at GitLab; the feature has since been integrated into the product. In the case study we demonstrate that SourceWarp is scalable and highly effective in supporting agile Data-Driven Decision Making by automating the testing and benchmarking of proof-of-concept ideas and tools in the DevOps context.
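To illustrate the kind of data-driven comparison the abstract refers to, the sketch below is a purely hypothetical Python harness, not the SourceWarp implementation described in the paper: it runs a baseline and a candidate version of an analysis tool over a corpus of recorded repositories and reports the difference in findings and runtime. The scanner command, its --version flag, the JSON output format, and the corpus/ directory layout are all assumptions made only for illustration.

# Hypothetical illustration (not the SourceWarp implementation): benchmark a
# candidate version of a CI/CD analysis tool against its current baseline over
# a corpus of recorded repositories, so the decision to integrate the change
# can be backed by measurements rather than expectations.
import subprocess
import json
import time
from pathlib import Path

CORPUS = Path("corpus")  # directory of checked-out repositories (assumed layout)

def run_tool(tool_cmd: list[str], repo: Path) -> tuple[set[str], float]:
    """Run an analysis tool on one repository; return its findings and runtime."""
    start = time.monotonic()
    out = subprocess.run(tool_cmd + [str(repo)], capture_output=True, text=True)
    elapsed = time.monotonic() - start
    # Assumes the tool emits a JSON list of findings, each with an "id" field.
    findings = {f["id"] for f in json.loads(out.stdout or "[]")}
    return findings, elapsed

def benchmark(baseline: list[str], candidate: list[str]) -> None:
    """Compare baseline and candidate tool versions on every repository in the corpus."""
    for repo in sorted(CORPUS.iterdir()):
        old, t_old = run_tool(baseline, repo)
        new, t_new = run_tool(candidate, repo)
        print(f"{repo.name}: +{len(new - old)} new findings, "
              f"-{len(old - new)} lost findings, "
              f"runtime {t_old:.1f}s -> {t_new:.1f}s")

if __name__ == "__main__":
    # "scanner" and its flags are placeholders for whatever tool is under evaluation.
    benchmark(["scanner", "--version=stable"], ["scanner", "--version=beta"])

Aggregating such per-repository deltas over a large, reproducible corpus is one way to obtain the kind of evidence the abstract argues for before integrating a change into the platform.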

Mon 15 May

Displayed time zone: Hobart

13:45 - 15:15
Metrics and Benchmarks (AST 2023) at Meeting Room 107
13:45
22m
Talk
AutoMetric: Towards Measuring Open-Source Software Quality Metrics Automatically
AST 2023
Taejun Lee Korea University, Heewon Park Korea University, Heejo Lee Korea University
14:07
22m
Talk
Learning to Learn to Predict Performance Regressions in Production at Meta
AST 2023
Moritz Beller Meta Platforms, Inc., USA, Hongyu Li Liquido, Vivek Nair Meta Platforms, Inc., Vijayaraghavan Murali Meta Platforms, Inc., Imad Ahmad Meta Platforms, Inc., Jürgen Cito TU Wien, Drew Carlson Ex-Meta Platforms, Inc., Gareth Ari Aye Meta Platforms, Inc., Wes Dyer Meta Platforms, Inc.
Pre-print
14:30
22m
Talk
SourceWarp: A scalable, SCM-driven testing and benchmarking approach to support data-driven and agile decision making for CI/CD tools and DevOps platforms
AST 2023
Julian Thome GitLab Inc., James Johnson --, Isaac Dawson GitLab Inc., Dinesh Bolkensteyn GitLab Inc., Michael Henriksen GitLab Inc., Mark Art GitLab Inc.
14:52
22m
Talk
Structural Test Input Generation for 3-Address Code Coverage Using Path-Merged Symbolic Execution
AST 2023
Soha Hussein University of Minnesota, USA / Ain Shams University, Egypt, Stephen McCamant University of Minnesota, USA, Elena Sherman Boise State University, Vaibhav Sharma Amazon, Michael Whalen Amazon Web Services and the University of Minnesota