MDEPT: Microservices Design Evaluator and Performance Tester
In microservices-based systems, architects struggle to reason about the impact of their design decisions on performance before implementing them. Making adequate architectural decisions to meet system quality requirements is a challenging task. While anti-pattern definitions help architects avoid inadequate design decisions, they are context-dependent: a design that is an anti-pattern in one context can be an optimal trade-off in another. Static analysis of a software design can identify constructs that conform to anti-patterns, but it cannot quantify the extent to which these anti-patterns would affect system performance. Ideally, architects should be able to predict the dynamic behaviour of a system before it is implemented, so that they can decide whether a design is adequate for the system's requirements. However, existing approaches either cannot achieve this because they analyse the design only statically, or they require complex and laborious modelling and simulation. To address this challenge, we previously introduced a conceptual solution that enables rapid evaluation of high-level architectural models by combining static and dynamic analysis. In this paper, we build on that work and introduce the Microservices Design Evaluator and Performance Tester (MDEPT) approach. Specifically, we formalize modelling specifications for microservices systems, introduce a fully functional toolchain for our approach, and present the evaluation results. Our approach enables architects to quickly sketch a design, execute it, and measure its performance, and then to modify the design and re-evaluate it with little effort. Architects can thus assess the impact of their architectural decisions on system performance before the system is actually implemented.
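The abstract does not detail MDEPT's modelling specification, so the following is only an illustrative sketch under assumed semantics: a hypothetical Python model of a small microservices design (the service, operation, and call-dependency names are invented) together with a naive latency estimate for one request path, to give a flavour of the kind of rapid, pre-implementation what-if evaluation the approach targets.

```python
# Hypothetical sketch, not the paper's actual specification format or toolchain.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class Operation:
    name: str
    processing_ms: float  # assumed service time for this operation
    calls: List[Tuple[str, str]] = field(default_factory=list)  # downstream (service, operation) pairs


@dataclass
class Service:
    name: str
    operations: Dict[str, Operation] = field(default_factory=dict)

    def add(self, op: Operation) -> "Service":
        self.operations[op.name] = op
        return self


def estimate_latency(services: Dict[str, Service], service: str, operation: str,
                     network_ms: float = 2.0) -> float:
    """Naively sum processing and network delays along a synchronous call chain."""
    op = services[service].operations[operation]
    total = op.processing_ms
    for callee_service, callee_op in op.calls:
        total += network_ms + estimate_latency(services, callee_service, callee_op, network_ms)
    return total


# Example design: an 'orders' service calling 'inventory' and 'payment' synchronously.
inventory = Service("inventory").add(Operation("reserve", 8.0))
payment = Service("payment").add(Operation("charge", 15.0))
orders = Service("orders").add(
    Operation("place_order", 5.0, calls=[("inventory", "reserve"), ("payment", "charge")]))

services = {s.name: s for s in (orders, inventory, payment)}
print(estimate_latency(services, "orders", "place_order"))  # 5 + (2 + 8) + (2 + 15) = 32.0 ms
```

Changing a processing time or a call dependency and re-running the estimate illustrates, in miniature, the design-modify-re-evaluate loop the abstract describes; MDEPT itself executes the modelled design and measures its performance rather than computing a closed-form estimate.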