ROSE: Transformer-Based Refactoring Recommendation for Architectural Smells
Architectural smells, design flaws such as God Class, Cyclic Dependency, and Hub-like Dependency, erode maintainability and often impair runtime behaviour. Existing detectors flag these issues but rarely suggest how to remove them. We present ROSE, a recommender system that turns smell reports into concrete refactoring advice by leveraging pre-trained code transformers. We frame remediation as a three-way classification task (Extract Method, Move Class, Pull Up Method) and fine-tune CodeBERT and CodeT5 on 2.1 million refactoring instances mined with RefactoringMiner from 11,149 open-source Java projects. Under ten-fold cross-validation, CodeT5 achieves 96.9% accuracy and a macro-F1 of 0.95, outperforming CodeBERT by 10 percentage points and all classical baselines reported in the original dataset study. Confusion-matrix analysis shows that both models separate Pull Up Method well, whereas Extract Method remains challenging because it overlaps with structurally similar changes. These findings provide the first empirical evidence that transformers can close the gap between architectural-smell detection and actionable repair. The study illustrates both the promise and the current limits of data-driven, architecture-level refactoring, laying the groundwork for richer recommender systems that cover a wider range of smells and languages. We release our code, trained checkpoints, and the balanced dataset under an open licence to encourage replication.
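For readers unfamiliar with the reported metric, macro-F1 is the unweighted mean of per-class F1 scores, so weak performance on any one of the three refactoring classes lowers it equally. The following is a minimal illustrative sketch, not code from ROSE; the toy predictions are invented for demonstration and the label names simply mirror the three classes above:

```python
LABELS = ["Extract Method", "Move Class", "Pull Up Method"]

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 over the three refactoring labels."""
    f1s = []
    for label in LABELS:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        # F1 is the harmonic mean of precision and recall.
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)

# Illustrative toy predictions (not data from the study): one
# Extract Method instance is misclassified as Move Class.
y_true = ["Extract Method", "Move Class", "Pull Up Method", "Extract Method"]
y_pred = ["Extract Method", "Move Class", "Pull Up Method", "Move Class"]
print(round(macro_f1(y_true, y_pred), 3))  # prints 0.778
```

In practice a library routine such as scikit-learn's `f1_score(..., average="macro")` computes the same quantity; the hand-rolled version above just makes the per-class averaging explicit.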