[ROSE REPORT]: The artifact is being submitted as part of the ROSE initiative. The ROSE team reads papers looking for examples of reuse of tools as well as data sets, methodology, innovative statistical methods, etc. Also, for papers comparing one algorithm to another (e.g. in optimization, data mining, and theorem proving work), we search for "stepping stone" reuse; i.e. (a) some new paper has surveyed the related work to declare that some other algorithm from another paper is the prior state of the art; (b) the new paper then runs that prior method as a baseline; (c) results from the new algorithm are then compared against that baseline.
If the new paper reuses something from an older paper (e.g. code, methodology, etc.), we take that as a REPRODUCTION of the original paper's claim that the prior thing was useful for some task. If the new paper implements its own version of the older idea, then that is a REPLICATION of the claim that some prior method is useful for some task. As part of that work, we report here the following example of REPRODUCTION.
We report here the reproduction of XGBoost. In 2016, Chen et al. described XGBoost, a scalable tree boosting system for machine learning. They evaluated XGBoost on a wide range of problems and found that it consistently gives state-of-the-art results. In 2020, Zhao et al. proposed eWarn, an incident prediction tool for online service systems. In eWarn, they used XGBoost as the classification model.
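For illustration, the sketch below shows how XGBoost can serve as a binary classifier in the way eWarn uses it (predicting whether an incident will occur). This is a minimal sketch, not eWarn's actual code: the synthetic features, labels, and hyperparameter values here are placeholder assumptions, not details taken from either paper.

```python
# Minimal sketch (assumed setup, not eWarn's implementation): train an
# XGBoost classifier for a binary incident/no-incident prediction task.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Hypothetical feature matrix (e.g. features derived from alert data) and
# labels (1 = incident in the next time window, 0 = no incident).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Gradient-boosted trees as described by Chen et al. (2016); the
# hyperparameters below are illustrative defaults, not eWarn's settings.
model = XGBClassifier(n_estimators=100, max_depth=6, learning_rate=0.1)
model.fit(X_train, y_train)

print("F1 score:", f1_score(y_test, model.predict(X_test)))
```

On real data, the random arrays above would be replaced by the features and labels of the prediction task at hand; the rest of the training and evaluation flow stays the same.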