Tseitin or not Tseitin? The Impact of CNF Transformations on Feature-Model Analyses
Feature modeling is widely used to systematically model the features of variant-rich software systems and their dependencies. By translating feature models into propositional formulas and analyzing them with solvers, a wide range of automated analyses becomes possible across all phases of the software development process. Most solvers only accept formulas in conjunctive normal form (CNF), so translated feature models often require an additional CNF transformation. However, it is unclear whether this transformation has a noticeable impact on subsequent analyses. With this artifact, we contribute a fully automated tool and a replicable dataset for comparing three CNF transformations (i.e., distributive, Tseitin, and Plaisted-Greenbaum) on a corpus of 22 real-world feature models.
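To illustrate the kind of transformation the artifact compares, the following is a minimal sketch of the Tseitin transformation (not the artifact's actual implementation; all function names and the formula representation are our own assumptions). Each non-atomic subformula receives a fresh auxiliary variable, yielding an equisatisfiable CNF whose size is linear in the size of the input formula, in contrast to the potentially exponential blowup of the distributive transformation.

```python
# Sketch of the Tseitin transformation for formulas over and/or/not.
# Formulas are nested tuples, e.g. ("or", ("and", ("var", "A"), ("var", "B")), ("var", "C")).
# Clauses are lists of integer literals in DIMACS style (negative = negated).
from itertools import count

_fresh = count(1)  # shared counter for input and auxiliary variable ids

def tseitin(formula, clauses, var_ids):
    """Return a literal representing `formula`; append its defining clauses."""
    op = formula[0]
    if op == "var":
        name = formula[1]
        if name not in var_ids:
            var_ids[name] = next(_fresh)
        return var_ids[name]
    if op == "not":
        return -tseitin(formula[1], clauses, var_ids)
    a = tseitin(formula[1], clauses, var_ids)
    b = tseitin(formula[2], clauses, var_ids)
    x = next(_fresh)  # fresh auxiliary variable for this subformula
    if op == "and":   # encode x <-> (a and b)
        clauses += [[-x, a], [-x, b], [x, -a, -b]]
    elif op == "or":  # encode x <-> (a or b)
        clauses += [[-x, a, b], [x, -a], [x, -b]]
    return x

def to_cnf(formula):
    """Transform a formula into an equisatisfiable list of CNF clauses."""
    clauses, var_ids = [], {}
    root = tseitin(formula, clauses, var_ids)
    clauses.append([root])  # assert the formula itself
    return clauses, var_ids
```

The Plaisted-Greenbaum variant the artifact also evaluates works similarly but, by tracking the polarity of each subformula, emits only one direction of each biconditional, producing fewer clauses while preserving equisatisfiability.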