
Combining Conflict-Driven Clause Learning and Chronological Backtracking for Propositional Model Counting

14 pages. Published: December 10, 2019

Abstract

In propositional model counting, also named #SAT, the search space needs to be explored exhaustively, in contrast to SAT, where the task is to determine whether a propositional formula is satisfiable. While state-of-the-art SAT solvers are based on non-chronological backtracking, it has also been shown that backtracking chronologically does not significantly degrade solver performance. Hence investigating the combination of chronological backtracking with conflict-driven clause learning (CDCL) for #SAT seems evident. We present a calculus for #SAT combining chronological backtracking with CDCL and provide a formal proof of its correctness.
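To make the contrast concrete: a SAT solver may stop at the first satisfying assignment, whereas a model counter must account for the entire search space. The small Python sketch below is not the calculus of the paper and uses no clause learning or backtracking strategy at all; it merely illustrates exhaustive counting by recursive splitting on a tiny CNF formula. The clause encoding and function names are our own, chosen only for illustration.

# Illustrative only: a naive exhaustive model counter by recursive splitting.
from typing import FrozenSet, List, Optional, Set

Clause = FrozenSet[int]   # a clause is a set of literals; -v stands for "not v"
Formula = List[Clause]    # a CNF formula is a list of clauses

def assign(formula: Formula, lit: int) -> Optional[Formula]:
    """Make `lit` true: drop satisfied clauses, strip the falsified literal.
    Returns None if an empty clause (a conflict) appears."""
    reduced: Formula = []
    for clause in formula:
        if lit in clause:
            continue                     # clause satisfied, drop it
        rest = clause - {-lit}           # remove the falsified literal
        if not rest:
            return None                  # conflict
        reduced.append(rest)
    return reduced

def count_models(formula: Optional[Formula], variables: Set[int]) -> int:
    """Count the assignments over `variables` satisfying `formula`
    by splitting on one variable and summing both branches."""
    if formula is None:
        return 0                         # this branch ended in a conflict
    if not formula:
        # All clauses satisfied: every completion of the remaining
        # unassigned variables is a model, so count them all.
        return 2 ** len(variables)
    var = next(iter(variables))          # assumes all variables of formula are in `variables`
    rest = variables - {var}
    return (count_models(assign(formula, var), rest) +
            count_models(assign(formula, -var), rest))

# (x1 or x2) and (not x1 or x3) over {x1, x2, x3} has exactly 4 models.
cnf = [frozenset({1, 2}), frozenset({-1, 3})]
print(count_models(cnf, {1, 2, 3}))      # prints 4

A SAT solver could return after finding the first of these four models; a #SAT procedure, such as the calculus presented in the paper, must visit and count all of them.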

Keyphrases: #SAT, chronological backtracking, conflict-driven clause learning, model counting, propositional calculus, rules, SAT

In: Diego Calvanese and Luca Iocchi (editors). GCAI 2019. Proceedings of the 5th Global Conference on Artificial Intelligence, EPiC Series in Computing, vol. 65, pages 113--126

BibTeX entry
@inproceedings{GCAI2019:Combining_Conflict_Driven_Clause_Learning,
  author    = {Sibylle M{\"o}hle and Armin Biere},
  title     = {Combining Conflict-Driven Clause Learning and Chronological Backtracking for Propositional Model Counting},
  booktitle = {GCAI 2019. Proceedings of the 5th Global Conference on Artificial Intelligence},
  editor    = {Diego Calvanese and Luca Iocchi},
  series    = {EPiC Series in Computing},
  volume    = {65},
  pages     = {113--126},
  year      = {2019},
  publisher = {EasyChair},
  bibsource = {EasyChair, https://easychair.org},
  issn      = {2398-7340},
  url       = {https://easychair.org/publications/paper/wBnB},
  doi       = {10.29007/vgg4}}