
SCIENTIA SINICA Informationis, Volume 49 , Issue 6 : 685-697(2019) https://doi.org/10.1360/N112017-00248

Partial maximum satisfiability problem method combined with structure characteristics for diagnostic problems

More info
  • Received: Jul 23, 2018
  • Accepted: Nov 1, 2018
  • Published: Jun 6, 2019

Abstract


Funded by

National Natural Science Foundation of China (Grants 61872159, 61672261, 61502199)



  • Figure 1

    Boolean circuit

  • Figure 2

    (Color online) Number of soft blocking variables for 258 test instances

  • Figure 3

    (Color online) Comparison of SCPMS algorithms with different stop mechanisms and different flip variables, and the solutions of soft blocking variables

  • Figure 4

    (Color online) Comparison of the DeciDist and SCPMS algorithms with different stop mechanisms and different numbers of flips

  • Table 1   Solution of the SCPMS algorithm
    Benchmark   #Inst.   DistUP #Win.   DeciDist #Win.   SCPMS #Win.   WPM3 #Win.
    b14         26       0              0                18            12
    b15         6        0              0                1             6
    b17         26       0              0                8             26
    b20         64       0              0                18            58
    b21         120      0              0                21            115
    b22         55       0              0                8             55
  • Algorithm 1 Decimation($F$)

    Input: a CNF formula $F$;
    Output: an assignment of the variables in $F$;
    while $\exists$ unassigned variables do
        if $\exists$ unit clause then
            if $\exists$ hard unit clause then
                Hard_unit_clause_Propagation();
            else if $\exists$ soft unit clause then
                Soft_unit_clause_Propagation();
            end if
        else ($\nexists$ unit clause)
            $v$ $\leftarrow$ a random unassigned variable;
            Assign($v$);
        end if
    end while
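
    The decimation procedure can be sketched as follows in Python. The DIMACS-style clause encoding (positive/negative integer literals), the function signature, and the helper names are assumptions made for illustration, not the authors' implementation.

    import random

    def decimation(hard_clauses, soft_clauses, num_vars):
        """Decimation-style initial assignment (sketch of Algorithm 1).
        A literal v > 0 means variable v is True; -v means it is False."""
        assignment = {}

        def value(lit):
            v = assignment.get(abs(lit))
            if v is None:
                return None
            return v if lit > 0 else not v

        def unit_literal(clauses):
            # A clause that is not yet satisfied and has exactly one
            # unassigned literal left forces that literal.
            for clause in clauses:
                if any(value(l) is True for l in clause):
                    continue
                unassigned = [l for l in clause if value(l) is None]
                if len(unassigned) == 1:
                    return unassigned[0]
            return None

        while len(assignment) < num_vars:
            lit = unit_literal(hard_clauses)          # hard unit propagation first
            if lit is None:
                lit = unit_literal(soft_clauses)      # then soft unit propagation
            if lit is not None:
                assignment[abs(lit)] = lit > 0
            else:                                     # no unit clause: pick randomly
                var = random.choice([v for v in range(1, num_vars + 1)
                                     if v not in assignment])
                assignment[var] = random.random() < 0.5
        return assignment

    For example, decimation([[1, 2], [-1]], [[2, 3]], 3) first propagates the hard unit clause [-1], then the now-unit hard clause [1, 2] (forcing variable 2 to True), and finally assigns variable 3 at random.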

  • Table 2   Solution of soft blocking variables
    Benchmark   #Inst.   Maximum   Minimum   Mean
    b14         26       33        20        25
    b15         6        37        28        30
    b17         26       40        21        38
    b20         64       39        22        32
    b21         120      39        21        27
    b22         55       40        22        33
  • Algorithm 2 DeciDist: a local search algorithm

    Input: PMS instance $F$, cutoff, wp, $t$, and sp;
    Output: a feasible assignment $\alpha$ of $F$, or “no feasible assignment found";
    $\alpha$ $\leftarrow$ Decimation($F$);
    cost$\ast$ $\leftarrow$ $+\infty$;
    while elapsed time $<$ cutoff do
        if $\nexists$ falsified hard clauses $\&$ cost($\alpha$) $<$ cost$\ast$ then
            $\alpha\ast$ $\leftarrow$ $\alpha$;
            cost$\ast$ $\leftarrow$ cost($\alpha$);
        end if
        if $H$ $\leftarrow$ $\{x \mid \mathrm{hscore}(x) > 0\}$ $\ne$ $\emptyset$ then
            $v$ $\leftarrow$ a variable in $H$ with the greatest hscore, breaking ties randomly;
        else if $S$ $\leftarrow$ $\{x \mid \mathrm{hscore}(x) = 0~\&~\mathrm{sscore}(x) > 0\}$ $\ne$ $\emptyset$ then
            $v$ $\leftarrow$ a variable in $S$ with the greatest sscore, breaking ties randomly;
        else
            update weights of hard clauses according to HPAWS;
            if $\exists$ falsified hard clauses then
                $c$ $\leftarrow$ a random falsified hard clause;
            else
                $c$ $\leftarrow$ a random falsified soft clause;
            end if
            with probability wp: $v$ $\leftarrow$ a random variable in $c$;
            otherwise: $v$ $\leftarrow$ a variable in $c$ with the greatest sscore;
        end if
        $\alpha$ $\leftarrow$ $\alpha$ with $v$ flipped;
    end while
    if cost$\ast$ $\le$ SUMWeight(soft clauses) then
        return (cost$\ast$, $\alpha\ast$);
    else
        return “no feasible assignment found";
    end if
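
    A minimal Python sketch of the DeciDist flip loop is given below, assuming the decimation() function from the previous sketch. Soft clauses are treated as unit-weight, hscore/sscore are recomputed from scratch at each use (a real solver maintains them incrementally), the HPAWS update is approximated by incrementing the weight of every falsified hard clause, and the parameters $t$ and sp from the pseudocode are not modelled; all of these are simplifications for illustration, not the paper's implementation.

    import random, time

    def deci_dist(hard, soft, num_vars, cutoff=5.0, wp=0.1, init=None):
        """Sketch of Algorithm 2: Dist-style local search seeded by decimation."""
        hard_w = {i: 1 for i in range(len(hard))}        # dynamic hard-clause weights
        alpha = dict(init) if init else decimation(hard, soft, num_vars)
        best, best_cost = None, float('inf')

        def sat(clause, a):
            return any((l > 0) == a[abs(l)] for l in clause)

        def falsified(clauses, a):
            return [i for i, c in enumerate(clauses) if not sat(c, a)]

        def hscore(x, a):                                # weighted hard-clause gain of flipping x
            b = dict(a); b[x] = not b[x]
            return (sum(hard_w[i] for i in falsified(hard, a))
                    - sum(hard_w[i] for i in falsified(hard, b)))

        def sscore(x, a):                                # soft-clause gain of flipping x
            b = dict(a); b[x] = not b[x]
            return len(falsified(soft, a)) - len(falsified(soft, b))

        start = time.time()
        while time.time() - start < cutoff:
            fh = falsified(hard, alpha)
            cost = len(falsified(soft, alpha))
            if not fh and cost < best_cost:              # record best feasible assignment
                best, best_cost = dict(alpha), cost
                if cost == 0:
                    break                                # every clause satisfied
            H = [x for x in alpha if hscore(x, alpha) > 0]
            if H:
                v = max(H, key=lambda x: hscore(x, alpha))
            else:
                S = [x for x in alpha if hscore(x, alpha) == 0 and sscore(x, alpha) > 0]
                if S:
                    v = max(S, key=lambda x: sscore(x, alpha))
                else:
                    for i in fh:                         # simple stand-in for HPAWS
                        hard_w[i] += 1
                    if fh:
                        c = hard[random.choice(fh)]
                    else:
                        c = soft[random.choice(falsified(soft, alpha))]
                    if random.random() < wp:
                        v = abs(random.choice(c))
                    else:
                        v = max((abs(l) for l in c), key=lambda x: sscore(x, alpha))
            alpha[v] = not alpha[v]                      # flip the chosen variable
        if best is not None:
            return best_cost, best
        return "no feasible assignment found"

    The optional init argument is not part of the pseudocode; it is added here so the SCPMS sketch at the end of this page can start the search from its own initial assignment.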

  • Algorithm 3 CSV($F$)

    Input: a CNF formula $F$;
    Output: a queue of chosen soft-clause variables (choosedSVQueue);
    while DPLL($F$) has a conflict do
        new CNF formula $F$ $\leftarrow$ chooseHClausedel($F$, DelHClause);
    end while
    Vars $\leftarrow$ DPLL($F$);
    choosedHVQueue $\leftarrow$ Compare(DelHClause, Vars);
    while choosedHVQueue is not empty do
        $v$ $\leftarrow$ choosedHVQueue.pop();
        sv $\leftarrow$ ChooseSClauseVar($v$);
        choosedSVQueue.push(sv);
    end while
    return choosedSVQueue;
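
    The control flow of CSV can be illustrated in Python as below. The pseudocode does not define chooseHClausedel, Compare, or ChooseSClauseVar, so the stand-ins used here (drop the first remaining hard clause whenever DPLL reports a conflict, keep model variables that occur in the deleted clauses, and collect the other variables of the soft clauses containing them) are assumptions for illustration only, as is the tiny recursive DPLL.

    def dpll(clauses, assignment=None):
        """Tiny recursive DPLL: returns a model dict or None on conflict."""
        assignment = dict(assignment or {})
        changed = True
        while changed:                                   # unit propagation
            changed = False
            for clause in clauses:
                if any((l > 0) == assignment[abs(l)] for l in clause
                       if abs(l) in assignment):
                    continue                             # clause already satisfied
                free = [l for l in clause if abs(l) not in assignment]
                if not free:
                    return None                          # conflict: clause falsified
                if len(free) == 1:
                    assignment[abs(free[0])] = free[0] > 0
                    changed = True
        free_vars = {abs(l) for c in clauses for l in c} - set(assignment)
        if not free_vars:
            return assignment
        v = next(iter(free_vars))
        for val in (True, False):                        # branch on an unassigned variable
            model = dpll(clauses, {**assignment, v: val})
            if model is not None:
                return model
        return None

    def csv(hard, soft):
        """Sketch of Algorithm 3: pick soft blocking variables from the hard
        clauses that had to be deleted to make the hard part satisfiable."""
        remaining, deleted = list(hard), []
        model = dpll(remaining)
        while model is None:                 # stand-in for chooseHClausedel:
            deleted.append(remaining.pop(0)) # drop the first remaining hard clause
            model = dpll(remaining)
        # Stand-in for Compare: model variables occurring in deleted hard clauses.
        hard_vars = [v for v in model if any(v in map(abs, c) for c in deleted)]
        sv_queue = []
        for v in hard_vars:                  # stand-in for ChooseSClauseVar
            for clause in soft:
                if v in map(abs, clause):
                    sv_queue.extend(abs(l) for l in clause if abs(l) != v)
        return sv_queue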

  • Algorithm 4 SCPMS($F$)

    Input: a CNF formula $F$;
    Output: an assignment of the variables;
    $Q$ $\leftarrow$ CSV($F$);
    while $Q$ is not empty do
        var $\leftarrow$ $Q$.pop();
        Flip(var);
    end while
    while $\exists$ an unpropagated hard variable Hvar do
        UnitPropagate(Hvar);
    end while
    while $\exists$ an unpropagated soft variable Svar do
        UnitPropagate(Svar);
    end while
    while $\exists$ an unassigned variable do
        assign it a random value from $\{0, 1\}$;
    end while
    LocalSearch();
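
    Putting the pieces together, a rough Python driver for the SCPMS flow might look like the sketch below, assuming the decimation(), csv(), and deci_dist() sketches above are in scope. Algorithm 4 does not say what value Flip(var) starts from, so here the queued soft blocking variables are flipped relative to a decimation assignment, and that assignment also stands in for the unit-propagation and random-completion steps; both choices are simplifications made for illustration, not the paper's implementation.

    def scpms(hard, soft, num_vars, cutoff=5.0):
        """Sketch of Algorithm 4: build an initial assignment guided by the
        structure-chosen soft blocking variables, then run the local search."""
        alpha = decimation(hard, soft, num_vars)   # covers unit propagation and
                                                   # random completion (simplified)
        for v in csv(hard, soft):                  # Q <- CSV(F); flip queued variables
            if v in alpha:
                alpha[v] = not alpha[v]
        # LocalSearch(): reuse the DeciDist sketch, starting from alpha.
        return deci_dist(hard, soft, num_vars, cutoff=cutoff, init=alpha)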