
$$\frac{\binom{m-1}{d_i-1}}{\binom{m}{d_i}} = \frac{d_i}{m}$$

It is important to notice that, for two different values y_{j1} and y_{j2} and for a same domain D_i, the events {y_{j1} ∈ D_i} and {y_{j2} ∈ D_i} are not independent, unlike in the Erdős–Rényi model. However, for two different domains D_{i1} and D_{i2} and for any pair of values y_{j1} and y_{j2} (possibly the same), the events {y_{j1} ∈ D_{i1}} and {y_{j2} ∈ D_{i2}} are independent.

Proposition 6. According to the FDS model, the number of allowed tuples of alldifferent(X) is expected to be:

$$E_{FDS}(|S_X|) = \frac{m!}{(m-n)! \cdot m^n} \cdot \prod_{i=1}^{n} d_i \qquad (9)$$

Proof. The proof is very similar to the proof of Proposition 5. ⊓⊔

From Example 1, we have n = m = 5 and the domain sizes are d_1 = 3, d_2 = 2, d_3 = 4, d_4 = 2 and d_5 = 3. We can expect (5!/5^5) · 3·2·4·2·3 ≈ 5.53 solutions.
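The expectation in Equation (9) can be checked numerically on this example; a minimal sketch in Python (the function name is ours):

```python
from math import factorial

def expected_fds(m, n, domain_sizes):
    """Expected number of allowed tuples of alldifferent(X) under the
    FDS model: m! / ((m - n)! * m^n) * prod(d_i), i.e. Equation (9)."""
    prod = 1
    for d in domain_sizes:
        prod *= d
    return factorial(m) / (factorial(m - n) * m ** n) * prod

# Example 1: n = m = 5, domain sizes d_1..d_5 = 3, 2, 4, 2, 3
print(round(expected_fds(5, 5, [3, 2, 4, 2, 3]), 2))  # → 5.53
```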

On this example, the expectation under the Erdős–Rényi model is more accurate.

4 Experimental analysis

We first present a qualitative analysis of the different estimators. Then, we adapt the Counting-Based search strategy presented in [7] so that it is guided by the estimators presented in Section 3 rather than only the upper bound from Section 2.

4.1 Qualitative analysis on alldifferent instances

For this qualitative analysis, we randomly and uniformly generated 10000 instances of alldifferent with at least one solution, with n = 10 variables and m = 10 values. We choose a parameter p at random and generate an instance according to the Erdős–Rényi model. For each of those instances, we computed the three estimators of the number of solutions and compared them to the real number of solutions. Figure 2 shows the percentage of instances for different ranges of relative gap between the value of the estimators and the real number of solutions. The relative gap is computed as Δ = |est − real| / real, where est is the value of the estimator and real is the true number of solutions.
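The gap Δ is straightforward to compute; a small helper (the example values are invented):

```python
def relative_gap(est, real):
    """Relative gap between an estimator and the true solution count."""
    return abs(est - real) / real

# An estimator predicting 12 solutions when there really are 10:
print(relative_gap(12, 10))  # → 0.2
```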

Fig. 2. Percentage of instances per relative gap for each estimator (x-axis: relative gap's range, from ≤0.05 to ≥5; y-axis: % of instances; series: PZQ, ER, FDS)

We notice that the third estimator, the expectation under the FDS model, seems more accurate: for about 40% of the instances, the relative gap is less than or equal to 0.05, and for less than 10% of the instances, the relative gap exceeds 1. The expectation under the Erdős–Rényi model is less accurate. As for the upper bound UB_PZQ, it is very far from being accurate.

These results can be explained by the fact that the two last estimators correspond to the expected number of solutions under the Erdős–Rényi model and the FDS model, whereas the first estimator is an upper bound. The FDS estimator is more accurate than the ER estimator, as it considers the distribution of domain sizes and not only the density of edges. For search strategies, however, the correlation between the estimator and the real number of solutions matters more than the accuracy of the estimator. In the next subsection, we compare the efficiency of Counting-Based Search for the three estimators.
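The point about correlation can be made concrete with a toy example (the counts below are invented for illustration): an estimator that overestimates every count by a constant factor has a relative gap of 2.0 on every instance, yet correlates perfectly with the true counts, and therefore ranks branching choices exactly as the true counts would.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

real = [10, 25, 40, 80]         # hypothetical true solution counts
biased = [3 * r for r in real]  # relative gap = 2.0 on every instance
print(round(pearson(real, biased), 6))  # → 1.0
```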

4.2 Estimators’ efficiency within Counting-Based Search

We have adapted the maxSD heuristic presented in [7], originally designed with the PZQ estimator, so that the solution densities are computed from the ER estimator and the FDS estimator. We first wanted to compare the efficiency of the three estimators by running the three versions of maxSD on two different problems: the Quasigroup Completion problem with holes and the Magic Square problem. Both can be expressed with one or several alldifferent constraints.
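As a rough outline of the adaptation (our simplified sketch, not the actual maxSD implementation, and without the propagation a real solver would run): for every unassigned variable x and value v, estimate the number of solutions of the subproblem where x = v, and branch on the pair maximizing the resulting solution density. Here the estimate is the FDS expectation of Equation (9), applied to the reduced domains:

```python
from math import factorial

def expected_fds(m, domains):
    """FDS expectation (Equation 9) applied to the current domains."""
    n = len(domains)
    prod = 1
    for d in domains:
        prod *= len(d)
    return factorial(m) / (factorial(m - n) * m ** n) * prod

def max_sd_branch(m, domains):
    """Pick the (variable, value) pair with the highest estimated
    solution density.  Assigning x_i = v shrinks D_i to {v} and,
    naively, removes v from every other domain."""
    total = expected_fds(m, domains)
    best, best_density = None, -1.0
    for i, dom in enumerate(domains):
        if len(dom) == 1:
            continue  # variable already assigned
        for v in sorted(dom):
            sub = [{v} if j == i else d - {v} for j, d in enumerate(domains)]
            if any(not d for d in sub):
                continue  # assignment empties another domain
            density = expected_fds(m, sub) / total
            if density > best_density:
                best, best_density = (i, v), density
    return best

# Domains with Example 1's sizes (3, 2, 4, 2, 3); the actual values of
# Example 1 are not given in this section, so these sets are invented:
domains = [{1, 2, 3}, {1, 2}, {2, 3, 4, 5}, {4, 5}, {3, 4, 5}]
print(max_sd_branch(5, domains))  # → (1, 1)
```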

8 G. Lo Bianco et al.

Some known hard instances of those problems have been generated by Pesant et al. Unfortunately, our implementation of the three versions of maxSD is still somewhat naive and takes several hours to run one of those instances. Therefore, we decided to randomly generate easier instances.

Concerning the Quasigroup Completion problem, we did not manage to generate easier non-trivial instances. The generated instances are solved (or proved unsatisfiable) during the first or second propagation stage, which is not interesting when comparing search strategies.

As for the Magic Square problem, we randomize the instances as follows: given n, the dimension of the problem, and c, the number of cells filled in the square at the beginning, we choose c cells uniformly at random in the square, which we fill uniformly at random among the A^c_{n²} = n²!/(n²−c)! possible arrangements.
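Our reading of this generation scheme, as a sketch (the function name and the seed parameter are ours):

```python
import random

def random_magic_square_instance(n, c, seed=None):
    """Pre-fill c cells of an n x n magic square uniformly at random.

    Draws c distinct cells and c distinct values from 1..n^2, which
    yields one of the A^c_{n^2} = n^2!/(n^2 - c)! arrangements, each
    equally likely.  Returns {(row, col): value}."""
    rng = random.Random(seed)
    cells = rng.sample([(r, q) for r in range(n) for q in range(n)], c)
    values = rng.sample(range(1, n * n + 1), c)
    return dict(zip(cells, values))

inst = random_magic_square_instance(5, 5, seed=42)
print(inst)  # 5 distinct cells, each mapped to a distinct value in 1..25
```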

To solve the generated instances, we have implemented each version of maxSD in the solver Choco v4.0.6 [8]. Figure 3 shows the evolution of the number of solved instances with the number of backtracks. For these plots, we have randomly generated 20 instances of Magic Squares with n = 5 and c = 5, and 20 instances with n = 5 and c = 10.

Fig. 3. Percentage of solved instances per number of backtracks for different parameters

We noticed that maxSD with the ER estimator and the FDS estimator surprisingly behave in a very similar way. For this reason, we plot a single curve for those two estimators in both charts. We do not have any explanation for this phenomenon yet. It also appears that, for these generated instances, maxSD with the two expectation estimators performs better, especially on the hardest instances (Figure 3a).

5 Conclusion

In this paper, we have presented two probabilistic models for alldifferent and two estimators of the number of solutions. We have adapted the Counting-Based search strategy to those new estimators. We still need to work on their implementation so that we can run them on bigger and harder instances. Yet, the results so far are encouraging. We also plan to adapt these probabilistic models to other cardinality constraints.

References

1. Boisberranger, J.D., Gardy, D., Lorca, X., Truchet, C.: When is it worthwhile to propagate a constraint? A probabilistic analysis of alldifferent (2013)

2. Bregman, L.M.: Some properties of nonnegative matrices and their permanents. Soviet Math. Dokl. (1973)

3. Erdős, P., Rényi, A.: On random matrices. Publication of the Mathematical Institute of the Hungarian Academy of Science (1963)

4. van Hoeve, W.J.: The alldifferent constraint: A survey. CoRR cs.PL/0105015 (2001)

5. Liang, H., Bai, F.: An upper bound for the permanent of (0,1)-matrices. Linear Algebra and its Applications (2004)

6. Lovász, L., Plummer, M.D.: Matching Theory. American Mathematical Society (2009)

7. Pesant, G., Quimper, C., Zanarini, A.: Counting-based search: Branching heuristics for constraint satisfaction problems. J. Artif. Intell. Res. 43, 173–210 (2012). https://doi.org/10.1613/jair.3463

8. Prud'homme, C., Fages, J.G., Lorca, X.: Choco Solver Documentation. TASC, INRIA Rennes, LINA CNRS UMR 6241, COSLING S.A.S. (2016), http://www.choco-solver.org

9. Régin, J.: A filtering algorithm for constraints of difference in CSPs. pp. 362–367 (1994)

10. Valiant, L.G.: The complexity of computing the permanent. Theor. Comput. Sci. 8, 189–201 (1979)
