Kernels for Global Constraints

(1104.2541)
Published Apr 13, 2011 in cs.AI and cs.DS

Abstract

Bessiere et al. (AAAI'08) showed that several intractable global constraints can be efficiently propagated when certain natural problem parameters are small. In particular, the complete propagation of a global constraint is fixed-parameter tractable in k - the number of holes in domains - whenever bound consistency can be enforced in polynomial time; this applies to the global constraints AtMost-NValue and Extended Global Cardinality (EGC). In this paper we extend this line of research and introduce the concept of reduction to a problem kernel, a key concept of parameterized complexity, to the field of global constraints. In particular, we show that the consistency problem for AtMost-NValue constraints admits a linear-time reduction to an equivalent instance on O(k^2) variables and domain values. This small kernel can be used to speed up the complete propagation of NValue constraints. We contrast this result by showing that the consistency problem for EGC constraints does not admit a reduction to a polynomial problem kernel unless the polynomial hierarchy collapses.
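As a point of reference for the tractable case the abstract alludes to, the following Python sketch (not the paper's kernelization, just an illustration) decides consistency of an AtMost-NValue constraint in the special case where every variable's domain is an interval, i.e., has no holes (k = 0). There, the minimum number of distinct values needed to satisfy all variables equals the minimum number of points that "stab" every domain interval, which a greedy pass over intervals sorted by right endpoint computes exactly. The function name and example domains are purely hypothetical.

def atmost_nvalue_consistent(domains, n_max):
    """domains: list of (lo, hi) integer intervals, one per variable.
    n_max: the bound N of the AtMost-NValue constraint.
    Returns True iff some assignment uses at most n_max distinct values."""
    # Greedy interval stabbing: scan intervals by increasing right endpoint;
    # whenever the current interval is not covered by the last chosen value,
    # commit to its right endpoint as a new value.
    values_used = 0
    last_value = None
    for lo, hi in sorted(domains, key=lambda iv: iv[1]):
        if last_value is None or last_value < lo:
            last_value = hi
            values_used += 1
    return values_used <= n_max

if __name__ == "__main__":
    # Domains [1,3], [2,5], [4,6]: the two values {3, 6} cover all three
    # variables, so NValue can be as low as 2 but not 1.
    print(atmost_nvalue_consistent([(1, 3), (2, 5), (4, 6)], 2))  # True
    print(atmost_nvalue_consistent([(1, 3), (2, 5), (4, 6)], 1))  # False

Once domains contain holes, this greedy argument breaks down and the consistency problem becomes intractable in general; the kernelization result in the paper shrinks such instances to O(k^2) variables and domain values before any exact method is applied.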
