Confluence in Data Reduction: Bridging Graph Transformation and Kernelization

By Hartmut Ehrig, Claudia Ermel, Falk Hüffner, Rolf Niedermeier and Olga Runge

Abstract

Kernelization is a core tool of parameterized algorithmics for coping with computationally intractable problems. A kernelization reduces, in polynomial time, an input instance to an equivalent instance whose size is bounded by a function depending only on some problem-specific parameter k; this new instance is called a problem kernel. Typically, problem kernels are achieved by performing efficient data reduction rules. So far, there has been little study in the literature of the mutual interaction of data reduction rules, in particular whether data reduction rules for a specific problem always lead to the same reduced instance, no matter in which order the rules are applied. This corresponds to the concept of confluence from the theory of rewriting systems. We argue that it is valuable to study whether a kernelization is confluent, using the NP-hard graph problems (Edge) Clique Cover and Partial Clique Cover as running examples. We apply the concept of critical pair analysis from graph transformation theory, supported by the AGG software tool. These results support the main goal of our work, namely, to establish a fruitful link between (parameterized) algorithmics and graph transformation theory, two so far unrelated fields.
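To make the confluence question concrete, the following is a minimal sketch, not the paper's rule set or its AGG-based critical pair analysis: two toy data reduction rules for Edge Clique Cover (deleting isolated vertices and resolving degree-1 vertices) are applied exhaustively in both possible orders on a small instance, and the resulting reduced instances and parameter values are compared. The rules, the dictionary-based graph representation, and all function names are illustrative assumptions.

```python
# Illustrative sketch (assumed rules, not the paper's): apply two simple
# Edge Clique Cover reduction rules in both orders and compare the results.

def rule_isolated(graph, k):
    """Delete vertices without incident edges; they constrain no edge."""
    return {v: nbrs for v, nbrs in graph.items() if nbrs}, k

def rule_degree_one(graph, k):
    """A degree-1 vertex v with neighbor u: the edge uv can only be covered
    by the clique {u, v}, so take that clique, delete the edge, decrement k."""
    graph = {v: set(nbrs) for v, nbrs in graph.items()}  # work on a copy
    for v in list(graph):
        if len(graph[v]) == 1:
            (u,) = graph[v]
            graph[v].discard(u)
            graph[u].discard(v)
            k -= 1
    return graph, k

def reduce_exhaustively(graph, k, rules):
    """Apply the rules in the given cyclic order until a fixed point."""
    while True:
        before = (graph, k)
        for rule in rules:
            graph, k = rule(graph, k)
        if (graph, k) == before:
            return graph, k

# A small path graph a-b-c-d with parameter k = 3.
g = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}

r1 = reduce_exhaustively({v: set(n) for v, n in g.items()}, 3,
                         [rule_degree_one, rule_isolated])
r2 = reduce_exhaustively({v: set(n) for v, n in g.items()}, 3,
                         [rule_isolated, rule_degree_one])

# Confluence on this instance: both orders yield the same reduced instance.
print(r1 == r2)
```

On single instances such a comparison is only a spot test; critical pair analysis, as used in the paper with the AGG tool, instead examines all overlapping applications of two rules in order to argue about (local) confluence in general.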

Year: 2013
OAI identifier: oai:CiteSeerX.psu:10.1.1.352.3706
Provided by: CiteSeerX