
An FPT algorithm and a polynomial kernel for Linear Rankwidth-1 Vertex Deletion (1504.05905v2)

Published 22 Apr 2015 in cs.DS

Abstract: Linear rankwidth is a linearized variant of rankwidth, introduced by Oum and Seymour [Approximating clique-width and branch-width. J. Combin. Theory Ser. B, 96(4):514--528, 2006]. Motivated by recent developments on graph modification problems for classes of graphs of bounded treewidth or pathwidth, we study the Linear Rankwidth-1 Vertex Deletion problem (LRW1-Vertex Deletion for short). In the LRW1-Vertex Deletion problem, given an $n$-vertex graph $G$ and a positive integer $k$, we want to decide whether there is a set of at most $k$ vertices whose removal turns $G$ into a graph of linear rankwidth at most $1$, and to find such a vertex set if one exists. While the meta-theorem of Courcelle, Makowsky, and Rotics implies that LRW1-Vertex Deletion can be solved in time $f(k)\cdot n^3$ for some function $f$, it is not clear whether the problem admits a running time with a modest exponential dependence on $k$. We first establish that LRW1-Vertex Deletion can be solved in time $8^k\cdot n^{\mathcal{O}(1)}$. The major obstacle here is how to handle a long induced cycle as an obstruction. To resolve this, we define necklace graphs and investigate their structural properties. We then reduce the polynomial factor by refining the trivial branching step based on a cliquewidth expression of the graph, obtaining an algorithm that runs in time $2^{\mathcal{O}(k)}\cdot n^4$. We also prove that the running time cannot be improved to $2^{o(k)}\cdot n^{\mathcal{O}(1)}$ under the Exponential Time Hypothesis. Lastly, we show that the LRW1-Vertex Deletion problem admits a polynomial kernel.
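The $8^k\cdot n^{\mathcal{O}(1)}$ bound in the abstract is characteristic of a bounded-search-tree algorithm: repeatedly locate an obstruction to linear rankwidth $1$ that can be confined to a constant number of vertices, then branch on which of those vertices to delete. The Python sketch below illustrates only that general pattern; the `find_obstruction` oracle, the constant $8$, and the adjacency-set graph representation are assumptions for illustration, and the paper's actual obstruction analysis (including the necklace-graph handling of long induced cycles) is not reproduced here.

```python
def lrw1_vertex_deletion(G, k, find_obstruction):
    """Bounded-search-tree skeleton for obstruction-based vertex deletion.

    G is a dict mapping each vertex to its set of neighbors.
    find_obstruction(G) is an assumed oracle: it returns a set of at most
    c vertices such that any solution must delete at least one of them,
    or None if G already has linear rankwidth at most 1.  With c = 8 this
    skeleton branches at most 8 ways per level, giving the 8^k * n^O(1)
    shape claimed in the abstract; the oracle itself is the hard part and
    is deliberately left abstract here.
    """
    obstruction = find_obstruction(G)
    if obstruction is None:
        return set()                      # no obstruction left to destroy
    if k == 0:
        return None                       # obstruction remains, budget spent
    for v in obstruction:                 # branch: some v must be deleted
        H = {u: G[u] - {v} for u in G if u != v}   # G with v removed
        sol = lrw1_vertex_deletion(H, k - 1, find_obstruction)
        if sol is not None:
            return sol | {v}
    return None                           # no branch yields a solution
```

Since every call either terminates or recurses into at most $|{\rm obstruction}| \le 8$ subproblems with budget $k-1$, the search tree has at most $8^k$ nodes, each costing polynomial time plus one oracle call.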
