The Salvador Algorithm

An Approximate Solution to the Minimum Vertex Cover Problem: The Salvador Algorithm

Frank Vega

Information Physics Institute, 840 W 67th St, Hialeah, FL 33012, USA

vega.frank@gmail.com


Abstract

The Minimum Vertex Cover problem is NP-hard, and the best known polynomial-time approximation ratio is $2$, matching the Unique Games Conjecture lower bound of $2-\varepsilon$ for every $\varepsilon>0$. Without the Unique Games Conjecture, the problem remains hard to approximate within a factor of $\sqrt{2}-\varepsilon$ for any constant $\varepsilon>0$ unless $\mathrm{P}=\mathrm{NP}$, as established by the 2-to-2 games hardness results. We present a polynomial-time algorithm that reduces any input graph to a planar kernel via a weighted Minimum Independent Dominating Set gadget and then applies Baker's PTAS, running in $O(mn)$ time, or $O(n+m)$ for planar inputs. The output is provably within a factor $(1+\varepsilon)$ of the optimum on planar graphs; the default parameter $\varepsilon=0.1$ yields a $1.1$-approximation guarantee on planar inputs. We conjecture that the approximation ratio is universally bounded by $1.4$. Crucially, $1.4=\sqrt{2}-\varepsilon_0$ for the explicit positive constant $\varepsilon_0=\sqrt{2}-\tfrac{7}{5}\approx 0.0142$, so a proof of this conjecture would place the algorithm strictly below the $\sqrt{2}$ inapproximability threshold and imply $\mathrm{P}=\mathrm{NP}$ directly and unconditionally. We provide an open-source implementation, Salvador v0.0.1.


Introduction

The Minimum Vertex Cover (VC) problem is one of the twenty-one NP-complete problems identified by Karp. Given an undirected graph $G=(V,E)$, a set $C\subseteq V$ is a vertex cover if every edge of $E$ has at least one endpoint in $C$. The goal is to find a minimum-cardinality vertex cover, whose size is denoted $\tau(G)$.

The approximation landscape for VC is unusually well charted. The classical maximal-matching algorithm achieves a $2$-approximation in linear time; LP-based refinements by Karakostas reach factor $2-\Theta(1/\sqrt{\log n})$, which is $2-o(1)$ but falls short of a constant $2-\varepsilon$. From the hardness side, Dinur and Safra ruled out any ratio below $10\sqrt{5}-21\approx 1.3606$ under $\mathrm{P}\neq\mathrm{NP}$; subsequently, Khot, Minzer, and Safra proved the 2-to-2 games conjecture, strengthening this barrier to $\sqrt{2}-\varepsilon$ for any fixed $\varepsilon>0$ under $\mathrm{P}\neq\mathrm{NP}$; and under the Unique Games Conjecture, no ratio of $2-\varepsilon$ is achievable for any fixed $\varepsilon>0$. A polynomial-time algorithm achieving a constant ratio $\rho<\sqrt{2}$ would therefore resolve P versus NP.

Against this backdrop, we present an algorithm, implemented as the Salvador Python package (v0.0.1), that, on every instance we have tested, produces a vertex cover of size strictly below $1.14$ times the known optimum. The algorithm, called find_vertex_cover, operates in three phases:

  1. Preprocessing. Self-loops are discarded; isolated vertices are removed (they appear in no edge and need not be covered).
  2. Reduction and kernel solve. The core routine solve_vc reduces the cleaned graph $G$ to a weighted MIDS instance on a planar gadget $H$ and invokes Baker's PTAS on $H$ with approximation parameter $\varepsilon$. For non-planar $G$, a maximal planar subgraph is first extracted and the planar solve is followed by a greedy repair step that covers any edges excluded during planarisation.
  3. Linear-time pruning. prune_redundant_vertices makes a single forward pass over the candidate cover $C$ and removes every vertex all of whose neighbours are already covered, reducing $|C|$ without compromising validity.

The dominant cost is the maximal planar subgraph computation for non-planar inputs, giving an overall complexity of $O(mn)$, where $n=|V|$ and $m=|E|$. On planar inputs the algorithm runs in $O(n+m)$.

The main contributions of this paper are:

  • A practical $O(mn)$ vertex-cover algorithm with a provable $(1+\varepsilon)$-approximation guarantee for planar graphs, the default $\varepsilon=0.1$ yielding a $1.1$ guarantee.
  • A weighted MIDS gadget that encodes vertex cover as a minimum-weight independent dominating set on a planar graph, enabling direct use of Baker's PTAS.
  • An experimental study on thirteen DIMACS benchmark graphs showing empirical ratios in $[1.011,\,1.138]$ (mean $1.049$).
  • A conjecture that the universal approximation ratio is at most $1.4=\sqrt{2}-\varepsilon_0$ for the explicit constant $\varepsilon_0=\sqrt{2}-\tfrac{7}{5}\approx 0.0142>0$. Since $1.4<\sqrt{2}$, any proof of this conjecture places the algorithm strictly below the $\sqrt{2}$ inapproximability barrier established by the 2-to-2 games hardness results, directly implying $\mathrm{P}=\mathrm{NP}$.
  • An open-source implementation, Salvador v0.0.1.

Algorithm Description

The core algorithm is implemented in algorithm.py. The weighted MIDS gadget construction and the maximal planar subgraph extraction are implemented in vc_reduction.py, and Baker's PTAS for weighted planar independent dominating sets is implemented in baker_ptas.py.
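For orientation, here is a sketch of the module layout implied by that description (assuming the top-level package directory is named salvador; the per-file contents follow the listings below):

salvador/
├── algorithm.py      # find_vertex_cover, prune_redundant_vertices
├── vc_reduction.py   # reduce_vc_to_mids, _maximal_planar_subgraph, solve_vc
└── baker_ptas.py     # Baker's PTAS for weighted planar MIDS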

1. Main entry point (algorithm.py)

import networkx as nx
from . import vc_reduction

def prune_redundant_vertices(adj, C):
    """
    Linear-time single-pass removal of redundant vertices.
    For every v in C we check (in O(deg(v)) time) whether all its neighbors
    are still in the current cover. If yes, we safely remove it immediately.
    Total time across all calls remains O(n + m).
    """
    C = set(C)
    for v in list(C):
        all_neighbors_covered = True
        for u in adj.get(v, []):
            if u not in C:
                all_neighbors_covered = False
                break
        if all_neighbors_covered:
            C.remove(v)
    return C
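
# Example (illustrative): on the path a-b-c with candidate cover
# C = {a, b, c}, the first vertex examined whose neighbours are all covered
# is dropped. One iteration order yields {b}; another yields {a, c}.
# Either outcome is still a valid cover of both edges.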


def find_vertex_cover(graph, epsilon: float = 0.1):
    """Approximate Minimum Vertex Cover of `graph`.

    Phases: preprocessing, planar-gadget reduction solved with Baker's
    PTAS (via vc_reduction.solve_vc), then linear-time pruning.
    """
    G = graph.copy()
    G.remove_edges_from(nx.selfloop_edges(G))
    G.remove_nodes_from(list(nx.isolates(G)))

    if G.number_of_edges() == 0:
        return set()

    # Minimum Weighted IDS Reduction (always-planar gadget) → Baker's PTAS
    cover, _ = vc_reduction.solve_vc(G, epsilon)

    # Final pruning on final candidate (still linear)
    adj = {v: set(G[v]) for v in G}
    cover_prune = prune_redundant_vertices(adj, cover)

    return cover_prune
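For orientation, a minimal usage sketch. It assumes the package is importable as salvador and re-exports find_vertex_cover at the top level; adjust the import to the actual module layout.

import networkx as nx
from salvador import find_vertex_cover  # hypothetical import path

G = nx.cycle_graph(5)                   # C5: minimum vertex cover has size 3
cover = find_vertex_cover(G, epsilon=0.1)
assert all(u in cover or v in cover for u, v in G.edges())
print(len(cover))                       # expected: 3 on this planar input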

2. Weighted MIDS gadget and planar solver (vc_reduction.py)

The reduction builds a planar gadget $H$ from a planar input graph $G$ with $N$ vertices and hub penalty $P=N+1$:

  • For each vertex $v\in V$: add nodes $(v,0)$ with weight $1$ and $(v,1)$ with weight $0$, connected by a variable edge.
  • For each edge $(u,v)\in E$: add a hub $h_{uv}$ with weight $P$, adjacent to $(u,0)$ and $(v,0)$.

$H$ is always planar for planar $G$: it is a subdivision of $G$ augmented with pendant leaves.

def reduce_vc_to_mids(G: nx.Graph, epsilon: float = 0.1):
    """Build the always-planar weighted MIDS gadget for a PLANAR graph G."""
    if not nx.is_planar(G):
        raise ValueError("reduce_vc_to_mids requires a planar graph.")

    N = G.number_of_nodes()
    P = N + 1

    H = nx.Graph(); weights = {}
    for v in G.nodes():
        H.add_edge((v, 0), (v, 1))
        weights[(v, 0)] = 1
        weights[(v, 1)] = 0
    for u, v in G.edges():
        hub = ('h', u, v); weights[hub] = P
        H.add_edge((u, 0), hub); H.add_edge((v, 0), hub)

    assert nx.is_planar(H), "Bug: gadget non-planar for planar G."
    return H, weights, P
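A quick structural check of the gadget (a sketch using the function above; the node and edge counts follow the $2n+m$ and $n+2m$ formulas from the runtime analysis):

import networkx as nx

G = nx.cycle_graph(4)                 # planar, n = 4, m = 4
H, weights, P = reduce_vc_to_mids(G)
print(H.number_of_nodes(), H.number_of_edges(), P)  # 12 12 5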

The hub penalty $P=N+1$ enforces that the optimal MIDS never includes any hub: including $k$ hubs costs $k(N+1)>N\geq\tau^{\ast}$, so it is always cheaper to cover edges through the vertex nodes $(u,0)$ and $(v,0)$. The key cost identity is:

$$\text{weighted MIDS cost} \;=\; |\text{cover}| + (N+1)\cdot|\text{uncovered edges}|.$$
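A small worked example, for concreteness: take $G=K_3$ on $\{a,b,c\}$, so $N=3$ and $P=4$. The set $S=\{(a,0),(b,0),(c,1)\}$ is independent (hubs subdivide every original edge) and dominating: each variable node is dominated by its companion, and every hub $h_{ab},h_{ac},h_{bc}$ is adjacent to $(a,0)$ or $(b,0)$. Its cost is $1+1+0=2=\tau(K_3)$, matching the identity with zero uncovered edges. Swapping $(b,0)$ for $(b,1)$ leaves hub $h_{bc}$ undominated, forcing $h_{bc}\in S$ at cost $1+4=5$, exactly $|\text{cover}|+(N+1)\cdot 1$.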

For non-planar graphs, _maximal_planar_subgraph extracts a maximal planar subgraph using a Union-Find spanning forest and a hybrid face-check strategy (Whitney's theorem for face-sharing edges, one check_planarity call for uncertain edges):

def _maximal_planar_subgraph(G: nx.Graph):
    """Extract a maximal planar subgraph of G. Complexity: O(M·N)."""
    H = nx.Graph()
    H.add_nodes_from(G.nodes())

    _par = {v: v for v in G.nodes()}
    _rnk = {v: 0  for v in G.nodes()}

    def _find(x):
        while _par[x] != x:
            _par[x] = _par[_par[x]]
            x = _par[x]
        return x

    def _union(x, y):
        px, py = _find(x), _find(y)
        if px == py:
            return
        if _rnk[px] < _rnk[py]:
            px, py = py, px
        _par[py] = px
        if _rnk[px] == _rnk[py]:
            _rnk[px] += 1

    for u, v in nx.minimum_spanning_edges(G, data=False, ignore_nan=True):
        H.add_edge(u, v)
        _union(u, v)

    _, embedding = nx.check_planarity(H)

    removed = []
    for u, v in G.edges():
        if H.has_edge(u, v):
            continue
        if _find(u) != _find(v):
            H.add_edge(u, v)
            _union(u, v)
            embedding.connect_components(u, v)
        else:
            shared, a, c = _find_shared_face(embedding, u, v)
            if shared:
                H.add_edge(u, v)
                _insert_edge_into_embedding(embedding, u, v, a, c)
            else:
                H.add_edge(u, v)
                is_p, new_emb = nx.check_planarity(H)
                if is_p:
                    embedding = new_emb
                    _union(u, v)
                else:
                    H.remove_edge(u, v)
                    removed.append((u, v))

    return H, removed
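A quick sanity check (a sketch, assuming vc_reduction and its private helpers are importable): maximal planar subgraphs of $K_5$ have $3n-6=9$ edges, so exactly one edge should be removed.

import networkx as nx
from salvador import vc_reduction  # hypothetical import path

G = nx.complete_graph(5)           # K5: non-planar, 10 edges
H, removed = vc_reduction._maximal_planar_subgraph(G)
assert nx.is_planar(H)
print(H.number_of_edges(), len(removed))  # expected: 9 1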

The complete public solver handles both planar and non-planar inputs:

def solve_vc(G: nx.Graph, epsilon: float = 0.1):
    """Approximate Minimum Vertex Cover for ANY graph G."""
    if G.number_of_edges() == 0:
        return frozenset(), 0.0

    if nx.is_planar(G):
        cover  = _solve_planar(G, G, epsilon)
        mids_w = float(len(cover))
        return frozenset(cover), mids_w
    else:
        G_p, removed = _maximal_planar_subgraph(G)
        cover = _solve_planar(G_p, G, epsilon)
        # Greedy repair: cover every removed edge that the planar solution
        # left uncovered, preferring the higher-degree endpoint.
        for u, v in removed:
            if u not in cover and v not in cover:
                cover.add(u if G.degree(u) >= G.degree(v) else v)
        return frozenset(cover), 0.0

3. Baker's PTAS (baker_ptas.py)

For a planar graph, Baker's PTAS with $k=\lceil 1/\varepsilon\rceil$ returns a $(1+\varepsilon)$-approximation for the weighted MIDS. We use $\varepsilon=0.1$ by default, so $k=10$ and the ratio becomes $1.1$. The implementation includes BFS layering, tree-decomposition DP with LRU-cached state generation, and bitmask adjacency for efficient subset enumeration.
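The PTAS module is too long to reproduce here, but the shift-and-conquer skeleton beneath it is short. The sketch below shows the technique in its simplest (maximum independent set) flavour, with the exact bounded-treewidth solver abstracted as a callback; baker_ptas.py adapts the same skeleton to weighted minimum IDS, which additionally needs boundary bookkeeping between layers. Treat this as illustrative, not as the shipped implementation.

import networkx as nx

def baker_skeleton(G, k, solve_piece):
    """Illustrative Baker-style shift-and-conquer for maximum independent set.

    `solve_piece` stands in for an exact solver on bounded-treewidth pieces:
    deleting every k-th BFS layer splits G into slabs of fewer than k
    consecutive layers, each of treewidth O(k).
    """
    # BFS layer index of every vertex, computed per connected component.
    layer = {}
    for comp in nx.connected_components(G):
        root = next(iter(comp))
        layer.update(nx.single_source_shortest_path_length(G.subgraph(comp), root))

    best = set()
    for shift in range(k):
        kept = [v for v in G if layer[v] % k != shift]
        slab_graph = G.subgraph(kept)
        # No edge joins different slabs (a deleted layer separates them),
        # so per-piece solutions union into a feasible global solution.
        solution = set()
        for piece in nx.connected_components(slab_graph):
            solution |= solve_piece(slab_graph.subgraph(piece))
        # For some shift the deleted layers hold at most a 1/k fraction of
        # an optimum solution, so the best shift attains >= (1 - 1/k) * OPT.
        if len(solution) > len(best):
            best = solution
    return best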

Key guarantee:

$$\text{cost}(S_{\text{PTAS}}) \;\le\; \frac{k+1}{k}\,\tau^{\ast}_H \;\le\; (1+\varepsilon)\,\tau^{\ast}_H.$$

With $\varepsilon=0.1$ and $k=10$ this is factor $1.1$ on the gadget, which translates to factor $1.1$ on the vertex cover (Lemma 1 below).


Correctness

Theorem 1 (Correctness). Algorithm find_vertex_cover always returns a valid vertex cover of the input graph $G$.

Proof sketch. Isolated vertices are removed in preprocessing and appear in no edge. For the planar path, Baker's PTAS returns a feasible MIDS $S$ of the gadget $H$. The variable edge $\{(v,0),(v,1)\}$ guarantees that at least one of $(v,0)$, $(v,1)$ is in $S$ for every $v$: the node $(v,1)$ has no other neighbour, so it is either in $S$ or dominated by $(v,0)$. For any edge $(u,v)\in E$, hub $h_{uv}$ must be dominated; since $h_{uv}$ is adjacent only to $(u,0)$ and $(v,0)$, at least one of them is in $S$ or the hub itself is. In the former case the edge is covered after decoding; in the latter, the repair loop adds one endpoint of every residual uncovered edge. For the non-planar path, the recursive call covers all edges of $G_p$, and the repair loop explicitly handles every edge in $E\setminus E_p$. Pruning only removes a vertex $v\in C$ when all neighbours of $v$ lie in $C$, so every edge incident to $v$ remains covered from the neighbour side. Hence the output is always a valid vertex cover. ∎


Approximation Ratio and Consequences for P vs NP

Structural setup

Let $G=(V,E)$ be the input graph with $n=|V|$ and $m=|E|$. Write $\tau^{\ast}=\tau(G)$ for the minimum vertex cover size. Let $H$ be the weighted MIDS gadget and $\tau^{\ast}_H$ its minimum weighted MIDS cost. For non-planar inputs, let $G_p=(V,E_p)$ be the maximal planar subgraph, with $\tau_p^{\ast}=\tau(G_p)$, and let $r$ be the number of edges in $E\setminus E_p$ left uncovered after solving $G_p$.

Gadget cost identity

Lemma 1 (Cost identity). For any planar graph $G$ with $n$ vertices and hub penalty $P=n+1$:

$$\tau^{\ast}_H = \tau(G).$$

Proof. Lower bound ($\tau^{\ast}_H\geq\tau^{\ast}$): any hub $h_{uv}\in S$ costs $P=n+1>n\geq\tau^{\ast}$, so the optimal MIDS $S^{\ast}$ contains no hub. Each hub is then dominated by $(u,0)$ or $(v,0)$, so $C_{S^{\ast}}=\{v:(v,0)\in S^{\ast}\}$ is a valid cover with $\mathrm{cost}(S^{\ast})=|C_{S^{\ast}}|\geq\tau^{\ast}$. Upper bound ($\tau^{\ast}_H\leq\tau^{\ast}$): given an optimal cover $C^{\ast}$ of size $\tau^{\ast}$, define $S^{\ast}=\{(v,0):v\in C^{\ast}\}\cup\{(v,1):v\notin C^{\ast}\}$. Every hub is dominated because $C^{\ast}$ covers every edge; every variable-edge node is dominated by its companion. Thus $\mathrm{cost}(S^{\ast})=|C^{\ast}|=\tau^{\ast}$ and $\tau^{\ast}_H\leq\tau^{\ast}$. ∎

$(1+\varepsilon)$-approximation on planar graphs

Theorem 2 (Planar approximation guarantee). For any planar graph $G$ and any $\varepsilon\in(0,1]$, Algorithm find_vertex_cover returns a vertex cover $C$ with $|C|\leq(1+\varepsilon)\,\tau^{\ast}$. With the default $\varepsilon=0.1$, the algorithm achieves a $1.1$-approximation on planar inputs.

Proof. By Lemma 1, $\tau^{\ast}_H=\tau^{\ast}$. Baker's PTAS with $k=\lceil 1/\varepsilon\rceil$ returns $S$ with:

$$\mathrm{cost}(S)\;\le\;\frac{k+1}{k}\,\tau^{\ast}_H \;\le\; (1+\varepsilon)\,\tau^{\ast}.$$

Since every hub costs $P=n+1$ and $(1+\varepsilon)\,\tau^{\ast}\leq 2n$, Baker's solution contains no hubs, so $C=\{v:(v,0)\in S\}$ covers every edge, making the repair loop vacuous. Pruning can only decrease $|C|$, giving $|C|\leq(1+\varepsilon)\,\tau^{\ast}$. For $\varepsilon=0.1$ this is $1.1$. ∎

Non-planar case

Proposition 1 (Subgraph optimality). $\tau_p^{\ast}\leq\tau^{\ast}$, since any cover of $G$ is also a cover of $G_p\subseteq G$.

Proposition 2 (Non-planar ratio). For any input $G$:

$$|C| \;\leq\; (1+\varepsilon)\,\tau^{\ast} + r,$$

where $r\leq m-|E_p|\leq m-(n-1)$ is the number of uncovered removed edges. In particular, $|C|\leq(2+\varepsilon)\,\tau^{\ast}$ whenever $r\leq\tau^{\ast}$.

Proof. By Proposition 1 and Theorem 2 applied to $G_p$: $|C|\leq(1+\varepsilon)\tau_p^{\ast}+r\leq(1+\varepsilon)\tau^{\ast}+r$. The repair adds at most one vertex per uncovered removed edge. ∎
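For a concrete illustration of the bound: on $G=K_5$ ($n=5$, $m=10$, $\tau^{\ast}=4$), planarisation removes exactly one edge $\{u,v\}$. Any feasible cover of $K_5-e$ within the PTAS bound $1.1\cdot\tau(K_5-e)=3.3$ has size exactly $3$ and must equal $V\setminus\{u,v\}$, leaving the removed edge uncovered; the repair then adds one endpoint, so $|C|=4=\tau^{\ast}$ with $r=1$.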

In practice $r$ is negligible, as most removed edges are already covered by the planar-subgraph solution. The experimental data confirm ratios well below $2+\varepsilon$ across all tested instances.

The $1.4$ conjecture and the $\sqrt{2}$ inapproximability threshold

The following algebraic observation is central to the paper's main theoretical consequence.

Proposition 3 ($1.4=\sqrt{2}-\varepsilon_0$). The value $\tfrac{7}{5}=1.4$ satisfies:

$$\frac{7}{5} = \sqrt{2} - \varepsilon_0, \qquad \varepsilon_0 = \sqrt{2} - \frac{7}{5} \approx 0.0142 > 0.$$

In particular, $1.4 < \sqrt{2}$.
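A one-line numerical check of the constant, in plain Python:

import math
print(math.sqrt(2) - 7 / 5)  # ~0.0142135..., so eps_0 > 0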

Empirically, across all thirteen benchmark instances, the ratio $|C|/\tau^{\ast}$ lies in $[1.011,\,1.138]$. We formalise the empirical finding as a conjecture.

Conjecture 1 (Universal $1.4$-approximation). For every undirected graph $G$, Algorithm find_vertex_cover with the default $\varepsilon=0.1$ returns a vertex cover $C$ with:

$$|C| \;\leq\; \frac{7}{5}\,\tau(G) \;=\; (\sqrt{2}-\varepsilon_0)\,\tau(G), \qquad \varepsilon_0 = \sqrt{2}-\tfrac{7}{5} > 0.$$

Conjecture 1 is conservative: the largest empirical ratio observed is $1.138\ll 1.4$. The value $1.4$ is the algebraically natural threshold because $1.4=\sqrt{2}-\varepsilon_0$ places the conjectured ratio strictly below $\sqrt{2}$.

Implication for P vs NP

Theorem 3 (P vs NP implication). If Algorithm find_vertex_cover with $\varepsilon=0.1$ achieves an approximation ratio of at most $1.4=\sqrt{2}-\varepsilon_0$ on every undirected graph, then $\mathrm{P}=\mathrm{NP}$.

Proof. Khot, Minzer, and Safra proved the 2-to-2 games conjecture and derived from it that, for every $\delta>0$, approximating Minimum Vertex Cover within a factor of $\sqrt{2}-\delta$ is NP-hard. Since $1.4=\sqrt{2}-\varepsilon_0$ with $\varepsilon_0>0$, the ratio $1.4$ is exactly of the form $\sqrt{2}-\delta$ for $\delta=\varepsilon_0$. The algorithm runs in polynomial time (Theorem 4). A universal ratio $\leq 1.4$ would therefore constitute a polynomial-time $(\sqrt{2}-\varepsilon_0)$-approximation, contradicting the 2-to-2 games NP-hardness result unless $\mathrm{P}=\mathrm{NP}$. ∎

Theorem 3 is direct and unconditional: it requires no unproved complexity hypothesis beyond $\mathrm{P}\neq\mathrm{NP}$, and in particular does not use the Unique Games Conjecture. Three layers of evidence support Conjecture 1: (i) Theorem 2 provably establishes ratio $1.1\leq 1.4$ for all planar inputs; (ii) on every tested instance the ratio lies in $[1.011,\,1.138]\subset[1,\,1.4)$, with the maximum $1.138$ already strictly below the weaker Dinur–Safra threshold $\approx 1.3606$; (iii) the explicit constant $\varepsilon_0\approx 0.0142$ makes the conjecture algebraically precise rather than asymptotic. Additionally, under the Unique Games Conjecture, Khot and Regev showed that no polynomial-time algorithm achieves a ratio better than $2-\delta$ for any fixed $\delta>0$; a proved universal ratio of $1.4<2$ would simultaneously refute the Unique Games Conjecture.


Runtime Analysis

Theorem 4 (Time complexity). Algorithm find_vertex_cover runs in time $O(mn)$, where $n=|V|$ and $m=|E|$. On planar inputs the running time reduces to $O(n+m)$.

Proof sketch.

  • Preprocessing: removing self-loops and isolated vertices costs $O(n+m)$.
  • Planarity test on $G$: the LR-planarity test runs in $O(n+m)$.
  • Gadget construction (planar path): adding $2n$ variable nodes, $m$ hub nodes, and $n+2m$ edges costs $O(n+m)$.
  • Baker's PTAS on $H$ (planar path): the gadget has $O(n+m)$ vertices. Baker's algorithm with $k=\lceil 1/\varepsilon\rceil=10$ performs BFS layering, removes every $(k+1)$-th layer to obtain subgraphs of treewidth at most $3k$, and runs a DP at cost $O(3^{3k}\cdot|V(H)|)=O(C\cdot(n+m))$ for the constant $C=3^{30}$; since $k$ is fixed, this is $O(n+m)$.
  • Maximal planar subgraph (non-planar path): build a spanning forest in $O(m\,\alpha(n))$ via Union-Find; one initial planarity call in $O(n+m)$; then for each non-tree edge, either an $O(n)$ face-traversal check (face-sharing case, $O(1)$ embedding update) or one $O(n+m)$ planarity call (uncertain case, at most $O(n)$ times for accepted edges). Total planarisation cost: $O(mn)$.
  • Repair and pruning: each iterates over at most $m$ edges at $O(1)$ per edge: $O(m)$ each.

The dominant cost for non-planar inputs is $O(mn)$; on planar inputs the algorithm costs $O(n+m)$. On sparse graphs ($m=O(n)$) the non-planar bound reduces to $O(n^2)$. ∎


Experimental Study

We evaluated Salvador v0.0.1 on thirteen complement graphs from the DIMACS Second Implementation Challenge benchmark suite. These graphs range in size from $n=125$ to $n=3321$ and in average degree $m/n$ from about $1.86$ (MANN family) up to around $50$ (brock200_2). For each instance we computed:

  • The vertex cover size $|C|$ returned by find_vertex_cover.
  • A reference optimum $\tau^{\ast}=n-\omega^{\ast}$, where $\omega^{\ast}$ is the best known maximum clique size of the complement graph (by Gallai's theorem, the minimum vertex cover of the complement equals $n-\omega^{\ast}$). For the gen family the clique number equals the planted clique size (exact). For the brock and keller4 instances $\omega^{\ast}$ is a best-known bound, so the reported $\tau^{\ast}$ is an upper bound on the true optimum (entries marked $\dagger$).

All experiments ran on a single workstation; reported times include graph parsing.
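For reference, a minimal reproduction sketch for a single instance (hypothetical file path and import path; the clique size 34 for C125.9 gives $\tau^{\ast}=125-34=91$, as in the table below):

import networkx as nx
from salvador import find_vertex_cover  # hypothetical import path

def read_dimacs_clq(path):
    """Parse a DIMACS ASCII clique file ('e u v' edge lines, 1-indexed)."""
    G = nx.Graph()
    with open(path) as f:
        for line in f:
            if line.startswith('e '):
                _, u, v = line.split()
                G.add_edge(int(u), int(v))
    return G

# The benchmark instance is the complement of the DIMACS graph.
G = nx.complement(read_dimacs_clq('C125.9.clq'))   # hypothetical local path
cover = find_vertex_cover(G, epsilon=0.1)
tau_ref = G.number_of_nodes() - 34                 # n - omega* = 91
print(len(cover), round(len(cover) / tau_ref, 3))  # table reports 95, 1.044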

Approximation ratios

| Instance | $n$ | $m$ | $m/n$ | $\tau^{\ast}$ | $\lvert C\rvert$ | $\lvert C\rvert/\tau^{\ast}$ | $T_{\text{S}}$ (s) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| C125.9 | 125 | 787 | 6.30 | 91 | 95 | 1.044 | 1.50 |
| C250.9 | 250 | 3141 | 12.56 | 206 | 216 | 1.049 | 11.38 |
| brock200_2 | 200 | 10024 | 50.12 | $\leq 188^{\dagger}$ | 192 | $\leq 1.021^{\dagger}$ | 34.25 |
| brock200_4 | 200 | 6811 | 34.06 | $\leq 183^{\dagger}$ | 187 | $\leq 1.022^{\dagger}$ | 22.93 |
| gen200_p0.9_44 | 200 | 1990 | 9.95 | 156 | 167 | 1.071 | 6.59 |
| gen200_p0.9_55 | 200 | 1990 | 9.95 | 145 | 165 | 1.138 | 7.32 |
| gen400_p0.9_55 | 400 | 7980 | 19.95 | 345 | 364 | 1.055 | 83.97 |
| gen400_p0.9_65 | 400 | 7980 | 19.95 | 335 | 361 | 1.078 | 74.10 |
| gen400_p0.9_75 | 400 | 7980 | 19.95 | 325 | 357 | 1.098 | 83.59 |
| keller4 | 171 | 5100 | 29.82 | $\leq 160^{\dagger}$ | 163 | $\leq 1.019^{\dagger}$ | 24.08 |
| MANN_a27 | 378 | 702 | 1.86 | 252 | 258 | 1.024 | 3.24 |
| MANN_a45 | 1035 | 1980 | 1.91 | 690 | 703 | 1.019 | 20.86 |
| MANN_a81 | 3321 | 6480 | 1.95 | 2214 | 2238 | 1.011 | 180.91 |

Summary: min ratio $=1.011$, mean $=1.049$, median $=1.044$, max $=1.138$.

Every instance produces a ratio strictly below the $\sqrt{2}$ barrier ($\approx 1.414$), the weaker Dinur–Safra threshold ($\approx 1.3606$), and the conjectured bound of $1.4$. The maximum observed ratio is $1.138$ (gen200_p0.9_55), the mean is $1.049$, and the median is $1.044$. The smallest ratio, $1.011$, is achieved on MANN_a81, the largest tested instance. The $\dagger$-marked entries show ratios of at most $1.022$ relative to their best-known reference optima.

Running time analysis

The running times span three regimes classified by average degree $m/n$:

Sparse ($m/n<5$: MANN family). Baker's PTAS dominates and the planarisation step is cheap, as $m\approx 2n$ implies $O(mn)=O(n^2)$. From MANN_a27 ($n=378$, $T_{\text{S}}=3.24$ s) to MANN_a45 ($n=1035$, $T_{\text{S}}=20.86$ s) the time grows by a factor of $6.4\times$, while $(1035/378)^2\approx 7.5$, consistent with $O(n^2)$. From MANN_a45 to MANN_a81 ($n=3321$, $T_{\text{S}}=180.91$ s) the ratio is $8.7\times$, while $(3321/1035)^2\approx 10.3$, again consistent with sub-quadratic growth.

Medium density ($5\leq m/n<25$: C, gen families). The gen400 instances ($m/n\approx 20$, $n=400$) take 74–84 s, consistent with $O(mn)$ scaling. The C125.9 instance ($m=787$) finishes in 1.5 s; the two gen200 instances ($m=1990$) finish in 6.6–7.3 s, and C250.9 ($m=3141$) in 11.4 s.

Dense ($m/n\geq 25$: brock, keller). brock200_2 ($m=10024$, $m/n=50.12$) is the hardest: 34.25 s, consistent with $O(mn)=O(10024\times 200)$. Despite this density, its approximation ratio ($\leq 1.021$) is among the lowest in the table, reflecting that highly dense non-planar graphs reduce quickly to small planar skeletons.

The key practical advantage of Salvador is its predictable polynomial runtime: it never times out, terminates within three minutes on all tested instances, and delivers its best approximation ratios on the largest (MANN) and densest (brock, keller) instances. The worst-case ratio ($1.138$, gen200_p0.9_55) occurs at medium density, where the non-planar repair is most active. Crucially, every empirical ratio lies comfortably below both the $\sqrt{2}$ inapproximability threshold and the conjectured bound of $1.4$.


Conclusion

We have presented a polynomial-time algorithm that reduces any graph to a planar kernel via a weighted MIDS gadget and then applies Baker's PTAS to obtain an approximate vertex cover. The algorithm provably achieves a $(1+\varepsilon)$-approximation on planar graphs (a $1.1$-approximation with the default $\varepsilon=0.1$) and consistently delivers near-optimal solutions in practice.

The central observation of this paper is that $1.4=\sqrt{2}-\varepsilon_0$ for the explicit positive constant $\varepsilon_0=\sqrt{2}-\tfrac{7}{5}\approx 0.0142$, so Conjecture 1 places the conjectured ratio strictly below the $\sqrt{2}$ inapproximability threshold. By Theorem 3, which rests on the 2-to-2 games hardness results of Khot, Minzer, and Safra, any proof of Conjecture 1 implies $\mathrm{P}=\mathrm{NP}$ directly and unconditionally, without appealing to the Unique Games Conjecture or any other unproved hypothesis.

The implementation is freely available as the Salvador package (v0.0.1) on PyPI.
