# Item


Released

Paper

#### Pre-reduction Graph Products: Hardnesses of Properly Learning DFAs and Approximating EDP on DAGs



##### Fulltext (public)

arXiv:1408.0828.pdf (Preprint), 2 MB


##### Citation

Chalermsook, P., Laekhanukit, B., & Nanongkai, D. (2014). Pre-reduction Graph Products: Hardnesses of Properly Learning DFAs and Approximating EDP on DAGs. Retrieved from http://arxiv.org/abs/1408.0828.

Cite as: https://hdl.handle.net/11858/00-001M-0000-0024-4610-2

##### Abstract

The study of graph products is a major research topic and typically concerns the quantity $f(G*H)$, e.g., showing that $f(G*H)=f(G)f(H)$. In this paper, we study graph products in a non-standard form $f(R[G*H])$, where $R$ is a "reduction", a transformation of any graph into an instance of an intended optimization problem. As applications, we resolve several open problems:
1. A tight $n^{1-\epsilon}$-approximation hardness for the minimum consistent deterministic finite automaton (DFA) problem, where $n$ is the sample size. By a result of Board and Pitt [Theoretical Computer Science 1992], this implies the hardness of properly learning DFAs assuming $NP\neq RP$ (the weakest possible assumption).
2. A tight $n^{1/2-\epsilon}$-approximation hardness for the edge-disjoint paths (EDP) problem on directed acyclic graphs (DAGs), where $n$ denotes the number of vertices.
3. A tight hardness of packing vertex-disjoint $k$-cycles for large $k$.
4. An alternative (and perhaps simpler) proof of the hardness of properly learning DNF, CNF, and intersections of halfspaces [Alekhnovich et al., FOCS 2004 and J. Comput. Syst. Sci. 2008].
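To make the standard $f(G*H)$ setting the abstract contrasts against concrete, here is a minimal, self-contained sketch. It is not from the paper; the strong product and the independence number $\alpha$ are just one common instantiation of $*$ and $f$, chosen because $\alpha(G*H)\geq\alpha(G)\alpha(H)$ always holds for the strong product, with equality on small examples like the one below:

```python
from itertools import combinations

def strong_product(G, H):
    """Strong product G*H; graphs are given as {vertex: set_of_neighbors} dicts."""
    V = [(u, v) for u in G for v in H]
    adj = {x: set() for x in V}
    for (u1, v1), (u2, v2) in combinations(V, 2):
        # Two distinct pairs are adjacent iff each coordinate is
        # equal or adjacent (combinations() guarantees distinctness).
        if (u1 == u2 or u2 in G[u1]) and (v1 == v2 or v2 in H[v1]):
            adj[(u1, v1)].add((u2, v2))
            adj[(u2, v2)].add((u1, v1))
    return adj

def independence_number(G):
    """alpha(G) by brute force: size of the largest edgeless vertex subset."""
    verts = list(G)
    for r in range(len(verts), 0, -1):
        for S in combinations(verts, r):
            if all(b not in G[a] for a, b in combinations(S, 2)):
                return r
    return 0

# Example: a path on 3 vertices (alpha = 2) and a single edge (alpha = 1).
P3 = {0: {1}, 1: {0, 2}, 2: {1}}
K2 = {0: {1}, 1: {0}}
print(independence_number(strong_product(P3, K2)))  # 2 = alpha(P3) * alpha(K2)
```

Note that equality $\alpha(G*H)=\alpha(G)\alpha(H)$ can fail for the strong product in general (this gap is what the Shannon capacity measures); the paper's point is to study the different composition $f(R[G*H])$ rather than such product formulas themselves.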