Using Hardness vs Randomness to Design Low-Space Algorithms

Edward Pyne and Roei Tell

The Computational Complexity Column by Michal Koucký

Abstract


Can we use “hardness vs randomness” techniques to design low-space algorithms? This
text surveys a sequence of recent works showing ways to do that. These works designed
algorithms for certified derandomization and for catalytic computation (both of which work unconditionally),
derandomization and isolation algorithms from remarkably mild assumptions,
and “win-win” pairs of algorithms that, for every input, solve either derandomization or
another important problem (e.g., s-t connectivity) on that input.
Underlying these constructions are new, specialized “hardness vs randomness” tools
for the setting of low-space algorithms. We describe these technical tools, most notably
constructions of pseudorandom generators whose reconstruction algorithm (i.e., the security
reduction) is a deterministic low-space algorithm. We also explain a key ingredient in obtaining
deterministic reconstruction: deterministic transformations of distinguishers into
bit-predictors.
We pose a host of open questions that explore new ways of using hardness vs randomness
to design low-space algorithms. These questions address problems in derandomization,
catalytic computation, explicit constructions, learning algorithms, and more.
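
To make the last technical point concrete, below is a minimal illustrative sketch (not taken from the surveyed works) of the classical randomized transformation of a distinguisher into a next-bit predictor, via the standard hybrid argument; the surveyed constructions need a deterministic, low-space analogue of exactly this step. The toy generator, toy distinguisher, and parameters are hypothetical, and all probabilities are computed exactly by brute-force enumeration.

```python
# Illustrative sketch only: the *classical randomized* transformation of a
# distinguisher into a next-bit predictor (hybrid argument), evaluated
# exactly by brute force on a toy example. The surveyed works need a
# deterministic, low-space analogue of this step; nothing below is taken
# from them. The generator, distinguisher, and parameters are hypothetical.

from itertools import product

k, n = 2, 4  # seed length and output length of a toy generator


def toy_generator(seed):
    """Hypothetical toy PRG: stretch k seed bits to n output bits."""
    out = list(seed)
    while len(out) < n:
        out.append(out[-1] ^ out[-2] ^ 1)  # arbitrary stretching rule
    return tuple(out[:n])


def toy_distinguisher(x):
    """Hypothetical toy distinguisher: accept iff the last bit equals the first."""
    return 1 if x[-1] == x[0] else 0


def accept_prob_hybrid(i):
    """Pr[D(H_i) = 1], where H_i takes its first i bits from the generator
    (on a uniform seed) and its remaining n - i bits uniformly at random."""
    total = accepted = 0
    for seed in product((0, 1), repeat=k):
        prefix = toy_generator(seed)[:i]
        for suffix in product((0, 1), repeat=n - i):
            total += 1
            accepted += toy_distinguisher(prefix + suffix)
    return accepted / total


# Hybrid argument: H_0 is uniform and H_n is the generator's output, so some
# adjacent pair of hybrids differs in acceptance probability by >= eps / n.
p = [accept_prob_hybrid(i) for i in range(n + 1)]
eps = abs(p[n] - p[0])
i = max(range(1, n + 1), key=lambda j: abs(p[j] - p[j - 1]))
gap = p[i] - p[i - 1]


def predictor_success_prob(i, sign):
    """Exact success probability of the standard *randomized* predictor for
    bit i: guess a uniformly random bit b and a random suffix, run the
    distinguisher, and output b if it answers `sign` (else output 1 - b)."""
    total = correct = 0
    for seed in product((0, 1), repeat=k):
        x = toy_generator(seed)
        prefix = x[:i - 1]
        for b in (0, 1):
            for suffix in product((0, 1), repeat=n - i):
                total += 1
                guess = b if toy_distinguisher(prefix + (b,) + suffix) == sign else 1 - b
                correct += (guess == x[i - 1])
    return correct / total


sign = 1 if gap > 0 else 0
success = predictor_success_prob(i, sign)
print("hybrid acceptance probabilities:", p)
print(f"distinguishing advantage eps = {eps:.3f}, chosen index i = {i}")
print(f"predictor success = {success:.3f}  (hybrid argument guarantees >= {0.5 + eps / n:.3f})")
```

The predictor above is inherently randomized: it guesses a random bit and fills in a random suffix before consulting the distinguisher. Replacing these random choices by deterministic, low-space computable ones is the kind of obstacle the surveyed reconstruction procedures are designed to overcome.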
