Manuel Sabin

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2020-165

August 14, 2020

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-165.pdf

The nascent field of Fine-Grained Complexity Theory has emerged and grown rapidly in the past decade. By studying ``Hardness within P'' and the connections between problems computable in, say, $n^{2}$ time versus $n^{3}$ time, this field addresses the practical efficiency of problems. However, while this more deeply quantitative approach better addresses practical hardness problem-by-problem, we lose connection to key qualitative claims of classical complexity theory such as the general ability to hide secrets (Cryptography), the ability to show that a world with access to randomness is computationally no more powerful than a deterministic one (Derandomization), and the ability to take a problem that is hard in the worst case and turn it into one that is hard almost always (Hardness Amplification).

We show that these connections can be recovered and that the problem-specific claims of Fine-Grained Complexity Theory do not exist in a vacuum: they have ramifications for classical structural Complexity Theory. Namely, we show that the core worst-case hardness assumptions that define Fine-Grained Complexity Theory yield:

Hardness Amplification: We obtain Fine-Grained problems that are hard on average (Ball et al., STOC '17). This average-case hardness within the Fine-Grained world then serves as our stepping stone to both Cryptographic primitives and Derandomization.
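To give the flavor of that reduction (a minimal sketch in the spirit of Ball et al., not the thesis's full construction; the modulus and dimensions below are illustrative), the count of orthogonal vector pairs can be written as a low-degree polynomial over a prime field. Because low-degree polynomials are randomly self-reducible, evaluating this polynomial on uniformly random field inputs is essentially as hard as the worst case:

import random

P = 2**31 - 1  # an illustrative prime, larger than the number of vector pairs

def fOV(U, V, p=P):
    """Evaluate sum_{i,j} prod_k (1 - u_i[k] * v_j[k]) mod p.

    On 0/1 inputs this counts orthogonal pairs; since it is a polynomial
    of degree 2d, it is equally defined on arbitrary field elements, which
    is what makes evaluation at random points meaningful.
    """
    total = 0
    for u in U:
        for v in V:
            prod = 1
            for uk, vk in zip(u, v):
                prod = prod * (1 - uk * vk) % p
            total = (total + prod) % p
    return total

# Sanity check on Boolean inputs: fOV counts orthogonal pairs exactly.
d = 6
U = [[random.randint(0, 1) for _ in range(d)] for _ in range(20)]
V = [[random.randint(0, 1) for _ in range(d)] for _ in range(20)]
assert fOV(U, V) == sum(all(uk * vk == 0 for uk, vk in zip(u, v))
                        for u in U for v in V)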

Cryptography: We obtain the first Proofs of Work (PoWs) from worst-case complexity assumptions, thus finally placing these fundamental primitives on a rigorous theoretical foundation (Ball et al., CRYPTO '18). We further propose the concept of Fine-Grained Cryptography; this call has since been answered in (LaVigne et al., CRYPTO '20), which makes progress towards achieving Public-Key Fine-Grained Cryptography.
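A key ingredient making such PoWs possible is that the heavy computation can be verified far faster than it can be redone, via a sum-check-style interactive proof. Below is a minimal self-contained sketch of the classical sum-check protocol over $\mathbb{F}_p$ (toy parameters, with an honest prover folded into the verifier loop; the actual protocol of Ball et al. differs in its details): the prover sums over exponentially many points, while the verifier does only a constant amount of work per round plus one evaluation of the polynomial.

import random

P = 2**31 - 1  # prime field modulus (illustrative)

def eval_poly(coeffs, x, p=P):
    """Evaluate a univariate polynomial (coefficients low-order first)."""
    y = 0
    for c in reversed(coeffs):
        y = (y * x + c) % p
    return y

def interpolate(points, p=P):
    """Lagrange interpolation over F_p: coefficients through (x, y) points."""
    coeffs = [0] * len(points)
    for i, (xi, yi) in enumerate(points):
        num, denom = [1], 1
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            new = [0] * (len(num) + 1)   # multiply num by (X - xj)
            for k, a in enumerate(num):
                new[k] = (new[k] - xj * a) % p
                new[k + 1] = (new[k + 1] + a) % p
            num = new
            denom = denom * (xi - xj) % p
        scale = yi * pow(denom, p - 2, p) % p
        for k, a in enumerate(num):
            coeffs[k] = (coeffs[k] + scale * a) % p
    return coeffs

def prover_round(g, m, fixed, deg, p=P):
    """Honest prover: restrict variable i to X, sum over the Boolean suffix."""
    i, pts = len(fixed), []
    for x in range(deg + 1):
        s = 0
        for mask in range(2 ** (m - i - 1)):
            suffix = [(mask >> b) & 1 for b in range(m - i - 1)]
            s = (s + g(fixed + [x] + suffix)) % p
        pts.append((x, s))
    return interpolate(pts, p)

def sumcheck_verify(g, m, deg, claimed, p=P):
    """Check the claim sum_{x in {0,1}^m} g(x) = claimed: per round, compare
    two evaluations of the prover's univariate message, then evaluate g once."""
    current, fixed = claimed % p, []
    for _ in range(m):
        gi = prover_round(g, m, fixed, deg, p)   # message from the prover
        if (eval_poly(gi, 0, p) + eval_poly(gi, 1, p)) % p != current:
            return False
        r = random.randrange(p)                  # verifier's random challenge
        fixed.append(r)
        current = eval_poly(gi, r, p)
    return g(fixed) % p == current

# Toy multilinear polynomial; per-variable degree 1, so 2 points per round.
g = lambda x: (1 + x[0] * x[1] + 2 * x[2]) % P
honest = sum(g([a, b, c]) for a in (0, 1) for b in (0, 1) for c in (0, 1))
assert sumcheck_verify(g, m=3, deg=1, claimed=honest)
assert not sumcheck_verify(g, m=3, deg=1, claimed=honest + 1)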

Derandomization: We construct complexity-theoretic Pseudorandom Generators (PRGs) (Carmosino et al., ICALP '18). This both achieves the best known derandomizations from uniform assumptions and connects problem-centric Fine-Grained Complexity to the resource-centric study of randomness in classical Complexity Theory.
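Such PRGs follow the classical Nisan-Wigderson template: each output bit applies a hard function to a different, mildly overlapping slice of a short seed, with the slices drawn from a combinatorial design; the Fine-Grained contribution is instantiating the hard function with an average-case hard problem as above. The following is a minimal sketch with toy parameters; the stand-in function f is purely illustrative and carries no hardness.

import random

l = 5  # a prime; the seed is viewed as an l-by-l grid of bits, l*l = 25

# Design from quadratics over F_l: for each q(x) = a*x^2 + b*x + c, take the
# l grid cells {(x, q(x))}. Distinct quadratics agree on at most 2 points,
# so these l**3 = 125 sets of size l have pairwise intersections at most 2.
design = [[x * l + (a * x * x + b * x + c) % l for x in range(l)]
          for a in range(l) for b in range(l) for c in range(l)]

def nw_generator(f, seed, design):
    """Stretch len(seed) truly random bits into len(design) output bits by
    applying f to the slice of the seed indexed by each design set."""
    return [f([seed[i] for i in S]) for S in design]

def f(bits):
    """Illustrative stand-in only: a real instantiation needs an average-case
    hard function, e.g. a Fine-Grained hard problem as in the thesis."""
    return (bits[0] * bits[1] + bits[2] * bits[3] + bits[4]) % 2

seed = [random.randint(0, 1) for _ in range(l * l)]
out = nw_generator(f, seed, design)  # 25 seed bits -> 125 output bits
print(len(out), out[:16])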

Advisor: Shafi Goldwasser


BibTeX citation:

@phdthesis{Sabin:EECS-2020-165,
    Author= {Sabin, Manuel},
    Title= {On The Utility of Fine-Grained Complexity Theory},
    School= {EECS Department, University of California, Berkeley},
    Year= {2020},
    Month= {Aug},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-165.html},
    Number= {UCB/EECS-2020-165},
    Abstract= {The nascent field of Fine-Grained Complexity Theory has emerged and grown rapidly in the past decade. By studying ``Hardness within P'' and the connections between problems computable in, say, $n^{2}$ time versus $n^{3}$ time, this field addresses the practical efficiency of problems. However, while this more deeply quantitative approach better addresses practical hardness problem-by-problem, we lose connection to key qualitative claims of classical complexity theory such as the general ability to hide secrets (Cryptography), the ability to show that a world with access to randomness is computationally no more powerful than a deterministic one (Derandomization), and the ability to take a problem that is hard in the worst case and turn it into one that is hard almost always (Hardness Amplification).

We show that these connections can be recovered and that the problem-specific claims of Fine-Grained Complexity Theory do not exist in a vacuum: they have ramifications for classical structural Complexity Theory. Namely, we show that the core worst-case hardness assumptions that define Fine-Grained Complexity Theory yield:

Hardness Amplification: We obtain Fine-Grained problems that are hard on average (Ball et al., STOC '17). This average-case hardness within the Fine-Grained world then serves as our stepping stone to both Cryptographic primitives and Derandomization.

Cryptography: We obtain the first Proofs of Work (PoWs) from worst-case complexity assumptions, thus finally placing these fundamental primitives on a rigorous theoretical foundation (Ball et al., CRYPTO '18). We further propose the concept of Fine-Grained Cryptography; this call has since been answered in (LaVigne et al., CRYPTO '20), which makes progress towards achieving Public-Key Fine-Grained Cryptography.

Derandomization: We construct complexity-theoretic Pseudorandom Generators (PRGs) (Carmosino et al., ICALP '18). This both achieves the best known derandomizations from uniform assumptions and connects problem-centric Fine-Grained Complexity to the resource-centric study of randomness in classical Complexity Theory.},
}

EndNote citation:

%0 Thesis
%A Sabin, Manuel 
%T On The Utility of Fine-Grained Complexity Theory
%I EECS Department, University of California, Berkeley
%D 2020
%8 August 14
%@ UCB/EECS-2020-165
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-165.html
%F Sabin:EECS-2020-165