Edmonds, Cobham circa 1960: Programs that use polynomially bounded resources (time, space) are generally reasonable for moderately sized inputs. Programs that use exponential resources are generally not reasonable for moderately sized inputs.

Definition 1: TIME(T(n)) = { language L : there exists a Turing machine M such that (1) M accepts x iff x is in L, and (2) for all but finitely many x, M(x) halts within T(|x|) steps }

Definition 2: TIME(T(n)) = { language L : there exists a Turing machine M and a constant b such that (1) M accepts x iff x is in L, and (2) for all x, M(x) halts within b*T(|x|) steps }

Homework: Prove that the two definitions are equivalent.

Linear Speedup Theorem: TIME(2n^2) subset TIME(n^2)

Time Hierarchy Theorem: TIME(f(n)) is a strict subset of TIME(g(n)) if f(n) \log f(n) = o(g(n))

Proof for f(n) = n^2 and g(n) = n^3: Let L = { x : machine x does not accept x within |x|^2.5 steps }
* L in TIME(n^3) by simulation
* L not in TIME(n^2) by diagonalization
Note that there are infinitely many encodings of any n^2-time machine among the rows of the diagonalization table, and thus any machine that accepts L must run longer than n^2.5 steps infinitely often.
End Proof

Definition of standard complexity classes: SPACE(T(n)); P = Union_k TIME(n^k); PSPACE = Union_k SPACE(n^k); EXPTIME = Union_k TIME(2^(n^k)); LogSpace = SPACE(log n).

Example: 0^n1^n is in LogSpace.

Theorem: TIME(f(n)) subset SPACE(f(n)) subset TIME(2^(O(f(n))))
Proof: The first inclusion is obvious. For the second, a program that uses space S(n) has only |Sigma|^(O(S(n))) possible configurations, where Sigma is the tape alphabet. Once a deterministic program repeats a configuration it is in an infinite loop, so a halting computation takes at most |Sigma|^(O(S(n))) steps.
End Proof

Corollary: Polynomial space is contained in exponential time. Log space is contained in polynomial time.

Definition: A language L is complete for a complexity class C (with respect to a type of reduction Q) iff
* L is in C, and
* for all X in C, X is reducible to L via a reduction of type Q.
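The earlier LogSpace example can be made concrete. A recognizer for 0^n1^n never needs to copy the input; its only writable state is two counters and a read position, each of which fits in O(log n) bits (a minimal sketch; the function name is mine):

```python
def in_0n1n(s: str) -> bool:
    """Recognize { 0^n 1^n : n >= 0 } using O(log n) work space.

    The input s is treated as read-only; the only writable state
    is two counters and a position index, each O(log n) bits.
    """
    i = 0
    zeros = 0
    while i < len(s) and s[i] == '0':   # count the leading 0s
        zeros += 1
        i += 1
    ones = 0
    while i < len(s) and s[i] == '1':   # count the trailing 1s
        ones += 1
        i += 1
    # accept iff the whole input was consumed and the counts match
    return i == len(s) and zeros == ones
```

The point is that the writable state is a constant number of O(log n)-bit integers, which is exactly the LogSpace budget; the input tape itself is never written.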
So intuitively L is the hardest problem in C, where the degree of refinement in this hardness measure is given by the coarseness of Q.

Example of how completeness is used: Say you want to prove a theorem of the form: if the exponential-time problem X has a poly-space algorithm, then all exponential-time problems have poly-space algorithms.
Question: What sort of completeness would you need for X?
Answer: Polynomial-space completeness, since a poly-space algorithm that calls another poly-space algorithm still runs in poly space.

Question: Does PSPACE have a complete problem L?
Wrong Answer: Yes, L = { (M, k, I) : M accepts I in |I|^k space }. This is wrong because it is not clear that there is a fixed l such that this language can be accepted in n^l space: the simulation needs |I|^k space, and k is part of the input, so no single polynomial in the input length bounds it.
Answer: Yes, L = { (M, I, 1^(|I|^k)) : M accepts I in |I|^k space }
Proof: L is in PSPACE. For the reduction from a PSPACE problem accepted by a polynomial-space machine M that uses space n^k:
Program:
  Read I
  Call the program for L with input (M, I, 1^(|I|^k))
Question: What is the l such that this padded language can be accepted in n^l space?

Theorem: The Circuit Value Problem is log-space complete for P.
Proof sketch: Consider a particular polynomial-time Turing machine and its computation history. Show how to simulate the computation with a circuit.

Theorem: Quantified Boolean Formula (QBF) is PSPACE-complete.
Proof: QBF is in PSPACE. To show that every L in PSPACE is reducible to QBF:
Try 1: THEREEXISTS x such that x is a computation history showing that M accepts I using less than |I|^k space.
Problem: x is exponentially large.
Try 2: Phi_i(A, B) = configuration B of M is reachable from configuration A of M in 2^i steps.
Question: How do we recursively define Phi_i? There is a formula for Phi_0(A, B), which is reachability in one step.
Question: What is wrong with Phi_{i+1}(A, B) = THEREEXISTS C Phi_i(A, C) and Phi_i(C, B)?
Answer: The formula grows exponentially.
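The blowup can be checked by writing out the size recurrence: each Phi_{i+1} embeds two copies of Phi_i, so size(i+1) = 2*size(i) + c. A small sketch (the base size and overhead constants are illustrative, not derived from any particular machine):

```python
def naive_phi_size(i, base=10, overhead=5):
    """Written-out size of the naive Phi_i.

    Each level embeds TWO copies of the previous formula, so
    size(i+1) = 2 * size(i) + overhead, which is Theta(2^i).
    """
    size = base                      # size of Phi_0 (illustrative)
    for _ in range(i):
        size = 2 * size + overhead   # two copies plus connectives
    return size
```

To cover a computation of 2^(n^k) steps we need i = n^k levels, and then the written-out formula has size about 2^(n^k), exponential in |I|.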
Since each level doubles the formula, the written-out Phi_i may be as large as the time used by M, which may be exponential in |I|.
Try 3: Phi_{i+1}(A, B) = THEREEXISTS C FORALL X, Y ((X=A and Y=C) or (X=C and Y=B)) implies Phi_i(X, Y)
Now only one copy of Phi_i appears per level, so the formula grows polynomially.
Fact: Determining whether one player has a winning strategy in natural generalizations of common two-person games is often PSPACE-complete.
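Try 3 is the midpoint trick behind Savitch's theorem: quantify over a midpoint C, then fold the two recursive checks into a single FORALL so that only one copy of Phi_i is needed per level. The same idea can be run as an algorithm on an explicit graph instead of machine configurations (the function name and toy graph are my own illustrations):

```python
def reach(adj, a, b, i):
    """Is b reachable from a in at most 2**i steps?

    Mirrors Phi_i: for i > 0, guess a midpoint c and check both
    halves with the SAME one-level-smaller procedure, so the
    recursion depth (and hence the space used) is O(i), even
    though the number of steps covered is 2**i.
    """
    if i == 0:                       # Phi_0: zero steps or one edge
        return a == b or b in adj.get(a, ())
    return any(reach(adj, a, c, i - 1) and reach(adj, c, b, i - 1)
               for c in adj)         # c ranges over all vertices

# Toy example: a path 1 -> 2 -> 3 -> 4
g = {1: [2], 2: [3], 3: [4], 4: []}
```

On configurations of a space-S(n) machine, i = O(S(n)) levels suffice and each stack frame stores O(S(n)) bits, which is where the O(S(n)^2) space bound of Savitch's theorem comes from.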