Turing Machines and Computability

Introduction

Turing machines represent the theoretical limit of what can be computed mechanically. Proposed by Alan Turing in 1936, they formalize the intuitive notion of an algorithm and provide a mathematical framework for studying the fundamental limits of computation. This reading explores Turing machines, computability theory, undecidability, and computational complexity.

Learning Objectives

By the end of this reading, you will be able to:

  • Define and construct Turing machines
  • Understand variants of Turing machines and their equivalence
  • Comprehend the Church-Turing thesis and its implications
  • Distinguish between decidable and undecidable problems
  • Understand the Halting Problem and its proof
  • Recognize reductions as a technique for proving undecidability
  • Understand complexity classes P and NP
  • Comprehend NP-completeness and its significance
  • Apply these concepts to analyze computational problems

Turing Machines: Formal Definition

The Model

A Turing machine (TM) is a theoretical computing device with:

  • An infinite tape divided into cells
  • A read/write head that can move left or right
  • A finite set of states
  • A transition function that determines behavior

Unlike finite automata (read-only, finite input) and pushdown automata (stack-based memory), Turing machines have unlimited memory and can both read and write.

Formal Definition

A Turing machine is a 7-tuple M = (Q, Σ, Γ, δ, q₀, q_accept, q_reject) where:

  1. Q is a finite set of states
  2. Σ is the input alphabet (not containing the blank symbol ␣)
  3. Γ is the tape alphabet, where ␣ ∈ Γ and Σ ⊆ Γ
  4. δ: Q × Γ → Q × Γ × {L, R} is the transition function
  5. q₀ ∈ Q is the start state
  6. q_accept ∈ Q is the accept state
  7. q_reject ∈ Q is the reject state (q_reject ≠ q_accept)

Transition function: δ(q, a) = (p, b, D) means:

  • In state q, reading symbol a
  • Write symbol b, move in direction D (L for left, R for right)
  • Enter state p

Computation

Configuration: A snapshot of the TM's state, tape contents, and head position.

Notation: uqv represents:

  • State: q
  • Tape contents: uv (with u to the left of the head, v starting at the head)
  • Head position: at the first symbol of v

Computation: A sequence of configurations:

C₀ ⊢ C₁ ⊢ C₂ ⊢ ... ⊢ Cₙ

Acceptance: M accepts input w if there is a sequence of configurations:

q₀w ⊢* u q_accept v

for some strings u, v ∈ Γ*, i.e., the computation reaches the accept state.

Rejection: M rejects w if it halts in q_reject.

Loops: M loops on w if the computation never halts.

Languages Recognized by Turing Machines

The language recognized by M is:

L(M) = {w : M accepts w}

A language is Turing-recognizable (or recursively enumerable) if some TM recognizes it.

A language is decidable (or recursive) if some TM recognizes it and halts on all inputs.

Decidable vs. Recognizable:

  • Decidable: Always halts (accepts or rejects)
  • Recognizable: Halts on accepted strings, may loop on rejected strings

Example 1: TM for L = {0ⁿ1ⁿ : n ≥ 0}

Strategy:

  1. Sweep left to right, crossing off one 0 and one 1
  2. Repeat until all symbols are crossed off
  3. Accept if successful, reject otherwise

Informal algorithm:

1. Scan right, cross off leftmost 0, remember it
2. Continue right, cross off leftmost 1
3. Move head back to the left end
4. Repeat steps 1-3
5. If no 0s remain, check if no 1s remain
   - If yes: accept
   - If no: reject

Formal specification:

  • Q = {q₀, q₁, q₂, q₃, q₄, q_accept, q_reject}
  • Σ = {0, 1}
  • Γ = {0, 1, x, ␣}
  • Transitions (partial):
State  Read  →  Write  Move  New State
q₀     0         x      R     q₁          // Cross off 0
q₀     x         x      R     q₀          // Skip crossed 0s
q₀     ␣         ␣      R     q_accept    // All symbols matched; accept
q₁     0         0      R     q₁          // Search for 1
q₁     1         x      L     q₂          // Found 1, cross off
q₁     x         x      R     q₁          // Skip crossed 1s
q₁     ␣         ␣      R     q_reject    // No 1 to match
q₂     x         x      L     q₂          // Go left
q₂     0         0      L     q₂          // Continue left
q₂     ␣         ␣      R     q₀          // Back at start
...
(Any (state, symbol) pair not listed leads to q_reject.)

Trace for "0011":

q₀0011␣     // Initial configuration
xq₁011␣     // Cross off first 0
x0q₁11␣     // Search right for a 1
xq₂0x1␣     // Cross off first 1, move left
q₂x0x1␣     // Move left
␣q₂x0x1␣    // At left end (conceptually)
␣q₀x0x1␣    // Back to start, move right
␣xq₀0x1␣    // Skip crossed symbol
␣xxq₁x1␣    // Cross off second 0
...
␣xxxx␣q_accept␣   // All matched, accept
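The table above can be checked mechanically. The following Python sketch simulates this machine (the dictionary encoding, the `_` blank marker, and the convention that a missing (state, symbol) entry means reject are our own choices, not part of the formal definition):

```python
BLANK = "_"

delta = {
    ("q0", "0"): ("q1", "x", +1),          # cross off a 0
    ("q0", "x"): ("q0", "x", +1),          # skip crossed symbols
    ("q0", BLANK): ("accept", BLANK, +1),  # nothing left to match
    ("q1", "0"): ("q1", "0", +1),          # search right for a 1
    ("q1", "x"): ("q1", "x", +1),
    ("q1", "1"): ("q2", "x", -1),          # cross off the matching 1
    ("q1", BLANK): ("reject", BLANK, +1),  # no 1 to match
    ("q2", "x"): ("q2", "x", -1),          # return to the left end
    ("q2", "0"): ("q2", "0", -1),
    ("q2", BLANK): ("q0", BLANK, +1),      # left end reached; start over
}

def run(w, max_steps=10_000):
    tape = {i: c for i, c in enumerate(w)}  # dict tape: blank everywhere else
    state, head = "q0", 0
    for _ in range(max_steps):
        if state in ("accept", "reject"):
            return state == "accept"
        key = (state, tape.get(head, BLANK))
        if key not in delta:                # undefined transition: reject
            return False
        state, tape[head], move = delta[key]
        head += move
    raise RuntimeError("machine did not halt within the step bound")

for w in ["", "01", "0011", "011", "10"]:
    print(repr(w), run(w))
```

On these samples the simulator accepts exactly the strings of the form 0ⁿ1ⁿ, matching the intended language.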

Example 2: TM for {w#w : w ∈ {0,1}*}

This language consists of strings with a # separator where both halves are identical.

Strategy:

  1. Mark the first symbol on left side
  2. Find the corresponding position on right side
  3. Check if symbols match
  4. Repeat for all symbols

Challenges:

  • Need to "remember" which symbol we're matching
  • Need to track position on both sides
  • Use states to encode remembered information

States:

  • q₀: start
  • q_{0,i}, q_{1,i}: remembered 0 or 1, moving right to position i
  • q_{check}: verifying match
  • q_{return}: moving back left

This requires careful state design and is more complex than previous examples.
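Although the full state table is lengthy, the zig-zag strategy itself is short. Here is a Python sketch of the marking discipline (operating on a list rather than a real tape; the function name and the use of list indices in place of head movement are our own):

```python
def matches_w_hash_w(s):
    """Zig-zag matching: cross off one symbol on each side per pass."""
    tape = list(s)
    if tape.count("#") != 1:
        return False
    sep = tape.index("#")
    i, j = 0, sep + 1                  # left cursor, right cursor
    while i < sep:
        if j >= len(tape) or tape[i] != tape[j]:
            return False               # mismatch, or right side too short
        tape[i] = tape[j] = "x"        # cross off the matched pair
        i, j = i + 1, j + 1
    return j == len(tape)              # right side must also be exhausted

print(matches_w_hash_w("01#01"), matches_w_hash_w("01#10"))
```

A real TM remembers the just-read symbol in its state while walking to the other side of the #; the two cursors here play the role of that remembered position.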

Example 3: TM as a Function Computer

TM computing f(n) = 2n (on unary representation):

Input: 1ⁿ (n ones)
Output: 1²ⁿ (2n ones)

Strategy:

  1. For each 1 in input, write two 1s in output area
  2. Use tape as: [input area][separator][output area]

Transitions:

q₀: Move right to end of input, remember length
q₁: For each input 1, go to output area and write two 1s
q₂: Return to input area
...
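The same strategy can be sketched in Python, with a list standing in for the tape and a '#' separator (our own choice) between the input and output areas:

```python
def double_unary(w):
    tape = list(w) + ["#"]             # [input area][separator][output area]
    for i in range(len(w)):            # for each input 1 ...
        tape[i] = "x"                  # ... cross it off ...
        tape += ["1", "1"]             # ... and append two 1s to the output
    return "".join(tape[tape.index("#") + 1:])

print(double_unary("111"))             # → 111111
```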

Variants of Turing Machines

Many variants of the Turing machine exist, but they all have the same computational power.

Multitape Turing Machine

Definition: k tapes, each with its own read/write head.

Transition: δ(q, a₁, ..., aₖ) = (p, b₁, ..., bₖ, D₁, ..., Dₖ)

  • Read aᵢ from tape i
  • Write bᵢ to tape i
  • Move head i in direction Dᵢ

Advantage: More natural for some algorithms, and often asymptotically faster than a single-tape machine.

Theorem: Every multitape TM has an equivalent single-tape TM.

Proof sketch:

  • Use single tape to store contents of all k tapes, separated by special symbols
  • Use special markers to indicate head positions
  • Simulate each step of k-tape TM by making multiple passes on single tape

Time complexity: If k-tape TM runs in time T(n), single-tape simulation runs in time O(T(n)²).

Nondeterministic Turing Machine

Definition: Transition function is:

δ: Q × Γ → P(Q × Γ × {L, R})

Multiple possible transitions at each step.

Acceptance: Accepts if any computation path accepts.

Theorem: Every nondeterministic TM has an equivalent deterministic TM.

Proof idea: Use breadth-first search to explore all computation paths.

Complexity: If NTM runs in time T(n), DTM simulation may take exponential time 2^O(T(n)).
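The breadth-first idea can be sketched generically: configurations become nodes, nondeterministic choices become children, and the deterministic simulator explores level by level. The `successors`/`is_accepting` callbacks and the subset-sum-flavored toy example below are illustrative stand-ins, not an actual NTM encoding:

```python
from collections import deque

def nd_accepts(start, successors, is_accepting, max_depth=50):
    """Deterministic BFS over a computation tree: accept iff some path of
    length <= max_depth reaches an accepting configuration."""
    frontier = deque([(start, 0)])
    while frontier:
        config, depth = frontier.popleft()
        if is_accepting(config):
            return True
        if depth < max_depth:
            frontier.extend((nxt, depth + 1) for nxt in successors(config))
    return False

# Toy use: "guess" a subset of S summing to t, one nondeterministic
# choice (take / skip) per element -- 2^n paths, explored breadth-first.
S, t = [3, 5, 7], 12
succ = lambda c: [] if c[0] == len(S) else [(c[0] + 1, c[1]),
                                            (c[0] + 1, c[1] + S[c[0]])]
accepting = lambda c: c[0] == len(S) and c[1] == t
print(nd_accepts((0, 0), succ, accepting))       # 5 + 7 = 12
```

The exponential blow-up is visible here: a depth-d tree with branching factor b forces the deterministic search to visit up to b^d configurations.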

Enumerator

An enumerator is a TM with a printer. It outputs strings, one at a time.

Language enumerated: E = {w : enumerator eventually prints w}

Theorem: A language is Turing-recognizable iff some enumerator enumerates it.

Proof:

  • (⇒) Given TM M recognizing L, build an enumerator that runs M on s₁, s₂, s₃, ... in dovetailing fashion (in round k, run M for k steps on each of the first k strings) and prints every string that M accepts
  • (⇐) Given enumerator E, build TM that simulates E until it prints the input string
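The dovetailing in the (⇒) direction can be sketched as follows. `accepts_within(w, k)` stands in for "M accepts w within k steps" (faked here with a toy decidable language, since a real TM simulator is beside the point); because round k gives k steps to each of the first k strings, a looping input never blocks the others:

```python
from itertools import islice, product

def all_strings(alphabet="01"):
    """String order s1, s2, s3, ...: by length, then lexicographically."""
    yield ""
    n = 1
    while True:
        for tup in product(alphabet, repeat=n):
            yield "".join(tup)
        n += 1

def enumerate_language(accepts_within, rounds):
    printed = []
    pool = list(islice(all_strings(), rounds))
    for k in range(1, rounds + 1):
        for w in pool[:k]:                   # first k strings, k steps each
            if w not in printed and accepts_within(w, k):
                printed.append(w)
    return printed

# Toy stand-in for "M accepts w within k steps":
toy = lambda w, k: w.count("0") == w.count("1") and len(w) <= k
print(enumerate_language(toy, rounds=20))
```

A true enumerator runs forever (`rounds` here is only a demo cutoff); every accepted string is printed in some finite round.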

Two-Way Infinite Tape

Modification: Tape extends infinitely in both directions.

Theorem: Equivalent to standard TM.

Proof: Use two tracks on standard tape to simulate left and right sides.

Multiple Heads

Modification: Multiple read/write heads on same tape.

Theorem: Equivalent to standard TM.

Proof: Similar to multitape simulation.

Two-Dimensional Tape

Modification: Tape is a 2D grid.

Theorem: Equivalent to standard TM.

Proof: Encode 2D position as 1D position using pairing function.
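The pairing function mentioned in the proof can be made concrete; the Cantor pairing function is one standard choice (a sketch, not the only encoding that works):

```python
def pair(i, j):
    """Cantor pairing: a bijection from pairs of naturals to naturals."""
    return (i + j) * (i + j + 1) // 2 + j

def unpair(z):
    """Inverse of pair: w is the index of the diagonal i + j."""
    w = int(((8 * z + 1) ** 0.5 - 1) // 2)
    t = w * (w + 1) // 2
    j = z - t
    return w - j, j
```

A 2D tape cell (i, j) is then simulated at 1D cell pair(i, j), and head moves on the grid become computable jumps on the line.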

The Church-Turing Thesis

Informal Computation

An algorithm is an informal notion: a finite, mechanical procedure for solving a problem.

Examples of informal computational models:

  • Lambda calculus (Church, 1936)
  • Recursive functions (Gödel, Kleene)
  • Register machines
  • Random access machines (RAM)
  • Modern programming languages

The Thesis

Church-Turing Thesis: The intuitive notion of an algorithm corresponds exactly to Turing machine computation.

Formally: Any function computable by any reasonable mechanical device can be computed by a Turing machine.

Not a theorem: Cannot be proven because "algorithm" is informal.

Evidence:

  1. All proposed computational models are equivalent to TMs
  2. No counter-examples in 80+ years
  3. Mathematical elegance and simplicity

Implications

  1. Universality: There exists a universal Turing machine U that can simulate any TM M on any input w

    • Input to U: <M, w> (encoding of M and w)
    • Output: Same as M(w)
  2. Limits of computation: Some problems cannot be solved by any algorithm (not just current technology)

  3. Programming languages: All Turing-complete languages have same computational power

  4. Complexity theory: We can study inherent difficulty of problems

Decidability

Decidable Languages

A language L is decidable if there exists a TM M that:

  1. Accepts all w ∈ L
  2. Rejects all w ∉ L
  3. Halts on all inputs

Example decidable languages:

  • A_DFA = {<D, w> : D is a DFA that accepts w}
  • A_NFA = {<N, w> : N is an NFA that accepts w}
  • A_REX = {<R, w> : R is a regex that matches w}
  • E_DFA = {<D> : D is a DFA and L(D) = ∅}
  • EQ_DFA = {<D₁, D₂> : D₁ and D₂ are DFAs and L(D₁) = L(D₂)}

Proofs: Construct TMs that simulate the relevant automata.

Example: A_DFA is Decidable

Theorem: A_DFA = {<D, w> : D is a DFA that accepts w} is decidable.

Proof: Construct TM M:

M = "On input <D, w>:
1. Simulate D on w
2. If D accepts w, accept
3. If D rejects w, reject"

M always halts, so A_DFA is decidable. ∎
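The decider M is nothing more than DFA simulation, which is easy to realize in code (the dictionary encoding of D below is our own convention):

```python
def dfa_accepts(dfa, w):
    """Decider for A_DFA: simulate D on w; always halts after |w| steps."""
    state = dfa["start"]
    for ch in w:
        state = dfa["delta"][(state, ch)]
    return state in dfa["accept"]

# Example DFA (ours): binary strings with an even number of 1s.
even_ones = {
    "start": "e",
    "accept": {"e"},
    "delta": {("e", "0"): "e", ("e", "1"): "o",
              ("o", "0"): "o", ("o", "1"): "e"},
}
print(dfa_accepts(even_ones, "1010"))
```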

Example: E_DFA is Decidable

Theorem: E_DFA = {<D> : D is a DFA and L(D) = ∅} is decidable.

Proof: Construct TM M:

M = "On input <D>:
1. Mark the start state of D
2. Repeat until no new states marked:
   - Mark any state reachable from a marked state
3. If any marked state is an accept state:
   - Reject (language is non-empty)
4. If no marked state is an accept state:
   - Accept (language is empty)"

This is graph reachability, which always terminates. ∎
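The marking loop is ordinary graph reachability, sketched below (the dictionary encoding of the DFA is our own convention):

```python
from collections import deque

def dfa_language_empty(dfa):
    """Decider for E_DFA: mark states reachable from the start state,
    then accept iff no accept state was marked."""
    marked = {dfa["start"]}
    frontier = deque(marked)
    while frontier:
        q = frontier.popleft()
        for (state, _sym), nxt in dfa["delta"].items():
            if state == q and nxt not in marked:
                marked.add(nxt)
                frontier.append(nxt)
    return not (marked & set(dfa["accept"]))

# Example (ours): the accept state "dead" is unreachable, so L(D) = ∅.
empty_dfa = {
    "start": "a",
    "accept": {"dead"},
    "delta": {("a", "0"): "a", ("a", "1"): "a",
              ("dead", "0"): "dead", ("dead", "1"): "dead"},
}
print(dfa_language_empty(empty_dfa))
```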

Decidability for CFGs

Theorem: A_CFG = {<G, w> : G is a CFG that generates w} is decidable.

Proof: Use CYK algorithm (O(n³)) or exhaustive search with bound.

Theorem: E_CFG = {<G> : G is a CFG and L(G) = ∅} is decidable.

Proof: Check if start variable can derive any terminal string.
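That check can be implemented as a marking algorithm over "generating" symbols (the encoding below is our own: a dict from nonterminals to lists of bodies, with any symbol that is not a left-hand side treated as a terminal):

```python
def cfg_language_empty(rules, start):
    """Decider for E_CFG: a symbol is 'generating' if it derives some
    terminal string; L(G) is empty iff the start symbol never becomes one."""
    generating = set()
    changed = True
    while changed:
        changed = False
        for A, bodies in rules.items():
            if A in generating:
                continue
            for body in bodies:
                # every symbol in the body is a terminal or already generating
                if all(sym in generating or sym not in rules for sym in body):
                    generating.add(A)
                    changed = True
                    break
    return start not in generating

print(cfg_language_empty({"S": [("a", "S", "b"), ("a", "b")]}, "S"))
print(cfg_language_empty({"S": [("S", "S")]}, "S"))
```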

Contrast: EQ_CFG = {<G₁, G₂> : L(G₁) = L(G₂)} is undecidable.

The Halting Problem

Statement of the Problem

Halting Problem: Given a TM M and input w, does M halt on w?

Formally: HALT = {<M, w> : M is a TM that halts on input w}

Theorem: HALT is undecidable.

This is the most famous undecidable problem and demonstrates fundamental limits of computation.

Proof of Undecidability

We prove a related problem first:

Theorem: A_TM = {<M, w> : M is a TM that accepts w} is undecidable.

Proof (by diagonalization):

Assume A_TM is decidable. Then there exists a TM H that decides A_TM:

H(<M, w>) = accept if M accepts w
          = reject if M does not accept w

Construct a new TM D:

D = "On input <M>:
1. Run H on input <M, <M>>
2. If H accepts, reject
3. If H rejects, accept"

Now consider what happens when we run D on its own encoding <D>:

Case 1: D accepts <D>

  • Then H rejects <D, <D>>
  • So D does not accept <D>
  • Contradiction!

Case 2: D rejects <D>

  • Then H accepts <D, <D>>
  • So D accepts <D>
  • Contradiction!

Both cases lead to contradiction, so our assumption that H exists must be false.

Therefore, A_TM is undecidable. ∎

Corollary: HALT is undecidable.

Proof: If HALT were decidable, we could decide A_TM by:

  1. First check if M halts on w using HALT decider
  2. If yes, simulate M on w and see if it accepts
  3. If no, reject

This would decide A_TM, contradicting the theorem. ∎

Implications

  1. No universal debugger: Cannot build a program that detects infinite loops in all programs

  2. No perfect virus detector: Cannot build a program that detects all viruses

  3. Limits of verification: Cannot automatically verify all program properties

  4. Incompleteness: Related to Gödel's incompleteness theorems

The Acceptance Problem

Definition: A_TM = {<M, w> : M is a TM that accepts w}

Theorem: A_TM is Turing-recognizable but not decidable.

Proof that A_TM is recognizable: Construct TM U (universal TM):

U = "On input <M, w>:
1. Simulate M on w
2. If M accepts, accept
3. If M rejects, reject"

U recognizes A_TM, but may loop if M loops on w.

Contrast with decidability: No TM can decide A_TM (must halt on all inputs).

Reductions and Undecidability

Mapping Reductions

A mapping reduction from language A to language B is a computable function f: Σ* → Σ* such that:

w ∈ A ⟺ f(w) ∈ B

Notation: A ≤_m B (A reduces to B)

Intuition: If we can decide B, we can decide A by:

  1. Compute f(w)
  2. Check if f(w) ∈ B

Using Reductions to Prove Undecidability

Theorem: If A ≤_m B and A is undecidable, then B is undecidable.

Proof (by contradiction):

  • Assume B is decidable with decider M_B
  • Construct decider for A:
    M_A = "On input w:
    1. Compute f(w)
    2. Run M_B on f(w)
    3. Output what M_B outputs"
    
  • This decides A, contradicting undecidability of A
  • Therefore B must be undecidable ∎
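The recipe above is just function composition. A toy instance (languages and names of our own choosing): let A be the binary strings with equally many 0s and 1s, B the decimal strings denoting zero, and f(w) = str(#0s(w) − #1s(w)):

```python
def f(w):
    """Computable map with: w in A  <=>  f(w) in B."""
    return str(w.count("0") - w.count("1"))

def decide_B(x):
    return int(x) == 0

def decide_A(w):
    """Decider for A obtained from the reduction: compute f, then ask B."""
    return decide_B(f(w))

print(decide_A("0101"), decide_A("011"))
```

In undecidability proofs the same composition runs in reverse: a decider for B would yield the (impossible) decider for A.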

Example: E_TM is Undecidable

Problem: E_TM = {<M> : L(M) = ∅}

Theorem: E_TM is undecidable.

Proof: Reduce A_TM to E_TM.

Given <M, w>, construct TM M':

M' = "On input x:
1. If x ≠ w, reject
2. If x = w, run M on w and accept if M accepts"

Observe:

  • If M accepts w, then L(M') = {w} ≠ ∅
  • If M doesn't accept w, then L(M') = ∅

Therefore:

<M, w> ∈ A_TM ⟺ <M'> ∉ E_TM ⟺ <M'> ∈ E̅_TM

This maps A_TM into the complement of E_TM, so A_TM ≤_m E̅_TM and E̅_TM is undecidable. Since a language is decidable if and only if its complement is decidable, E_TM is undecidable as well. ∎

Example: REGULAR_TM is Undecidable

Problem: REGULAR_TM = {<M> : L(M) is regular}

Theorem: REGULAR_TM is undecidable.

Proof: Reduce A_TM to REGULAR_TM.

Given <M, w>, construct M':

M' = "On input x:
1. If x has form 0ⁿ1ⁿ, accept
2. Otherwise, run M on w
   - If M accepts w, accept
   - If M rejects w, reject"

Observe:

  • If M accepts w: L(M') = Σ* (which is regular)
  • If M doesn't accept w: L(M') = {0ⁿ1ⁿ : n ≥ 0} (which is not regular)

Therefore:

<M, w> ∈ A_TM ⟺ <M'> ∈ REGULAR_TM

So A_TM ≤_m REGULAR_TM, proving REGULAR_TM is undecidable. ∎

Rice's Theorem

Theorem (Rice's Theorem): Any non-trivial property of the language recognized by a Turing machine is undecidable.

Formally: Let P be a property of Turing-recognizable languages. If:

  1. P is non-trivial: some Turing-recognizable languages have property P and some don't
  2. P depends only on the language, not the TM description

Then {<M> : L(M) has property P} is undecidable.

Examples of undecidable properties:

  • Is L(M) empty?
  • Is L(M) finite?
  • Is L(M) regular?
  • Is L(M) context-free?
  • Does L(M) contain string w?
  • Is L(M₁) = L(M₂)? (a property of two machines; undecidable by reduction rather than by Rice's Theorem directly)
  • Is L(M₁) ⊆ L(M₂)? (likewise)

Exception: Trivial properties (true of all recognizable languages, or of none, e.g. "Is L(M) Turing-recognizable?") are decidable, as are properties of the machine's description rather than its language (e.g. "Does M have fewer than 100 states?").

Computational Complexity

Time Complexity

Definition: The time complexity of TM M is the function T: ℕ → ℕ where T(n) is the maximum number of steps M takes on any input of length n.

Big-O notation: M runs in time O(f(n)) if there exist c, n₀ such that for all n ≥ n₀: T(n) ≤ c·f(n).

Example: Checking if string is a palindrome can be done in O(n²) on single-tape TM, O(n) on multitape TM.

The Class P

Definition: P is the class of languages decidable in polynomial time on a deterministic TM.

Formally:

P = ⋃_{k≥0} TIME(nᵏ)

where TIME(f(n)) = {L : L is decidable by O(f(n)) time TM}

Examples of problems in P:

  • PATH = {<G, s, t> : G has a path from s to t} [O(|V| + |E|)]
  • PRIMES = {n : n is prime} [O((log n)⁶) via AKS primality test]
  • LINEAR_PROGRAMMING [polynomial via ellipsoid/interior-point]

Significance:

  • Robust across computational models
  • Polynomial time on single-tape TM ⟺ polynomial time on multitape TM
  • Polynomial time on TM ⟺ polynomial time on RAM model
  • Generally considered "efficiently solvable"

Nondeterministic Time Complexity

Definition: NTIME(f(n)) is the class of languages decidable by O(f(n)) time nondeterministic TM.

Acceptance: NTM accepts if any computation path accepts.

The Class NP

Definition: NP is the class of languages decidable in polynomial time on a nondeterministic TM.

Formally:

NP = ⋃_{k≥0} NTIME(nᵏ)

Alternative definition (verifier-based):

L ∈ NP if there exists a polynomial-time TM V (verifier) such that:

w ∈ L ⟺ ∃ certificate c (|c| ≤ poly(|w|)) such that V accepts <w, c>

Examples of problems in NP:

  • SAT = {<φ> : φ is a satisfiable Boolean formula}
  • HAMPATH = {<G, s, t> : G has a Hamiltonian path from s to t}
  • CLIQUE = {<G, k> : G has a clique of size k}
  • SUBSET-SUM = {<S, t> : S has a subset summing to t}
  • COMPOSITES = {n : n is composite}
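The verifier view is easy to make concrete for SAT. Below, a CNF formula is a list of clauses of nonzero integers, negative meaning negated (a DIMACS-style convention we adopt here); the certificate is a truth assignment, and checking it takes time linear in the formula:

```python
def verify_sat(clauses, assignment):
    """Polynomial-time verifier: does `assignment` satisfy every clause?"""
    def lit_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value
    return all(any(lit_true(lit) for lit in clause) for clause in clauses)

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
phi = [[1, 2], [-1, 3], [-2, -3]]
print(verify_sat(phi, {1: True, 2: False, 3: True}))
print(verify_sat(phi, {1: False, 2: False, 3: False}))
```

Verifying a given certificate is fast; the hard part, absent a P = NP proof, is finding one among the 2ⁿ candidates.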

Relationship: P ⊆ NP

Proof: Any polynomial-time DTM is also a polynomial-time NTM (with only one computation path).

The P vs. NP Problem

Question: Is P = NP?

Significance:

  • Most important open problem in computer science
  • Clay Mathematics Institute Millennium Prize ($1,000,000)
  • Open since 1971

Implications if P = NP:

  • Polynomial-time algorithms exist for all NP problems
  • Cryptography would collapse
  • Optimization becomes easy
  • Scientific revolution

Current belief: Most researchers believe P ≠ NP, but no proof exists.

NP-Completeness

Definition: A language B is NP-complete if:

  1. B ∈ NP
  2. For every A ∈ NP, A ≤_p B (polynomial-time reduction)

Significance: NP-complete problems are the "hardest" problems in NP.

Theorem: If any NP-complete problem is in P, then P = NP.

Proof: Let B be NP-complete and in P. For any A ∈ NP:

  • A ≤_p B (by NP-completeness)
  • B ∈ P (by assumption)
  • Therefore A ∈ P (polynomial time reduction + polynomial time solution)

Cook-Levin Theorem

Theorem (Cook-Levin, 1971): SAT is NP-complete.

SAT: Given Boolean formula φ, is there an assignment making φ true?

Proof sketch:

  1. SAT ∈ NP: Certificate is satisfying assignment, verifiable in polynomial time
  2. For any L ∈ NP with polynomial-time NTM M:
    • Given input w, construct formula φ that simulates M on w
    • φ is satisfiable ⟺ M accepts w
    • Construction is polynomial-time

This provides the first NP-complete problem.

Examples of NP-Complete Problems

Once we have one NP-complete problem, we can prove others via reduction:

3SAT: SAT restricted to 3 literals per clause

  • Reduction: SAT ≤_p 3SAT

CLIQUE: Does graph G have a clique of size k?

  • Reduction: 3SAT ≤_p CLIQUE

VERTEX-COVER: Does graph G have a vertex cover of size k?

  • Reduction: CLIQUE ≤_p VERTEX-COVER

HAMPATH: Does graph G have a Hamiltonian path from s to t?

  • Reduction: VERTEX-COVER ≤_p HAMPATH

SUBSET-SUM: Does set S have a subset summing to t?

  • Reduction: VERTEX-COVER ≤_p SUBSET-SUM

Thousands of NP-complete problems are known across diverse domains.

Polynomial-Time Reductions

Definition: A ≤_p B if there exists polynomial-time computable function f such that:

w ∈ A ⟺ f(w) ∈ B

Properties:

  • Transitive: If A ≤_p B and B ≤_p C, then A ≤_p C
  • Preserves NP-membership
  • If B ∈ P and A ≤_p B, then A ∈ P

Example Reduction: 3SAT ≤_p CLIQUE

3SAT: Given CNF formula φ with 3 literals per clause, is φ satisfiable?

CLIQUE: Given graph G and integer k, does G have a k-clique?

Reduction: Given 3SAT instance φ with m clauses:

  1. Create graph G:

    • For each literal in each clause, create a vertex
    • Connect two vertices if both: (a) they lie in different clauses, and (b) they are not negations of each other
  2. Set k = m (number of clauses)

Correctness:

  • φ is satisfiable ⟺ G has an m-clique
  • Satisfying assignment chooses one true literal per clause
  • These literals form a clique: they come from different clauses and are mutually consistent, so every pair is connected

Time: O(m²) to construct graph, polynomial.
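The construction can be written out directly (literals encoded as nonzero integers, negative for negation; the brute-force clique checker is only for testing on tiny inputs):

```python
from itertools import combinations

def threesat_to_clique(clauses):
    """Map a 3-CNF formula to (vertices, edges, k): literal occurrences
    become vertices; edges join compatible literals in different clauses."""
    vertices = [(i, lit) for i, clause in enumerate(clauses) for lit in clause]
    edges = {
        frozenset((u, v))
        for u, v in combinations(vertices, 2)
        if u[0] != v[0] and u[1] != -v[1]
    }
    return vertices, edges, len(clauses)

def has_k_clique(vertices, edges, k):
    return any(
        all(frozenset((u, v)) in edges for u, v in combinations(subset, 2))
        for subset in combinations(vertices, k)
    )

phi = [[1, 2, 3], [-1, 2, 3], [1, -2, -3]]       # satisfiable
v, e, k = threesat_to_clique(phi)
print(has_k_clique(v, e, k))
```

On this satisfiable formula the reduced graph contains a k-clique, as the correctness argument predicts; an unsatisfiable formula would yield none.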

Complexity Beyond NP

The Class coNP

Definition: coNP = {L̄ : L ∈ NP}

Examples:

  • UNSAT = {<φ> : φ is not satisfiable} ∈ coNP
  • NO-HAMPATH ∈ coNP

Relationship: If P = NP, then P = coNP = NP.

Open question: Is NP = coNP?

The Polynomial Hierarchy

Definition:

  • Σ₀ᴾ = Π₀ᴾ = P
  • Σₖ₊₁ᴾ = NP^(Σₖᴾ) (NP with an oracle for Σₖᴾ)
  • Πₖ₊₁ᴾ = coΣₖ₊₁ᴾ

Polynomial Hierarchy: PH = ⋃ₖ Σₖᴾ

Significance: Captures problems with alternating quantifiers.

PSPACE

Definition: PSPACE = languages decidable in polynomial space.

Theorem: P ⊆ NP ⊆ PSPACE

PSPACE-complete problems:

  • TQBF = {<φ> : φ is a true quantified Boolean formula}
  • GEOGRAPHY (game on directed graph)

EXPTIME

Definition: EXPTIME = ⋃ₖ TIME(2^(nᵏ))

Theorem: PSPACE ⊆ EXPTIME

EXPTIME-complete problems:

  • Generalized chess, checkers, Go (on n×n board)

Known Separations

Theorem (Time Hierarchy): P ⊊ EXPTIME

Proof: Diagonalization (similar to Halting Problem proof)

Summary of relationships:

P ⊆ NP ⊆ PSPACE ⊆ EXPTIME

We know: P ⊊ EXPTIME
We don't know: Whether any other inclusions are strict

Exercises

Basic Exercises

  1. TM Construction: Design Turing machines for:

    • a) L = {w : w contains equal numbers of 0s and 1s}
    • b) L = {w#w : w ∈ {0,1}*}
    • c) L = {aⁿbⁿcⁿ : n ≥ 0}
  2. Decidability: Prove these languages are decidable:

    • a) A_NFA = {<N, w> : N is an NFA that accepts w}
    • b) E_REX = {<R> : R is a regex and L(R) = ∅}
    • c) {<G> : G is a CFG generating at least one palindrome}
  3. Recognizability: Show that the complement of A_TM is not Turing-recognizable.

Intermediate Exercises

  1. Undecidability via Reduction: Prove these languages are undecidable:

    • a) {<M> : M accepts at least 10 strings}
    • b) {<M₁, M₂> : L(M₁) ∩ L(M₂) ≠ ∅}
    • c) {<M> : M accepts some palindrome}
  2. Rice's Theorem Application: Use Rice's Theorem to prove undecidability of:

    • a) {<M> : L(M) is infinite}
    • b) {<M> : ε ∈ L(M)}
    • c) {<M> : L(M) is context-free}
  3. NP Membership: Prove these problems are in NP by providing polynomial-time verifiers:

    • a) HAMCYCLE = {<G> : G has a Hamiltonian cycle}
    • b) INTEGER-PROGRAMMING
    • c) GRAPH-ISOMORPHISM

Advanced Exercises

  1. Multitape Simulation: Prove formally that every k-tape TM can be simulated by a single-tape TM with O(T(n)²) time overhead.

  2. NP-Completeness Reduction: Prove SUBSET-SUM is NP-complete by reduction from VERTEX-COVER. Show:

    • a) The reduction construction
    • b) Correctness proof
    • c) Polynomial-time bound
  3. Time Hierarchy: Prove that TIME(n) ⊊ TIME(n³) using diagonalization.

  4. Complexity Class Relations: Prove or disprove:

    • a) If P = NP, then every language in NP (except ∅ and Σ*) is NP-complete
    • b) If NP = coNP, then there exists a problem in NP that is not NP-complete
  5. Oracle Machines: Define oracle Turing machines and prove that P^NP (P with NP oracle) contains NP.

  6. Savitch's Theorem: Prove that NSPACE(f(n)) ⊆ SPACE(f(n)²) for f(n) ≥ log n.

Summary

In this reading, we explored the theoretical limits of computation:

  1. Turing Machines provide a formal model of computation with unlimited memory, capturing the intuitive notion of an algorithm

  2. Variants (multitape, nondeterministic, etc.) all have the same computational power, supporting the robustness of the model

  3. Church-Turing Thesis states that Turing machines capture all effectively computable functions - a foundational assumption in computer science

  4. Decidability distinguishes problems that can be solved algorithmically (halting on all inputs) from those that cannot

  5. The Halting Problem is the canonical undecidable problem, proved using diagonalization

  6. Reductions provide a technique for proving undecidability by transforming one problem into another

  7. Rice's Theorem shows that all non-trivial semantic properties of Turing machines are undecidable

  8. Complexity Classes P and NP classify problems by computational difficulty, with P representing "efficiently solvable" problems

  9. NP-Completeness identifies the hardest problems in NP; solving one efficiently would solve all NP problems efficiently

  10. P vs. NP remains the most important open problem in computer science

Key Insights:

  • Not all problems are computable (undecidability)
  • Among computable problems, some are inherently harder than others (complexity)
  • Nondeterminism appears to add power for time complexity (P vs. NP) but not for decidability
  • Reductions are a powerful tool for both undecidability and complexity results

Practical Implications:

  • Some programming tasks are impossible in principle (not just hard)
  • NP-complete problems require careful algorithm design and may need approximation
  • Understanding computational limits guides problem-solving strategies

Historical Context:

  • Turing (1936): Turing machines and undecidability
  • Church (1936): Lambda calculus
  • Gödel (1931): Incompleteness theorems
  • Cook (1971): NP-completeness
  • Karp (1972): 21 NP-complete problems

Conclusion

The theory of computation provides the mathematical foundation for all of computer science. From finite automata recognizing simple patterns to Turing machines capturing all computable functions, these models reveal both the power and limitations of computation.

Key Takeaways:

  1. Automata Theory Hierarchy:

    • Regular Languages ⊊ Context-Free Languages ⊊ Decidable Languages ⊊ Recognizable Languages
    • Each level requires more computational power
  2. Fundamental Limits:

    • Some problems cannot be solved by any algorithm (Halting Problem)
    • Some problems can be solved but require exponential time
    • The boundary between tractable and intractable is captured by P vs. NP
  3. Practical Impact:

    • Compiler design relies on regular and context-free languages
    • Cryptography depends on presumed hardness of certain problems
    • Algorithm design is guided by complexity theory
  4. Open Questions:

    • P vs. NP remains unsolved
    • Many other complexity class separations are unknown
    • Theory continues to evolve

The journey from finite automata to Turing machines mirrors the historical development of computer science and provides essential tools for reasoning about computation. Whether designing parsers, analyzing algorithms, or proving impossibility results, the theory of computation remains indispensable for computer scientists.

References and Further Reading

  1. Sipser, Michael. "Introduction to the Theory of Computation" (Chapters 3-7)
  2. Hopcroft, John E., et al. "Introduction to Automata Theory" (Chapters 8-9)
  3. Turing, Alan. "On Computable Numbers" (1936)
  4. Cook, Stephen. "The Complexity of Theorem-Proving Procedures" (1971)
  5. Karp, Richard. "Reducibility Among Combinatorial Problems" (1972)
  6. Garey, Michael R., and David S. Johnson. "Computers and Intractability: A Guide to the Theory of NP-Completeness" (1979)
  7. Arora, Sanjeev, and Boaz Barak. "Computational Complexity: A Modern Approach" (2009)
  8. Papadimitriou, Christos. "Computational Complexity" (1994)