Computation & CS Foundations — Final Lecture
Algorithms, Automata, Hardness, and the Law of Finite Symbolic Power
Computation = symbolic state updated by explicit rules within finite resources in a physical world.
0. Orientation: Computation as Constraint, Not Convenience
Computation here is not “apps”, “software”, or “industry.” It is symbolic state under explicit update rules inside finite resources.
Three layers:
- Formal — abstract machines, grammars, programs.
- Complexity-theoretic — time, space, randomness, communication.
- Physical — energy, noise, latency, faults.
Core lineage (stacked): Turing/Church define effective procedure; von Neumann binds it to hardware architecture; Knuth quantifies algorithmics; Dijkstra/Hoare turn programs into contracts and proofs; Sipser canonizes automata/computability/complexity; Papadimitriou exposes structural complexity and reductions.
Fast entry sequence: formal → complexity → physical → distributed
1. Models of Computation and the Church–Turing Boundary
1.1 Equivalent formal models
Three foundational characterizations of “effective procedure”:
- Turing machines — finite control + tape + head; stepwise read/write/move/state update.
- λ-calculus — terms + β-reduction; computation as substitution/function application.
- (Partial) recursive functions — closure under composition/recursion/minimization.
Church–Turing thesis: a function is effectively calculable by any mechanical method iff it is TM-computable (equivalently λ-definable / partial recursive).
This is a thesis, not a theorem — it links an informal notion (“effective method”) to a formal one.
Variants extend the claim from computability to efficiency (polynomial simulation) and from abstract procedures to physically realizable devices.
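The stepwise read/write/move/state-update rule above can be sketched as a small simulator. A minimal illustration only: `run_tm`, the sparse-tape representation, and the bit-flipping example machine `FLIP` are hypothetical choices, not a standard construction from the lecture.

```python
# Minimal single-tape Turing machine simulator (illustrative sketch).
# delta maps (state, symbol) -> (new_state, written_symbol, move).
def run_tm(delta, tape, state="q0", blank="_", halt="halt", max_steps=10_000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        sym = cells.get(head, blank)
        state, cells[head], move = delta[(state, sym)]
        head += 1 if move == "R" else -1
    out = "".join(cells[i] for i in sorted(cells)).strip(blank)
    return state, out

# Example machine (hypothetical): flip every bit, halt at the first blank.
FLIP = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", "_"): ("halt", "_", "R"),
}
print(run_tm(FLIP, "0110"))  # ('halt', '1001')
```

The point of the sketch is the shape of the model: finite control (the `delta` table), an unbounded tape, and purely local updates.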
2. Computability: Decidable, Enumerable, Undecidable
2.1 Recursive, r.e., co-r.e.
- Decidable / recursive: some algorithm halts on all inputs and answers membership correctly.
- r.e. / semi-decidable: halts and accepts on members; may run forever on non-members.
- co-r.e.: complements of r.e. languages.
2.2 Halting problem
Given a program P and input x: will P(x) halt? No total algorithm decides this for all pairs.
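The diagonal argument behind this can be sketched in code. The `halts` oracle below is a placeholder that cannot actually be implemented; `diagonal` is the standard self-referential construction, written here as an illustration.

```python
# Sketch of the diagonal argument: suppose a total decider `halts` existed.
def halts(prog, arg):
    # Hypothetical oracle; by the theorem, no total algorithm can fill this in.
    raise NotImplementedError("no total algorithm decides halting")

def diagonal(prog):
    # Loop forever exactly when `prog` would halt on input `prog`.
    if halts(prog, prog):
        while True:
            pass
    return "done"

# diagonal(diagonal) would halt iff it does not halt — contradiction,
# so no total, correct `halts` can exist.
```

The contradiction lives entirely in the last two comments: any candidate implementation of `halts` is refuted by feeding `diagonal` to itself.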
2.3 Rice’s theorem (semantic undecidability)
Any non-trivial semantic property of programs is undecidable in full generality. Static analyzers must therefore trade off among completeness, soundness, and termination: general semantic perfection is blocked.
3. Algorithms: Procedure, Correctness, Resource
3.1 Total vs partial
Explicitly track termination: some “procedures” are partial (may not halt), and the difference matters everywhere computability bites.
3.2 Programs as logic (Dijkstra & Hoare)
- Hoare logic: {P} C {Q} — if P holds before C and C terminates, then Q holds afterward (partial correctness).
- Weakest precondition: wp(C,Q) — the weakest condition guaranteeing postcondition Q after C.
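The two definitions above can be exercised at runtime. A minimal sketch, assuming predicates and commands as Python callables over a mutable state dict (the `check_triple` helper and the concrete triple are illustrative, not a verification tool):

```python
# Runtime sketch of a Hoare triple {P} C {Q}: check P, run C, check Q.
def check_triple(pre, command, post, state):
    assert pre(state), "precondition violated"
    command(state)
    assert post(state), "postcondition violated"
    return state

# Triple: {x >= 0}  x := x + 1  {x >= 1}
# For assignment, wp(x := e, Q) is Q with e substituted for x, so
# wp(x := x + 1, x >= 1) is (x + 1 >= 1), i.e. x >= 0 — the stated precondition.
s = check_triple(
    pre=lambda st: st["x"] >= 0,
    command=lambda st: st.update(x=st["x"] + 1),
    post=lambda st: st["x"] >= 1,
    state={"x": 0},
)
print(s)  # {'x': 1}
```

Runtime checking only samples one execution; the point of wp-calculus is that the substitution argument covers all states satisfying the precondition.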
3.3 Time and space
Complexity classes arise from resource bounds (P, PSPACE, EXP, …). Hierarchy theorems guarantee strict stratification as bounds increase.
4. Data Structures: Architecture of State and Access
Data structures encode what is cheap vs expensive, and which invariants define “valid state.” Any real registry is a design choice over structure + invariants.
Canonical toolbox (what changes the cost model)
- Arrays + amortization; linked structures; stacks/queues/deques.
- Balanced trees, heaps; B-trees/B+ trees (cache/disk reality).
- Graphs (relations); hash tables (expected O(1) under assumptions).
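Amortization, the first item in the toolbox, fits in a few lines: a doubling array pays occasional O(n) copies but achieves O(1) amortized append. The `DynArray` class and its `copies` counter are a didactic sketch, not a production structure.

```python
# Minimal doubling array: occasional O(n) resizes, O(1) amortized append.
class DynArray:
    def __init__(self):
        self.cap, self.n = 1, 0
        self.buf = [None]
        self.copies = 0  # total elements moved by resizes

    def append(self, x):
        if self.n == self.cap:           # full: double capacity, copy n items
            self.cap *= 2
            new = [None] * self.cap
            new[: self.n] = self.buf
            self.buf = new
            self.copies += self.n
        self.buf[self.n] = x
        self.n += 1

a = DynArray()
for i in range(1000):
    a.append(i)
# Copy work is 1 + 2 + 4 + ... + 512 = 1023 < 2n: the mark of O(1) amortized cost.
print(a.n, a.copies)  # 1000 1023
```

The geometric resize schedule is exactly what “arrays + amortization” in the list refers to: total work stays linear even though single operations spike.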
5. Automata and Formal Languages: The Chomsky Hierarchy
Machine power is memory power:
- FA/NFA → regular languages.
- PDA → context-free languages (NPDA = CFL; DPDA is a strict subset).
- LBA → context-sensitive languages.
- TM → r.e. languages.
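The bottom of the hierarchy is concrete enough to run: a DFA is a finite transition table and nothing else. A minimal sketch for one regular language (the `DELTA` table and `accepts` helper are illustrative):

```python
# DFA sketch: finite control only, no auxiliary memory.
# Recognizes the regular language {w in {0,1}* : w has an even number of 1s}.
DELTA = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd",   ("odd", "1"): "even"}

def accepts(w, start="even", accepting=("even",)):
    state = start
    for ch in w:
        state = DELTA[(state, ch)]
    return state in accepting

print(accepts("1010"), accepts("10110"))  # True False
```

Contrast with the next level: a language like a^n b^n has no finite state bound, so it needs a PDA's stack — which is exactly the “machine power is memory power” claim above.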
6. Complexity Theory: Cost and Intractability
6.1 Classes and containment
P ⊆ NP ∩ co-NP ⊆ PSPACE ⊆ EXPTIME ⊆ EXPSPACE (some strictness known; many frontiers open).
6.2 P vs NP and NP-completeness
NP-completeness organizes hardness via reductions: SAT is NP-complete; thousands of problems inherit the same worst-case wall unless P=NP.
NP-hardness does not forbid heuristics or structure exploitation; it denies a universal worst-case polynomial-time exact solver for NP-complete families (unless P=NP).
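The verify/search asymmetry behind NP can be shown directly: checking a SAT certificate is polynomial, while the only known general exact method is exponential search. The CNF encoding and the example instance below are hypothetical illustrations.

```python
from itertools import product

# CNF as a list of clauses; literal k means variable |k|, positive iff k > 0.
def satisfies(cnf, assign):              # polynomial-time certificate verifier
    return all(any(assign[abs(l)] == (l > 0) for l in c) for c in cnf)

def brute_force_sat(cnf, nvars):         # 2^n search: the worst-case wall
    for bits in product([False, True], repeat=nvars):
        assign = dict(enumerate(bits, start=1))
        if satisfies(cnf, assign):
            return assign
    return None

# (x1 or ~x2) and (x2 or x3) and (~x1 or ~x3)  — hypothetical instance
cnf = [[1, -2], [2, 3], [-1, -3]]
model = brute_force_sat(cnf, 3)
print(model is not None and satisfies(cnf, model))  # True
```

`satisfies` runs in time linear in the formula; `brute_force_sat` runs in 2^n time — and unless P=NP, no algorithm closes that gap for all NP-complete instances.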
7. Randomized Computation
Randomness changes the model: bounded error (BPP), one-sided error (RP/co-RP), zero-error expected time (ZPP). Guarantees can become statistical rather than absolute.
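One-sided error in action: Freivalds' check tests a matrix product in O(n²) per round instead of recomputing the O(n³) product, with error probability at most 2^(-rounds). A minimal sketch over small integer matrices.

```python
import random

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def freivalds(A, B, C, rounds=20):
    # Test A @ B == C: compare A(Br) with Cr for random 0/1 vectors r.
    n = len(A)
    for _ in range(rounds):
        r = [random.randint(0, 1) for _ in range(n)]
        if mat_vec(A, mat_vec(B, r)) != mat_vec(C, r):
            return False         # certain: A @ B != C (one-sided error)
    return True                  # probably correct; error <= 2**-rounds

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[19, 22], [43, 50]]         # the true product
print(freivalds(A, B, C))                      # True (always, when equal)
print(freivalds(A, B, [[19, 22], [43, 51]]))   # False with prob >= 1 - 2**-20
```

This is the RP/co-RP shape from the text: “no” answers are certain, “yes” answers are statistical — a guarantee no deterministic O(n²) verifier is known to match.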
8. Algorithmic Information: Kolmogorov Complexity
K(x) = length of the shortest program outputting x on a fixed universal TM. K measures compressibility; algorithmic randomness is incompressibility.
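Since K(x) is uncomputable, any real compressor yields only a computable upper bound on description length; the gap between structured and random inputs is still visible. A sketch using `zlib` as the stand-in compressor (the byte strings are arbitrary examples):

```python
import os
import zlib

# A compressor gives a computable *upper bound* on K(x), never K(x) itself.
structured = b"01" * 500            # 1000 bytes with a tiny description
randomish = os.urandom(1000)        # 1000 bytes, typically incompressible

print(len(zlib.compress(structured)))   # a few dozen bytes at most
print(len(zlib.compress(randomish)))    # roughly 1000, often slightly more
```

The asymmetry is the point: compression can certify structure (a short program exists) but can never certify randomness (a shorter program might always exist undetected).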
9. Physical Limits: Energy and Communication
9.1 Landauer (erasure costs energy)
Irreversible erasure implies a minimum heat dissipation per bit: k_B T ln 2.
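Plugging in room temperature makes the bound concrete (the 300 K figure is a conventional choice, not from the text):

```python
import math

# Landauer bound: minimum heat per irreversibly erased bit, E = k_B * T * ln 2.
k_B = 1.380649e-23            # Boltzmann constant, J/K (exact by SI definition)
T = 300.0                     # room temperature, K (illustrative value)
E = k_B * T * math.log(2)
print(f"{E:.3e} J per bit")   # ~2.871e-21 J
```

At around 3 zeptojoules per bit, the bound sits far below current per-bit switching energies, but it is a floor for any irreversible computation, not an engineering target.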
9.2 Communication complexity (Yao)
Distributed coordination has hard lower bounds on bits exchanged, independent of local compute power.
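The classic example: deterministically checking equality of two n-bit strings needs n bits of communication, but a random fingerprint modulo a prime needs only O(log n). A sketch under the usual public-coin assumption (the prime, polynomial hash, and helper names are illustrative):

```python
import random

P = 2**61 - 1   # a fixed large (Mersenne) prime known to both parties

def fingerprint(bits, r):
    # Evaluate the bit string as a polynomial at point r, mod P.
    h = 0
    for b in bits:
        h = (h * r + b) % P
    return h

def randomized_eq(x, y):
    # Alice sends r and fingerprint(x, r): about 2 * 61 bits, not len(x) bits.
    r = random.randrange(2, P)
    return fingerprint(x, r) == fingerprint(y, r)

x = [1, 0, 1, 1] * 250                   # 1000-bit inputs
y = list(x); y[500] ^= 1                 # differ in one position
print(randomized_eq(x, x), randomized_eq(x, y))  # True False (w.h.p.)
```

Distinct strings give distinct polynomials of degree < n, which agree on at most n−1 points mod P, so the error probability is about n/P — negligible here. The exponential gap between deterministic and randomized cost is exactly the kind of structure communication complexity measures.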
10. Distributed Limits: Consensus and Tradeoffs
- FLP impossibility: no deterministic protocol solves consensus in a fully asynchronous system if even one process may crash (agreement + termination cannot both be guaranteed in all executions).
- CAP: under partitions, cannot simultaneously guarantee strong consistency and full availability.
Guarantees are traded against failure models and time assumptions (randomness, partial synchrony, weakened consistency).
11. Cryptographic Hardness and Asymmetry
Modern cryptography exploits assumed hardness to create effort asymmetries:
- easy to verify, hard to forge;
- easy to check, hard to invert;
- easy to validate commitments, hard to violate them without keys.
Hardness is not just constraint — it is a resource used to build unforgeability and irreversibility into protocols.
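The “easy to verify, hard to invert” asymmetry in miniature, using a cryptographic hash as the stand-in one-way function (the message is a made-up example; this sketches the asymmetry, not a full protocol):

```python
import hashlib
import secrets

# Hashing is cheap; finding a preimage of a SHA-256 digest is believed infeasible.
msg = b"transfer 100 units to account 42"   # hypothetical message
digest = hashlib.sha256(msg).hexdigest()

# Verification: anyone can recompute and compare in microseconds.
assert hashlib.sha256(msg).hexdigest() == digest

# Forgery: a random guess matches with probability 2**-256 per attempt.
guess = secrets.token_bytes(32)
print(hashlib.sha256(guess).hexdigest() == digest)  # False, overwhelmingly
```

The protocol-level uses listed above (signatures, commitments) layer keys and structure on top of exactly this kind of effort gap.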
12. The Thinkers as One Architecture
- Church & Turing: formalize procedure; expose undecidable limits.
- von Neumann: stored-program architecture; abstract computation becomes hardware organization.
- Knuth: quantitative algorithmics and data structures; analysis becomes discipline.
- Dijkstra & Hoare: programs as proofs and contracts; concurrency as formal interaction.
- Sipser: canonical pipeline (automata → computability → complexity → NP-completeness).
- Papadimitriou: structural complexity (reductions/web of classes), games, modern completeness notions.
13. Synthesis: The Law of Finite Symbolic Power
- Computability: halting/Rice block universal semantic decision.
- Complexity: strong evidence that NP-hard exact global optimization is hostile in worst case (unless P=NP); hierarchies never end.
- Randomness: bounded-error computation is native (BPP/RP/ZPP); guarantees can be statistical.
- Information: Kolmogorov complexity is uncomputable; perfect noise/structure separation is impossible.
- Physical: irreversible erasure costs energy (Landauer); communication has lower bounds (Yao).
- Distributed: deterministic consensus and perfect CAP-style guarantees are blocked under realistic failures.
- Hardness-as-resource: cryptography exploits asymmetry of effort to enforce unforgeability and irreversible commitments.
This is the unified picture: computation is a layered lattice of impossibilities and tradeoffs. Any system—algorithmic, institutional, cryptographic, or distributed—lives inside these walls.
Resource Index (All Links)
Indexed as r1…r33.
A. Core full courses
r1 — MIT 6.006: Introduction to Algorithms (Spring 2020)
Algorithms + data structures: asymptotics, hashing, graphs, DP; clean cost models.
r2 — MIT 18.404J: Theory of Computation (Sipser, Fall 2020)
Automata → computability → complexity → NP-completeness (live canonical pipeline).
r3 — Boaz Barak: Introduction to Theoretical Computer Science (free text)
Modern complement: complexity, randomness, cryptographic primitives, and more.
r4 — Princeton: Algorithms (Sedgewick & Wayne) ecosystem
Robust implementations + lectures/visualizations for algorithms and data structures.
B. Core textbooks / monographs
r5 — Sipser: Introduction to the Theory of Computation (PDF link)
Canonical ToC: automata, decidability, reductions, complexity, NP-completeness.
r6 — Hopcroft–Motwani–Ullman: Automata, Languages, Computation (PDF link)
Classic automata/formal languages text; sharp on machine models and grammars.
r7 — Maheshwari & Smid: Introduction to Theory of Computation (free)
Second view of core ToC with worked examples; strong reference companion.
r8 — Knuth: The Art of Computer Programming (TAOCP)
Deep algorithmics: combinatorics, analysis, and unforgiving rigor on fundamentals.
r9 — Sedgewick & Flajolet: An Introduction to the Analysis of Algorithms (PDF link)
Average-case tools and analytic methods bridging combinatorics and algorithm cost.
r10 — Arora & Barak: Computational Complexity (Modern Approach)
Graduate-level complexity bible: PH, IP/PCP, circuits, derandomization, quantum complexity.
r11 — Li & Vitányi: Kolmogorov Complexity and Its Applications
Definitive reference on algorithmic information and randomness.
r12 — Garey & Johnson: Computers and Intractability (NP-completeness handbook)
Classic reductions + the legendary NP-complete problem catalog.
C. Foundational papers (“law texts”)
r13 — Turing (1936): On Computable Numbers (TM + undecidability)
Defines Turing machines; proves undecidability of Entscheidungsproblem via halting-style argument.
r14 — Church (1936): Unsolvable Problem of Elementary Number Theory
λ-calculus lineage; undecidability counterpart to Turing’s model.
r15 — von Neumann: First Draft of a Report on the EDVAC
Stored-program architecture as structural spec for classical computers.
r16 — Hoare (1978): Communicating Sequential Processes (CSP)
Message-passing concurrency as disciplined interaction calculus.
r17 — Dijkstra: “The threats to computing science” (EWD898)
Philosophical anchor: discipline vs fashion/tool-worship; computing as rigorous science.
r18 — Hoare: Communicating Sequential Processes (book-length development)
Extended process algebra and semantics for concurrent systems.
r19 — Aaronson: “NP-complete Problems and Physical Reality”
Systematic audit of “physics will solve NP” proposals; binds complexity to physical constraints.
D. High-signal lectures & talks
r20 — MIT 6.006 Lecture 1: Algorithms and Computation
Opening move: cost models, algorithmic thinking, formal performance metrics.
r21 — MIT 18.404J Lecture 1: Finite Automata, Regular Expressions
Clean start to the language hierarchy and closure reasoning.
r22 — Papadimitriou: The Story of Complexity
Reductions, classes, why P vs NP is structurally central.
r23 — Papadimitriou: Theory of Computation I (Simons Institute)
Dense stitch: core ToC concepts integrated beyond standard textbook pacing.
r24 — Boaz Barak: The Quest of an Optimal Algorithm
Modern theorist lens: complexity, optimization, cryptography, and limits of knowledge.
E. Podcasts / long-form audio
r25 — Mindscape #99: Aaronson on complexity, computation, quantum gravity
Wide-ranging: P vs NP, quantum computing, complexity as a physics lens.
r26 — 632nm: “Why is P vs NP so hard?” (Aaronson)
Focused deep dive on the hardness of the central question and how we reason at the boundary.
r27 — Random Walks: Boaz Barak on complexity and cryptography
How a top theorist frames complexity and theoretical work.
r28 — Quanta: “How Does Math Keep Secrets?”
Cryptography as complexity-in-action: secrecy anchored in hardness assumptions.
F. Documentary / historical context
r29 — The Machine That Changed the World (WGBH/BBC)
Serious history of computing lineage (Turing/von Neumann era through networks and beyond).
G. Expository articles & modern structural complexity
r30 — YC interview: Aaronson on complexity and quantum computers
Accessible framing of P vs NP and why the question matters.
r31 — Quanta: “How a Problem About Pigeons Powers Complexity Theory”
PPAD-style complexity beyond NP; fixed points, equilibria, and structural hardness.
r32 — Plus Maths: “Elusive Equilibria” (PPAD bridge)
Fixed-point theorems, equilibria, and Papadimitriou-linked complexity classes.
r33 — Vitányi: Kolmogorov Complexity page (resource hub)
Compact map to Kolmogorov complexity references and entry points.