Computational Complexity

Measuring what can be computed, and at what cost.

Big-O Notation

Definition

\(f(n) = O(g(n))\) if there exist positive constants \(c\) and \(n_0\) such that:

\[f(n) \leq c \cdot g(n) \quad \forall n \geq n_0\]
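The definition can be checked numerically for a concrete pair of functions. The constants below (c = 4, n₀ = 10) are hand-picked witnesses for the illustrative function f(n) = 3n + 10, not part of the definition itself:

```python
# Numeric sanity check: f(n) = 3n + 10 is O(n), witnessed by c = 4, n0 = 10.
# For n >= 10 we have 3n + 10 <= 4n, so the definition holds.
def f(n):
    return 3 * n + 10

def g(n):
    return n

c, n0 = 4, 10
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

Any larger c (or larger n₀) also works; Big-O only asks that *some* such pair exists.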

Common Classes

| Complexity | Name | Example |
|---|---|---|
| \(O(1)\) | Constant | Array index access |
| \(O(\log n)\) | Logarithmic | Binary search |
| \(O(n)\) | Linear | Linear search |
| \(O(n \log n)\) | Linearithmic | Merge sort, FFT |
| \(O(n^2)\) | Quadratic | Bubble sort, naive matrix multiply |
| \(O(n^3)\) | Cubic | Standard matrix multiply |
| \(O(2^n)\) | Exponential | Subset enumeration |
| \(O(n!)\) | Factorial | Permutation enumeration |
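The gap between adjacent rows is easy to see empirically. A sketch comparing comparison counts for linear search (\(O(n)\)) and binary search (\(O(\log n)\)) on a sorted list of one million integers; the step-counting instrumentation is an addition for illustration:

```python
# Count comparisons: O(n) linear search vs O(log n) binary search
# on a sorted list of one million integers.
def linear_search(xs, target):
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            return steps
    return steps

def binary_search(xs, target):
    lo, hi, steps = 0, len(xs) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return steps
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

xs = list(range(1_000_000))
print(linear_search(xs, 999_999))  # ~n comparisons in the worst case
print(binary_search(xs, 999_999))  # at most ~log2(n) ≈ 20 comparisons
```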

Growth Rates

For n = 1,000,000:

| Complexity | Approximate operations |
|---|---|
| \(O(\log n)\) | ~20 |
| \(O(n)\) | 1,000,000 |
| \(O(n \log n)\) | ~20,000,000 |
| \(O(n^2)\) | 1,000,000,000,000 |
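These counts follow directly from plugging n = 1,000,000 into each function (taking log base 2, as is conventional for operation counts):

```python
import math

# Reproduce the operation counts for n = 1,000,000.
n = 1_000_000
print(f"O(log n):   {math.log2(n):,.0f}")      # ~20
print(f"O(n):       {n:,}")                    # 1,000,000
print(f"O(n log n): {n * math.log2(n):,.0f}")  # ~20,000,000
print(f"O(n^2):     {n * n:,}")                # 1,000,000,000,000
```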

Big-Omega (Lower Bound)

\(f(n) = \Omega(g(n))\) if there exists a positive constant \(c\) such that \(f(n) \geq c \cdot g(n)\) for all sufficiently large \(n\).

Big-Theta (Tight Bound)

\(f(n) = \Theta(g(n))\) if \(f(n) = O(g(n))\) and \(f(n) = \Omega(g(n))\).

Complexity Classes

P (Polynomial Time)

Problems solvable in polynomial time by a deterministic Turing machine.

Examples: Sorting, shortest path, linear programming.

NP (Nondeterministic Polynomial)

Problems whose solutions can be verified in polynomial time.

Examples: SAT, graph coloring, traveling salesman (decision version).
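Polynomial-time verifiability is the defining property of NP: checking a proposed solution is fast even when finding one may not be. A sketch for graph coloring, where the graph and coloring are made-up illustrative data:

```python
# Verifying a proposed graph coloring takes O(E) time: valid iff no edge
# joins two vertices of the same color. Finding a valid k-coloring, by
# contrast, is NP-complete for k >= 3.
def is_valid_coloring(edges, coloring):
    return all(coloring[u] != coloring[v] for u, v in edges)

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
coloring = {0: "red", 1: "green", 2: "blue", 3: "red"}
print(is_valid_coloring(edges, coloring))  # True
```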

NP-Complete

Problems that are:

  1. In NP, and

  2. NP-hard: every problem in NP reduces to them in polynomial time.

If any NP-complete problem has a polynomial solution, then P = NP.

NP-Hard

At least as hard as every problem in NP, but not necessarily in NP themselves.

Examples: Halting problem, optimal scheduling.

The P vs NP Problem

\[P \stackrel{?}{=} NP\]
  • If P = NP: Efficient algorithms exist for all verifiable problems

  • If P ≠ NP: Some problems are fundamentally hard to solve

Millennium Prize Problem ($1,000,000).

Modern cryptography assumes P ≠ NP (and, in practice, stronger assumptions such as the existence of one-way functions).

Master Theorem

For recurrences of the form:

\[T(n) = aT\left(\frac{n}{b}\right) + f(n)\]

Let \(c = \log_b a\):

| Condition | Result |
|---|---|
| \(f(n) = O(n^{c-\epsilon})\) for some \(\epsilon > 0\) | \(T(n) = \Theta(n^c)\) |
| \(f(n) = \Theta(n^c)\) | \(T(n) = \Theta(n^c \log n)\) |
| \(f(n) = \Omega(n^{c+\epsilon})\) for some \(\epsilon > 0\) | \(T(n) = \Theta(f(n))\) |

The third case additionally requires the regularity condition \(a f(n/b) \leq k f(n)\) for some constant \(k < 1\) and sufficiently large \(n\).

Example: Merge Sort

\[T(n) = 2T\left(\frac{n}{2}\right) + O(n)\]

Here \(a=2\), \(b=2\), \(c=\log_2 2 = 1\), and \(f(n) = \Theta(n) = \Theta(n^c)\).

By case 2: \(T(n) = \Theta(n \log n)\).
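The recurrence can be read straight off an implementation: two recursive calls on halves, plus a linear-time merge. A minimal sketch:

```python
# Merge sort as a direct transcription of T(n) = 2T(n/2) + O(n).
def merge_sort(xs):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])   # T(n/2)
    right = merge_sort(xs[mid:])  # T(n/2)
    # O(n) merge of the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```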

Amortized Analysis

Average time per operation over a sequence of operations.

Example: Dynamic Array

  • Single append: \(O(n)\) worst case (triggers reallocation)

  • n appends: \(O(n)\) total copy work under capacity doubling

  • Amortized: \(O(1)\) per append
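This can be verified by simulating the reallocations. The doubling growth policy below is an assumption of the sketch; real implementations use various growth factors:

```python
# Simulate a doubling dynamic array and count element copies caused by
# reallocation. Total copy work stays below 2n, so the amortized cost
# per append is O(1).
def count_copies(n_appends):
    capacity, size, copies = 1, 0, 0
    for _ in range(n_appends):
        if size == capacity:
            copies += size   # reallocate: copy every existing element
            capacity *= 2
        size += 1
    return copies

n = 1_000_000
copies = count_copies(n)
print(copies, copies / n)  # total copies < 2n, so ~O(1) per append
```

The copy total is the geometric sum 1 + 2 + 4 + … < 2n, which is where the amortized \(O(1)\) bound comes from.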

Space Complexity

Memory usage follows the same asymptotic notation as time complexity.

| Complexity | Example | Implication |
|---|---|---|
| \(O(1)\) | In-place sort (heapsort) | Fixed memory |
| \(O(\log n)\) | Recursive binary search stack | Small overhead |
| \(O(n)\) | Merge sort auxiliary array | Scales with input |
| \(O(n^2)\) | Naive DP table | May be prohibitive |
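The \(O(\log n)\) stack row can be made concrete by instrumenting a recursive binary search with a depth counter (the counter is an addition for illustration):

```python
# Recursive binary search: the recursion depth mirrors its O(log n)
# stack usage, since each call halves the remaining range.
def binary_search(xs, target, lo=0, hi=None, depth=1):
    if hi is None:
        hi = len(xs) - 1
    if lo > hi:
        return -1, depth
    mid = (lo + hi) // 2
    if xs[mid] == target:
        return mid, depth
    if xs[mid] < target:
        return binary_search(xs, target, mid + 1, hi, depth + 1)
    return binary_search(xs, target, lo, mid - 1, depth + 1)

xs = list(range(1_000_000))
idx, depth = binary_search(xs, 0)
print(depth)  # at most ceil(log2 n) + 1 stack frames
```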