Time & Space Complexity
Concept
Time complexity measures how the runtime of an algorithm scales with input size n. Space complexity measures memory usage. We use Big-O notation to describe the upper bound growth rate, ignoring constants and lower-order terms. This lets us compare algorithms and predict if they'll pass within time limits.
When to Use
- Choosing between multiple algorithms for a problem
- Estimating whether your solution will pass (typically ~10^8 operations/second)
- Identifying bottlenecks in your code
- Deciding between brute force and optimized approaches
Intuition
Think of n as the input size. O(1) means constant — the same speed regardless of n. O(n) means linear — double the input, double the time. O(n^2) means quadratic — double the input, quadruple the time. O(log n) means you halve the problem each step (like binary search). In contests, if n = 10^5, an O(n^2) solution does 10^10 operations — way too slow. You need O(n log n) or better.
Complexity
Time
N/A (this is the concept itself)
Space
N/A
Template Code
cpp
// Common complexities and their rough limits:
//
// O(1) — constant — any n
// O(log n) — binary search — any n
// O(n) — single loop — n <= 10^8
// O(n log n) — sorting, merge sort — n <= 10^6
// O(n^2) — nested loops — n <= 5000
// O(n^3) — triple loops — n <= 500
// O(2^n) — subsets — n <= 20
// O(n!) — permutations — n <= 10
// Rule of thumb: ~10^8 operations per second
// Examples:
// -------------------------------------------------
// O(n) — Linear scan
for(int i = 0; i < n; i++){
    // process element
}
// O(n^2) — All pairs
for(int i = 0; i < n; i++){
    for(int j = 0; j < n; j++){
        // process pair (i, j)
    }
}
// O(n log n) — Sorting
sort(v.begin(), v.end());
// O(log n) — Binary search (v must be sorted ascending)
int l = 0, r = (int)v.size() - 1;
while(l <= r){
    int mid = l + (r - l)/2;          // avoids overflow of (l + r)
    if(v[mid] == target) break;
    else if(v[mid] < target) l = mid + 1;
    else r = mid - 1;
}
// O(V + E) — Graph BFS/DFS
// O((V + E) log V) — Dijkstra with priority queue
// O(V * E) — Bellman-Ford
// O(V^3) — Floyd-Warshall
// O(n * W) — Knapsack DP