Growth & Complexity
Big-O is not about speed. It is about growth.
If your input grows from 10 to 1,000,000, does your solution scale gently, grow linearly, or explode?
🧠 Why Complexity Feels Intimidating
Because we think in small numbers.
When n = 10, everything feels fast. When n = 1,000,000, the same algorithm can collapse.
Complexity is not about how fast something runs today. It is about how it behaves under pressure.
When you first learn Big-O, it feels abstract. It feels like math. But it is simply asking: What happens when the input grows?
🧠 What Are We Measuring?
⏱ Time Complexity
Time does not mean seconds. It means how many operations your algorithm performs.
Every loop iteration, comparison, or function call is work.
If input size doubles, does the amount of work double? Quadruple? Stay the same?
```javascript
for (let i = 0; i < n; i++) {
  console.log(i)
}
```

If n = 10 → 10 operations. If n = 1,000 → 1,000 operations.
Time grows with n.
💾 Space Complexity
Space measures additional memory used by the algorithm.
Not the input itself, but the extra memory created while solving the problem.
```javascript
function copy(arr) {
  const result = []
  for (let i = 0; i < arr.length; i++) {
    result.push(arr[i])
  }
  return result
}
```

This uses extra memory proportional to n.
Space: O(n)
```javascript
function sum(arr) {
  let total = 0
  for (let i = 0; i < arr.length; i++) {
    total += arr[i]
  }
  return total
}
```

Here we only use one variable.
Space: O(1)
O(1) – Constant
🔍 What Is Actually Happening?
The cost does not depend on input size.
If n = 10 → 1 step. If n = 1,000 → 1 step. If n = 1,000,000 → still 1 step.
```javascript
const first = arr[0]
```
Time: O(1)
Space: O(1)
Frontend example:
```javascript
const user = userMap[id]
```
Direct lookup. No scanning.
O(log n) – Logarithmic
🔍 Structural Insight
We eliminate half the problem each step.
If n = 1,000,000:
- Step 1 → 500,000
- Step 2 → 250,000
- Step 3 → 125,000
That is the power of halving the problem repeatedly.
```typescript
function binarySearch(arr: number[], target: number): number {
  let left = 0
  let right = arr.length - 1
  while (left <= right) {
    const mid = Math.floor((left + right) / 2)
    if (arr[mid] === target) return mid
    if (arr[mid] < target) left = mid + 1
    else right = mid - 1
  }
  return -1
}
```

Time: O(log n)
If you can cut the search space, you escape linear growth.
O(n) – Linear
🔍 What Is Actually Happening?
We visit every element once.
If n = 10 → 10 steps. If n = 1,000 → 1,000 steps. If n = 1,000,000 → 1,000,000 steps.
```javascript
for (const item of arr) {
  console.log(item)
}
```

Time: O(n)
Linear growth feels safe until the input becomes massive.
Frontend example: Rendering 50,000 DOM nodes without virtualization.
O(n log n)
Slightly worse than linear, but still manageable at scale.
Common in efficient sorting algorithms.
If n = 1,000 → about 10,000 operations. More than linear, but far from quadratic.
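Merge sort is the classic example of this class: the array is halved O(log n) times, and each level of halves costs O(n) to merge. A minimal sketch:

```typescript
// Merge sort: O(log n) levels of splitting, O(n) merging work per level,
// giving O(n log n) overall.
function mergeSort(arr: number[]): number[] {
  if (arr.length <= 1) return arr
  const mid = Math.floor(arr.length / 2)
  const left = mergeSort(arr.slice(0, mid))
  const right = mergeSort(arr.slice(mid))
  // Merge two sorted halves in O(n)
  const merged: number[] = []
  let i = 0
  let j = 0
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++])
  }
  return merged.concat(left.slice(i), right.slice(j))
}
```

Each element is touched once per level, which is where the n × log n product comes from.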
O(n²) – Quadratic
🧠 Why This Explodes
We compare everything against everything.
If n = 10 → 100 operations. If n = 100 → 10,000 operations. If n = 1,000 → 1,000,000 operations.
```javascript
for (let i = 0; i < arr.length; i++) {
  for (let j = 0; j < arr.length; j++) {
    console.log(arr[i], arr[j])
  }
}
```

Time: O(n²)
Quadratic growth collapses quickly under scale.
O(2ⁿ) – Exponential
Recursive branching without eliminating repetition.
```typescript
function fib(n: number): number {
  if (n <= 1) return n
  return fib(n - 1) + fib(n - 2)
}
```

Time: O(2ⁿ)
Even small inputs become impossible very fast.
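The explosion comes from recomputing the same subproblems over and over. Caching each result (memoization) collapses the growth from O(2ⁿ) to O(n). A sketch, with the `memo` parameter as an illustrative detail:

```typescript
// Memoized Fibonacci: each subproblem is computed once and cached,
// so the O(2ⁿ) branching tree collapses to O(n) work.
function fibMemo(n: number, memo = new Map<number, number>()): number {
  if (n <= 1) return n
  const cached = memo.get(n)
  if (cached !== undefined) return cached
  const result = fibMemo(n - 1, memo) + fibMemo(n - 2, memo)
  memo.set(n, result)
  return result
}
```

This is also a preview of the time vs space trade-off: O(n) extra memory buys an exponential time saving.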
⚖️ Worst vs Average Case
Some structures behave differently depending on input.
- HashMap average: O(1)
- HashMap worst case: O(n)
Mature engineers consider both.
🔁 Amortized Analysis
```javascript
arr.push(value)
```
Occasionally expensive. Cheap on average.
Not every worst-case reflects real-world behavior.
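You can see the amortized O(1) by counting the work directly. The sketch below simulates a dynamic array that doubles its capacity when full (the `DynamicArray` class and its `copies` counter are illustrative, not a real API):

```typescript
// Simulating a doubling dynamic array. A resize copies every existing
// element, which is expensive, but resizes become exponentially rarer.
class DynamicArray {
  private capacity = 1
  private length = 0
  copies = 0 // total elements moved during resizes

  push(): void {
    if (this.length === this.capacity) {
      this.copies += this.length // resize: copy all current elements
      this.capacity *= 2
    }
    this.length++
  }
}

const dyn = new DynamicArray()
for (let i = 0; i < 1024; i++) dyn.push()
// Copies total 1 + 2 + 4 + ... + 512 = 1023, i.e. less than n.
console.log(dyn.copies)
```

Total copy work stays below n for n pushes, so each push costs O(1) on average even though an individual push can cost O(n).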
🔄 Time vs Space Trade-Off
```javascript
// O(n²): nested loops
// for i ... for j ...

// O(n): using memory
const map = new Map()
```
We often reduce time by increasing memory usage.
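A concrete (hypothetical) illustration: finding two numbers that sum to a target. Comparing every pair is O(n²) time with O(1) space; remembering seen values in a Map is O(n) space but O(n) time.

```typescript
// O(n) time, O(n) space: one pass, trading memory for speed.
// Returns the indices of a pair summing to target, or null.
function pairSum(arr: number[], target: number): [number, number] | null {
  const seen = new Map<number, number>() // value → index
  for (let i = 0; i < arr.length; i++) {
    const complement = target - arr[i]
    const j = seen.get(complement)
    if (j !== undefined) return [j, i] // complement was seen earlier
    seen.set(arr[i], i)
  }
  return null
}
```

The Map lookup replaces the inner loop: each element asks "have I already seen my complement?" in O(1) on average.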
🧠 Panic Reset Strategy
- Am I scanning more than once?
- Can I eliminate repetition?
- Can I reduce the search space?
- Can I trade memory for speed?
📊 Comparing Growth Visually
Let's compare how these grow as n increases:
- O(1) → stays flat
- O(log n) → grows slowly
- O(n) → grows steadily
- O(n log n) → slightly faster than linear
- O(n²) → explodes
- O(2ⁿ) → becomes impossible very fast
If n = 1,000:
- O(1) → 1 step
- O(log n) → ~10 steps
- O(n) → 1,000 steps
- O(n²) → 1,000,000 steps
This is why growth matters more than raw speed.
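The step counts above fall out of a quick calculation:

```typescript
// Rough operation counts for n = 1,000 under each growth rate.
const n = 1000
const logSteps = Math.ceil(Math.log2(n)) // O(log n) ≈ 10
const linearithmic = n * logSteps        // O(n log n) ≈ 10,000
const quadratic = n * n                  // O(n²) = 1,000,000
console.log(logSteps, linearithmic, quadratic)
```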
🧠 Mental Model Summary
- Big-O measures growth, not speed.
- Small inputs hide bad complexity.
- Cutting the problem space changes growth.
- Repeated scanning creates explosions.
- Good engineers think at scale.
Final Insight
Big-O does not measure speed.
It measures growth under scale.
When you understand growth, you stop fearing scale.