Introduction: Why Algorithms Matter
This chapter explains what algorithms are and why you should care, then walks you through measuring your first one.
The Short Version
An algorithm is a sequence of steps for solving a problem. A data structure is how you organize information so an algorithm can work on it efficiently. Together they decide whether your code runs in 10 milliseconds or 10 hours.
You don't need algorithms to write software. You do need them to write software that:
- Doesn't time out on real data.
- Passes a technical interview.
- Fits into memory.
- Makes sense to someone else reading it.
Most "bad performance" bugs are algorithm bugs, not language bugs. The Python list comprehension isn't slow; the nested loop is.
Why Care
Three concrete reasons.
Correct Ideas Scale
A hash table lookup stays fast whether you have 100 items or 100 million. A nested for-loop that works on 100 items turns your server into a space heater at 100 million.
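To make that contrast concrete, here is a small sketch (the `has_duplicate_*` functions are illustrative, not part of this tutorial's code) of the same question answered by a nested loop and by a set:

```python
def has_duplicate_nested(items: list[int]) -> bool:
    # Compares every pair: roughly n * n / 2 comparisons.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_set(items: list[int]) -> bool:
    # Each membership check and insert is (effectively) one step.
    seen: set[int] = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both return the same answers on 100 items. On 100 million, only the second finishes in reasonable time.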
The Interview Loop
Almost every technical interview asks algorithmic questions. You can argue about whether that's fair; the market has decided. Solid algorithm fundamentals make this tractable.
Reading Other People's Code
Git's bisect, Postgres's query planner, Redis's sorted sets, React's reconciler: they're all algorithms applied to specific domains. Recognizing the pattern helps you understand software you didn't write.
What Isn't an Algorithm (for This Tutorial)
- Not theorems (read CLRS for proofs).
- Not a specific programming language feature.
- Not a style guide.
- Not competitive programming's tricks of the week.
This tutorial teaches the algorithms you reach for in daily engineering and on interview whiteboards. Python is the language because it's compact enough to see the algorithm through the syntax.
Your First Algorithm (Linear Search)
Find a value in a list.
```python
def find(items: list[int], target: int) -> int:
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

print(find([5, 3, 7, 1], 7))   # 2
print(find([5, 3, 7, 1], 42))  # -1
```
Two things to notice:
- In the worst case, we scan every element.
- In the best case (target is first), we return after one step.
The worst case is what we analyze: the inputs we don't control.
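You can watch the gap between the two cases directly. A minimal timing sketch (absolute numbers will vary by machine; only the ratio matters):

```python
import timeit

def find(items: list[int], target: int) -> int:
    # Same linear search as above.
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

items = list(range(1_000_000))
best = timeit.timeit(lambda: find(items, 0), number=100)         # target is first
worst = timeit.timeit(lambda: find(items, 999_999), number=100)  # target is last
print(f"best case:  {best:.5f}s")
print(f"worst case: {worst:.3f}s")
```

The best case returns after one comparison; the worst case makes a million of them, every time.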
Measuring Cost
Reason about cost before writing. Measure to confirm.
```python
import timeit

items = list(range(1_000_000))
t1 = timeit.timeit(lambda: 999_999 in items, number=100)
print(f"'in' on list: {t1:.3f}s")

s = set(items)
t2 = timeit.timeit(lambda: 999_999 in s, number=100)
print(f"'in' on set: {t2:.5f}s")
```
Sample output:
```
'in' on list: 0.890s
'in' on set: 0.00004s
```
Same question, two data structures, enormous speed difference. The list scans every item; the set hashes and looks up in (effectively) one step. Chapter 3 covers hash tables properly.
This isn't a Python trick. It's the same in every language. The data structure you choose determines what costs what.
The Plan
Fundamentals (Chapters 1 to 2):
- What algorithms are, why they matter.
- Big-O: how to estimate cost without running the code.
Core (Chapters 3 to 8):
- The data structures you'll use every day.
- The algorithms that operate on them: searching, sorting, tree traversal, graph walks.
Advanced (Chapters 9 to 10):
- Dynamic programming: turning exponential problems into polynomial ones.
- Greedy and divide-and-conquer: when local decisions give global answers.
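As a tiny preview of what "exponential to polynomial" means (Fibonacci is a stand-in example here, not something this chapter covers), compare a naive recursion with a memoized one:

```python
from functools import lru_cache

def fib_naive(n: int) -> int:
    # Recomputes the same subproblems over and over: roughly 2^n calls.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    # Each subproblem is computed once and cached: about n calls total.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```

`fib_naive(40)` takes noticeable seconds; `fib_memo(40)` is instant. Same answer, different cost.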
Ecosystem (Chapter 11):
- The recurring patterns (two pointers, sliding window, backtracking) that solve a surprising fraction of all problems.
Mastery (Chapter 12):
- How to approach a new problem.
- Interview habits.
- Anti-patterns.
How to Get the Most Out of This
- Type the code. Reading is not learning. Typing is.
- Do problems. For each chapter, solve 3 to 5 related LeetCode problems.
- Don't rush. An algorithm understood is better than ten copy-pasted.
- Teach it back. Explaining a solution to a colleague (or a rubber duck) reveals the holes.
The tutorial is short. The practice is long. That's the deal.
Common Pitfalls
"I'll just Google the algorithm when I need it." Works when you know what to search for. Understanding the patterns tells you what to Google.
"Python's built-ins are fast enough." Usually, yes. Until they aren't, and you don't know why. Understanding the underlying algorithm is the difference.
"Big-O is academic; real systems have constants that matter." Both are true. Constants matter inside a class; classes matter across classes. O(n log n) with big constants still beats O(n^2) at scale.
"I know enough after one chapter on Big-O." You don't. Go do problems. Concepts land by contact.
"Learning Python equals learning algorithms." Python is the language. Algorithms are the ideas. Learn both.
Next Steps
Continue to 02-big-o.md to measure what your code actually costs.