## Abstract
This repository contains a structured study of discrete algorithms, focusing on their design, formal verification via induction/loop invariants, and asymptotic efficiency.
## Foundational Toolkits
Before diving into specific algorithms, these core tools provide the mathematical and procedural framework for analysis:
- Sum of an Arithmetic Series: The essential mathematical tool for deriving closed-form expressions for nested loops (like Bubble and Selection Sort). It explains why a shrinking inner loop results in `O(n^2)` work: the iterations sum to `1 + 2 + ... + (n-1) = n(n-1)/2`.
- Algorithm Checklist: Your step-by-step “source of truth” for analyzing any new algorithm. It ensures you never miss a step—from defining the input/output to proving correctness and extracting the recurrence.
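As a concrete check on the arithmetic-series claim above, here is a small sketch (illustrative, not code from this repository) that counts the comparisons a Bubble-Sort-style nested loop actually performs and matches them against the closed form:

```python
def nested_loop_comparisons(n: int) -> int:
    """Count the comparisons made by a Bubble-Sort-style nested loop."""
    count = 0
    for i in range(n - 1):          # passes 0 .. n-2
        for j in range(n - 1 - i):  # inner loop shrinks by one each pass
            count += 1
    return count

# The arithmetic series gives the closed form directly:
#   1 + 2 + ... + (n-1) = n(n-1)/2, which is O(n^2).
print(nested_loop_comparisons(10))  # 9 + 8 + ... + 1
```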
## Knowledge Map
### Algorithm Analysis
- Focuses on the foundational tools and notation used to measure the “goodness” and efficiency of code.
- Covers Asymptotic Notation (`O`, `Ω`, `Θ`) for growth rates, Loop Invariants for iterative correctness, and general Time Analysis.
- Essential for establishing a rigorous mathematical baseline to compare different algorithmic approaches.
- Provides the framework for understanding how resource consumption scales with input size.
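To make the loop-invariant idea concrete, here is a minimal sketch (the function is an assumed example, not from the repository) where the invariant is asserted before every iteration:

```python
def max_element(a):
    """Find the maximum of a non-empty list, checking the loop invariant."""
    best = a[0]
    for i in range(1, len(a)):
        # Invariant: at the start of iteration i, best == max(a[0:i]).
        # Initialization (i=1), maintenance (the if below), and
        # termination (i == len(a)) together prove correctness.
        assert best == max(a[:i])
        if a[i] > best:
            best = a[i]
    return best
```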
### Recursive Algorithms
- Explores the transition from simple iterative loops to self-referential logic and complex problem decomposition.
- Includes specialized sub-directories for the Divide and Conquer paradigm and for Solving Recurrences in Closed Form (via the Master Theorem, Unraveling, or HRRCC).
- Covers Recursive Proofs and mathematical approximations like Stirling’s Approximation for factorial growth.
- Critical for solving problems that exhibit optimal substructure and overlapping subproblems.
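The recurrence-unraveling idea can be sketched by evaluating the Merge Sort recurrence directly (a toy calculation, assuming `T(1) = 0` and `n` a power of two):

```python
def T(n: int) -> int:
    """Unravel T(n) = 2*T(n/2) + n, T(1) = 0: the Merge Sort recurrence,
    counting the linear merge work at each level of recursion."""
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n

# Master Theorem, case 2 (f(n) = n matches n^(log_2 2)): T(n) = Theta(n log n).
# Each of the log2(n) levels contributes exactly n units of merge work.
print(T(8))  # 8 * log2(8) = 24
```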
### Searching
- A systematic study of finding specific elements within data sets under varying constraints.
- Contrasts Linear Search (`O(n)`) with the optimized Binary Search (`O(log n)`) to demonstrate the power of data organization.
- Explores the Lower Bounds for Searching, providing the theoretical proof that comparison-based searching cannot beat logarithmic time.
- Serves as a practical application of how data properties (like being sorted) drastically change algorithmic complexity.
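The sorted-data payoff above is exactly what Binary Search exploits. A standard sketch (one of several equivalent formulations; not necessarily the repository's exact version):

```python
def binary_search(a, target):
    """O(log n) search over a sorted list; returns an index or -1.
    Each iteration halves the interval [lo, hi] that can hold target."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1
```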
### Sorting
- Focuses on the systematic organization of data to enable faster downstream operations.
- Categorizes methods into Iterative approaches (Bubble, Selection, and Insertion Sort) and links to recursive strategies like Merge Sort.
- Important for understanding the trade-offs between algorithm simplicity, memory overhead, and computational speed.
- Provides the operational logic for optimizing data retrieval and processing pipelines.
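As a representative of the iterative methods listed above, here is an Insertion Sort sketch (illustrative, not the repository's exact code); its shifting inner loop is another instance of the arithmetic-series analysis:

```python
def insertion_sort(a):
    """In-place insertion sort: O(n^2) worst case, O(n) on sorted input."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right to open a slot for key;
        # invariant: a[0:i] is sorted at the start of each pass.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```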
## Quick Complexity Reference
| Class | Algorithm Example | Growth Rate |
|---|---|---|
| Logarithmic | Binary Search | O(log n) |
| Linear | Linear Search | O(n) |
| Log-Linear | Merge Sort | O(n log n) |
| Quadratic | Selection Sort | O(n^2) |
| Sub-Exponential | Karatsuba Multiplication | O(n^1.585) |
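The Karatsuba entry can be made concrete with a minimal sketch for non-negative integers (string-based digit splitting is for clarity, not efficiency): replacing the four half-size products of schoolbook multiplication with three gives `T(n) = 3T(n/2) + O(n) = O(n^log2(3)) ≈ O(n^1.585)`.

```python
def karatsuba(x: int, y: int) -> int:
    """Multiply non-negative integers with 3 recursive half-size
    products instead of 4 (Karatsuba's trick)."""
    if x < 10 or y < 10:                       # base case: single digit
        return x * y
    m = max(len(str(x)), len(str(y))) // 2     # split point (in digits)
    p = 10 ** m
    xh, xl = divmod(x, p)                      # x = xh*p + xl
    yh, yl = divmod(y, p)                      # y = yh*p + yl
    low = karatsuba(xl, yl)
    high = karatsuba(xh, yh)
    mid = karatsuba(xl + xh, yl + yh) - low - high  # the saved product
    return high * p * p + mid * p + low
```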
## Related Toolkits
- Counting: Combinatorial foundations used to determine the size of search spaces and the number of possible permutations.
- Probability: Essential for analyzing randomized algorithms and determining average-case complexity.
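A small illustration (an assumed example, not taken from the toolkits themselves) of how counting sizes a search space: there are `n!` orderings of `n` items, so a comparison-based sort needs at least `ceil(log2(n!))` comparisons in the worst case, which is the information-theoretic route to the `Ω(n log n)` lower bound.

```python
import math

# n! distinct orderings: the search space a sorting algorithm
# must be able to distinguish.
n = 8
orderings = math.factorial(n)

# Each comparison has 2 outcomes, so a decision tree of depth d
# distinguishes at most 2^d orderings: d >= ceil(log2(n!)).
lower_bound = math.ceil(math.log2(orderings))
print(orderings, lower_bound)
```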