Algorithmic Design Techniques
- Incremental technique: build the solution to a big problem by extending solutions to smaller ones
- Recursive technique: reduce a big problem to smaller subproblems of the same kind
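As an illustrative sketch (not from the notes), the two techniques can be contrasted with two classic sorts: insertion sort grows a solved prefix one element at a time (incremental), while merge sort splits the input and recurses (recursive).

```python
def insertion_sort(a):
    """Incremental: extend a sorted prefix by one element per iteration."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right to make room for key
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    """Recursive: split the big problem into two smaller subproblems."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```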
Algorithmic Analysis
Correctness proof
- $\forall x\in X$, the algorithm produces the correct output
- Correctness on many instances is insufficient
Efficiency
$f(n)$ is the true growth rate of complexity
Big-$O$ notation:
- $f(n)\in O(g(n))$ if there exists a constant $c>0$ such that $f(n)\le cg(n)$ for all large enough $n$ ($n\ge n_0$)
- E.g., an algorithm is $O(n^3)$ if there is a constant $c$ such that $f(n)\le c\cdot n^3$ for large enough $n$
Big-$\Theta$ notation:
- $f(n)\in \Theta(g(n))$ if there exist constants $c,c'>0$ such that $c'g(n)\le f(n)\le cg(n)$ for all large enough $n$ ($n\ge n_0$)
Big-$\Omega$ notation:
- $f(n)\in \Omega(g(n))$ if there exists a constant $c'>0$ such that $f(n)\ge c'g(n)$ for all large enough $n$ ($n\ge n_0$)
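A Big-$O$ claim can be sanity-checked numerically by exhibiting a witness pair $(c, n_0)$. The sketch below (my own example, not from the notes) verifies that $f(n)=3n^2+5n$ is $O(n^2)$ with $c=4$ and $n_0=5$, since $3n^2+5n\le 4n^2$ whenever $n\ge 5$.

```python
def f(n):
    # Example function: true running time 3n^2 + 5n
    return 3 * n**2 + 5 * n

def g(n):
    # Candidate bound: n^2
    return n**2

# Witness constants: f(n) <= c * g(n) must hold for all n >= n0
c, n0 = 4, 5

# Check the inequality over a large range of n
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))

# The bound deliberately fails below n0: f(1) = 8 > 4 = c * g(1)
assert f(1) > c * g(1)
```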



<aside>
💡 Exam tips:
- Write down the $O(\cdot)$ for every line
- Then add them all up to get the final $O(\cdot)$

For the $n$th line of the code:
- Write down the constant factor ($c_n$) and the frequency ($f_n$) of execution
- Execution time = $c_n\cdot f_n$
- E.g., $c_{10}\cdot n$
Total efficiency:
- $\sum_i c_i\cdot f_i$
</aside>
Example of Asymptotic Running Time Analysis
Selection-Sort(Array A, Integer n)
1. for integer i ← 1 to n-1 do
2.     k ← i
3.     for integer j ← i+1 to n do
4.         if A[j] < A[k] then
5.             k ← j
6.         end if
7.     end for
8.     swap A[i] and A[k]
9. end for
- Line 1: $O(n)$
- Line 2: $O(n)$
- Line 3: The inner loop shrinks as $i$ increases, so its body executes $\sum_{i=1}^{n-1} (n-i) = \frac{n(n-1)}{2} = O(n^2)$ times in total
- Lines 4 & 5: Execute at most as often as Line 3, so also $O(n^2)$
- Line 8: $O(n)$
Aggregated Time Complexity:
$O(n) + O(n) + O(n^2) + O(n^2) + O(n) = O(n^2)$
Thus, the overall time complexity is $O(n^2)$
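The analysis above can be confirmed empirically. This Python sketch mirrors the pseudocode (adapted to 0-based indexing) and counts inner-loop comparisons, which should total exactly $\frac{n(n-1)}{2}$:

```python
def selection_sort(a):
    """Selection-Sort following the pseudocode above (0-indexed).
    Returns the sorted list and the number of comparisons performed."""
    a = list(a)
    n = len(a)
    comparisons = 0
    for i in range(n - 1):            # line 1: outer loop
        k = i                         # line 2: assume A[i] is the minimum
        for j in range(i + 1, n):     # line 3: scan the unsorted suffix
            comparisons += 1
            if a[j] < a[k]:           # line 4: found a smaller element
                k = j                 # line 5: remember its index
        a[i], a[k] = a[k], a[i]       # line 8: swap minimum into place
    return a, comparisons

# For n = 5 the inner loop runs 4 + 3 + 2 + 1 = n(n-1)/2 = 10 times
result, comps = selection_sort([5, 2, 4, 1, 3])
assert result == [1, 2, 3, 4, 5]
assert comps == 5 * 4 // 2
```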