I’ll be posting my solutions to Introduction to Algorithms here, to keep myself motivated and to force myself to keep my solutions semi-legible.

Notes:

Master Theorem

(Theorem 4.1 of CLRS)

For \(a \ge 1, b > 1\), \(f(n)\) a function, let \(T(n)\) be defined by

\[ T(n) = a T(n / b) + f(n). \]

(This is the recurrence when we have \(a\) subproblems, each \(1/b\) the size of the original, and \(f(n)\) work to do at each step.)

Then

\[ T(n) = \begin{cases} \Theta(n^{\log_b a}) & \text{if } f(n) = O(n^{\log_b a - \epsilon}), \text{ some } \epsilon > 0,\\ \Theta(n^{\log_b a}\lg n) & \text{if } f(n) = \Theta(n^{\log_b a}), \\ \Theta(f(n)) & \text{if } f(n) = \Omega(n^{\log_b a + \epsilon}), \text{ some } \epsilon > 0.\\ \end{cases} \]

The last case additionally requires a regularity condition: there is some \(c < 1\) such that \(a \cdot f(n / b) \le c \cdot f(n)\) for all \(n \ge N\), for some \(N\).
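
As a quick check, merge sort’s recurrence (which appears below) is \(T(n) = 2T(n/2) + \Theta(n)\): here \(a = b = 2\), so \(n^{\log_b a} = n\), and \(f(n) = \Theta(n) = \Theta(n^{\log_b a})\), putting us in the second case:

\[ T(n) = \Theta(n \lg n). \]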

Some of the derivation steps may be useful for other problems.

For example, the height of the recursion tree is \(\log_b n\), so the width of the bottom level is \(a^{\log_b n} = n^{\log_b a}\) leaves. That is roughly how they produce Lemma 4.2,

\[ T(n) = \Theta(n^{\log_b a}) + \sum_{j = 0}^{\log_b n - 1} a^j f(n/b^j). \]

divide and conquer

\[ T(n) = aT(n/b) + D(n) + C(n). \]

(\(a\) subproblems, each a factor of \(b\) smaller; \(D(n)\) is the time to divide the problem and \(C(n)\) the time to combine the solutions.)
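
As a toy illustration (my own example, not from the book): finding the maximum of an array by divide and conquer has \(a = 2\), \(b = 2\), and \(D(n) = C(n) = \Theta(1)\).

```python
def dc_max(A, lo, hi):
    """Maximum of A[lo..hi] by divide and conquer."""
    if lo == hi:                      # base case: a single element
        return A[lo]
    mid = (lo + hi) // 2              # divide: D(n) = Theta(1)
    left = dc_max(A, lo, mid)         # a = 2 subproblems, each half the size
    right = dc_max(A, mid + 1, hi)
    return max(left, right)           # combine: C(n) = Theta(1)

print(dc_max([3, 1, 4, 1, 5, 9, 2, 6], 0, 7))  # 9
```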

backtracking

this is where you have functions to check the first child of a node and the next sibling of a child, checking whether each results in a candidate to accept or reject. if you descend fully and accept, you can return the candidate as output (multiple outputs can be returned). once you have checked all children, you backtrack to the parent and ask for its next sibling.

the N-queens problem is a standard example; a sketch follows.
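
Here is a minimal backtracking sketch for N-queens in Python (the function names are my own, not CLRS’s). Each row of the board is a level of the search tree: trying the next column in the current row is “next sibling”, and moving on to the next row is “first child”.

```python
def solve_n_queens(n):
    """Return all solutions as lists of column indices, one per row."""
    solutions = []
    cols = []  # cols[r] = column of the queen placed in row r

    def safe(row, col):
        # Reject if any earlier queen shares a column or a diagonal.
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols))

    def place(row):
        if row == n:                # descended fully: accept the candidate
            solutions.append(cols.copy())
            return
        for col in range(n):        # the loop advances to the "next sibling"
            if safe(row, col):
                cols.append(col)
                place(row + 1)      # descend to the "first child"
                cols.pop()          # backtrack to the parent

    place(0)
    return solutions

print(len(solve_n_queens(8)))  # 92
```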

other notes

general strategies

  • divide and conquer
  • backtracking (typically exponential, but Wikibooks describes it as a stepping stone to other concepts)
  • backtracking is pre-order DFS
  • dynamic programming

trees

  • AVL and RB trees have O(lg n) time for lookup, insert, and delete.
  • restoring the AVL/RB property after an insert or delete takes O(1) amortized time for RB trees.
  • for lots of lookups, use AVL; for lots of inserts, use RB.

graphs

  • Dijkstra’s algorithm for shortest paths is greedy in spirit: it always settles the closest remaining vertex (a sketch follows below).
  • breadth-first search can be viewed as a special case of Dijkstra’s, with every edge weight equal to 1.
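
A minimal Dijkstra sketch in Python using `heapq` (the dict-of-adjacency-lists graph format is my own choice for illustration). The greedy step is popping the unsettled vertex with the smallest tentative distance; with unit edge weights, the priority queue degenerates into BFS’s FIFO queue.

```python
import heapq

def dijkstra(graph, source):
    """graph: {u: [(v, weight), ...]}. Returns shortest distances from source."""
    dist = {source: 0}
    pq = [(0, source)]                        # (tentative distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)              # greedy: closest unsettled vertex
        if d > dist[u]:
            continue                          # stale queue entry; skip it
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w               # relax the edge (u, v)
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(dijkstra(g, 'a'))  # {'a': 0, 'b': 1, 'c': 3}
```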

tree traversal

  • depth first search
    • stack, recursive
    • in-order, pre-order, post-order
  • breadth-first search
    • queue, corecursive
    • (level-order)
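
The four orders above, sketched in Python (the `Node` class is my own minimal stand-in):

```python
from collections import deque

class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def preorder(t):    # node, left, right
    return [t.val] + preorder(t.left) + preorder(t.right) if t else []

def inorder(t):     # left, node, right
    return inorder(t.left) + [t.val] + inorder(t.right) if t else []

def postorder(t):   # left, right, node
    return postorder(t.left) + postorder(t.right) + [t.val] if t else []

def levelorder(t):  # breadth-first, driven by a FIFO queue
    out, q = [], deque([t] if t else [])
    while q:
        n = q.popleft()
        out.append(n.val)
        q.extend(c for c in (n.left, n.right) if c)
    return out

t = Node(2, Node(1), Node(3))
print(preorder(t), inorder(t), postorder(t), levelorder(t))
# [2, 1, 3] [1, 2, 3] [1, 3, 2] [2, 1, 3]
```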

sorting

heapsort sorts in place

insertion sort

For each \(i\) from \(1\) to \(n - 1\): set \(x = A[i]\) and \(j = i - 1\); while \(j \ge 0\) and \(A[j] > x\), set \(A[j + 1] = A[j]\) and decrement \(j\); then set \(A[j + 1] = x\).

\(T(n) = O(n^2).\)
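
A direct translation to Python (0-indexed, matching the description above):

```python
def insertion_sort(A):
    """Sort A in place by growing a sorted prefix A[0..i-1]."""
    for i in range(1, len(A)):
        x = A[i]                      # the element to insert
        j = i - 1
        while j >= 0 and A[j] > x:    # shift larger elements one slot right
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = x                  # drop x into the gap
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```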

merge sort

MERGE(A, p, q, r)

make new arrays L and R from \(A[p..q]\) and \(A[q+1..r]\), then merge them back into \(A[p..r]\) by comparing the smallest remaining elements of L and R at each step.

MERGE-SORT(A, p, r)

if \(p < r\), set \(q = \lfloor (p + r) / 2 \rfloor\), apply MERGE-SORT recursively to each half, and then call MERGE(A, p, q, r) to combine them.

\(T(n) = O(n \lg n).\)
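
A Python sketch of both procedures (0-indexed, and using explicit bounds checks instead of CLRS’s sentinel values):

```python
def merge(A, p, q, r):
    """Merge the sorted runs A[p..q] and A[q+1..r] back into A[p..r]."""
    L, R = A[p:q + 1], A[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # Take the smaller head; fall through to R once L is exhausted.
        if i < len(L) and (j >= len(R) or L[i] <= R[j]):
            A[k] = L[i]; i += 1
        else:
            A[k] = R[j]; j += 1

def merge_sort(A, p, r):
    if p < r:
        q = (p + r) // 2
        merge_sort(A, p, q)        # sort the left half
        merge_sort(A, q + 1, r)    # sort the right half
        merge(A, p, q, r)          # combine

A = [5, 2, 4, 7, 1, 3, 2, 6]
merge_sort(A, 0, len(A) - 1)
print(A)  # [1, 2, 2, 3, 4, 5, 6, 7]
```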

heap sort

MAX-HEAPIFY(A, i)

compare the node at \(i\) to its left and right children, pick the largest of the three, and swap it into position \(i\). If there was a swap, recurse on the affected child.

The recurrence is apparently \(T(n) \le T(2n / 3) + \Theta(1)\), which gives \(T(n) = O(\lg n)\), or \(O(h)\) for a node at height \(h\).
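
A Python sketch (0-indexed, so the children of \(i\) are \(2i + 1\) and \(2i + 2\) rather than \(2i\) and \(2i + 1\); the heap size is passed explicitly):

```python
def max_heapify(A, i, heap_size):
    """Sift A[i] down until the subtree rooted at i is a max-heap,
    assuming both child subtrees already are max-heaps."""
    left, right = 2 * i + 1, 2 * i + 2
    largest = i
    if left < heap_size and A[left] > A[largest]:
        largest = left
    if right < heap_size and A[right] > A[largest]:
        largest = right
    if largest != i:
        A[i], A[largest] = A[largest], A[i]   # swap with the larger child
        max_heapify(A, largest, heap_size)    # recurse where the old root went
```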

BUILD-MAX-HEAP(A)

start from \(i = \lfloor A.length/2 \rfloor\) (the last node with a child) and, working down to \(1\), call MAX-HEAPIFY(A, i).

They show this has time \(O(n)\).
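
The key step of their argument: a heap has at most \(\lceil n / 2^{h+1} \rceil\) nodes at height \(h\), each costing \(O(h)\) to heapify, and the weighted sum is linear:

\[ \sum_{h = 0}^{\lfloor \lg n \rfloor} \left\lceil \frac{n}{2^{h+1}} \right\rceil O(h) = O\left( n \sum_{h = 0}^{\infty} \frac{h}{2^h} \right) = O(n), \]

since \(\sum_{h=0}^{\infty} h / 2^h = 2\).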

HEAPSORT(A)

it first calls BUILD-MAX-HEAP(A); then, starting from the tail of A, it exchanges that element with \(A[1]\) (the largest in the heap), decrements the heap size by one, and calls MAX-HEAPIFY(A, 1). this continues until the heap size is 1, that is, down to index 2.

\(T(n) = O(n \lg n).\)
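
Putting the pieces together in Python (again 0-indexed; MAX-HEAPIFY is repeated from the sketch above so this block runs standalone):

```python
def max_heapify(A, i, heap_size):
    # Same sift-down as the MAX-HEAPIFY sketch above.
    left, right, largest = 2 * i + 1, 2 * i + 2, i
    if left < heap_size and A[left] > A[largest]:
        largest = left
    if right < heap_size and A[right] > A[largest]:
        largest = right
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def heapsort(A):
    # BUILD-MAX-HEAP: heapify from the last non-leaf down to the root.
    for i in range(len(A) // 2 - 1, -1, -1):
        max_heapify(A, i, len(A))
    # Repeatedly move the max to the tail and shrink the heap.
    for end in range(len(A) - 1, 0, -1):
        A[0], A[end] = A[end], A[0]
        max_heapify(A, 0, end)
    return A

print(heapsort([4, 1, 3, 2, 16, 9, 10, 14, 8, 7]))
# [1, 2, 3, 4, 7, 8, 9, 10, 14, 16]
```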