3.1-1

Let \(f(n)\) and \(g(n)\) be asymptotically nonnegative functions. Using the basic definition of \(\Theta\)-notation, prove that \(\max(f(n), g(n)) = \Theta(f(n) + g(n)).\)

Sol’n

Since the maximum and the sum sandwich each other,

\[ \frac{f(n) + g(n)}{2} \le \max(f(n), g(n)) \le f(n) + g(n) \]

for all \(n\) large enough that both functions are nonnegative, we may take \(c_1 = 1/2\) and \(c_2 = 1\) in the definition of \(\Theta\)-notation, which gives \(\max(f(n), g(n)) = \Theta(f(n) + g(n)).\)
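As a quick numeric sanity check of the two inequalities (the particular `f` and `g` below are arbitrary sample functions, an assumption for illustration only):

```python
# Check (f(n) + g(n)) / 2 <= max(f(n), g(n)) <= f(n) + g(n)
# for two sample nonnegative functions (not from the exercise itself).

def f(n):
    return n * n        # sample: f(n) = n^2

def g(n):
    return 10 * n       # sample: g(n) = 10n

for n in range(1, 100):
    m = max(f(n), g(n))
    s = f(n) + g(n)
    assert s / 2 <= m <= s
```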

3.1-2

Show that for any real constants \(a\) and \(b\), where \(b > 0\),

\[ (n + a)^b = \Theta(n^b). \]

Sol’n

We want to show that there are \(c_1, c_2, N\), such that for any \(n \ge N,\)

\[ 0 \le c_1 n^b \le (n + a)^b \le c_2 n^b. \]

To prove this, we first consider the fact that

\[ \lim_{n\to\infty}\left(\frac{n + a}{n}\right)^b = 1. \]

(This follows quickly from the limit laws and the fact that \(\lim_{n\to\infty} 1/n = 0\).)

Then by definition of limit, we have that for any \(\varepsilon > 0,\) that there is some \(N\) such that for any \(n \ge N,\)

\[ \left|\left(\frac{n + a}{n}\right)^b - 1\right| \le \varepsilon. \]

Choosing \(\varepsilon = 1/2,\) we obtain, as desired, some \(N\) (which we may also take large enough that \(n + a > 0\)) such that for any \(n \ge N,\)

\[ 0 \le \frac{1}{2}n^b \le (n + a)^b \le \frac{3}{2}n^b. \]
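A numeric sanity check of this sandwich, with sample constants \(a\) and \(b\) chosen here as an assumption rather than taken from the text:

```python
# Check that (n + a)^b lies between (1/2) n^b and (3/2) n^b once n is large.
# The constants a, b, and the threshold N are illustrative choices.

a, b = -5.0, 2.5
N = 100                      # large enough for these sample constants
for n in range(N, N + 1000):
    lower = 0.5 * n**b
    middle = (n + a)**b
    upper = 1.5 * n**b
    assert lower <= middle <= upper
```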

3.1-3

Explain why the statement, “The running time of algorithm \(A\) is at least \(O(n^2)\),” is meaningless.

Sol’n

The statement is meaningless because \(O\)-notation provides only an upper bound, so "at least \(O(n^2)\)" constrains nothing: every function \(f\) is "at least" some function in \(O(n^2).\) Indeed, consider

\[ g(n) = \min\{f(n), n^2\}; \]

then \(f(n) \ge g(n)\) for all \(n,\) and \(g(n) \le n^2\) shows \(g \in O(n^2).\)
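The construction above can be illustrated numerically; the fast-growing sample `f` below is an assumption, not from the text:

```python
# For any f, the function g(n) = min(f(n), n^2) satisfies f(n) >= g(n)
# and g(n) <= 1 * n^2, so f is "at least" something in O(n^2).
# The choice f(n) = 2^n is just a sample fast-growing function.

def f(n):
    return 2 ** n

def g(n):
    return min(f(n), n * n)

for n in range(1, 50):
    assert f(n) >= g(n)        # f is at least g ...
    assert g(n) <= n * n       # ... and g is in O(n^2) with c = 1
```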

3.1-4

Is \(2^{n + 1} = O(2^n)\)? Is \(2^{2^n} = O(2^n)\)?

Sol’n

For the first, we have \(2^{n + 1} = 2\cdot 2^n \le c\cdot 2^n\) for \(c = 2\) and all \(n \ge 1.\)

For the latter, suppose there were a constant \(c\) such that \(2^{2^n} \le c\cdot 2^n\) for all sufficiently large \(n.\)

Taking \(\lg\) of both sides gives \(2^n \le \lg c + n,\) which fails for large \(n,\) since \(2^n\) outgrows any linear function of \(n.\)
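Both claims are easy to observe numerically: \(2^{n+1}\) is exactly twice \(2^n,\) while the ratio \(2^{2^n} / 2^n\) blows up, so no single constant \(c\) can bound it.

```python
# 2^(n+1) = 2 * 2^n, so c = 2 witnesses 2^(n+1) = O(2^n).
for n in range(1, 10):
    assert 2 ** (n + 1) == 2 * 2 ** n

# By contrast, 2^(2^n) / 2^n grows without bound (strictly increasing here).
ratios = [2 ** (2 ** n) / 2 ** n for n in range(1, 6)]
assert all(r1 < r2 for r1, r2 in zip(ratios, ratios[1:]))
```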

3.1-5

Prove Theorem 3.1.

Sol’n

For convenience, we reproduce the theorem here.

Theorem 3.1

For any two functions \(f(n)\) and \(g(n)\), we have \(f(n) = \Theta(g(n))\) if and only if \(f(n) = O(g(n))\) and \(f(n) = \Omega(g(n)).\)

Proof:

The statement \(f(n) = \Theta(g(n)),\) is equivalent to the statement that there are \(c_1, c_2, N\) such that for any \(n \ge N,\) we have

\[0 \le c_1 g(n) \le f(n) \le c_2 g(n).\]

This is in turn equivalent to the statement that \(f(n) = \Omega(g(n))\) and \(f(n) = O(g(n)).\)

(For one direction, we assume the existence of \(N_1, N_2\) and take \(N = \max\{N_1, N_2\}\).)

3.1-6

Prove that the running time of an algorithm is \(\Theta(g(n))\) if and only if its worst-case running time is \(O(g(n))\) and its best-case running time is \(\Omega(g(n)).\)

Sol’n

On any input of size \(n,\) the algorithm's running time \(T\) satisfies \(B(n) \le T \le W(n),\) where \(B\) and \(W\) denote the best-case and worst-case running times.

If \(W(n) = O(g(n))\) and \(B(n) = \Omega(g(n)),\) there are positive constants \(c_1, c_2\) and thresholds \(N_1, N_2\) with \(B(n) \ge c_1 g(n)\) for \(n \ge N_1\) and \(W(n) \le c_2 g(n)\) for \(n \ge N_2.\) Then for \(n \ge \max\{N_1, N_2\},\)

\[ 0 \le c_1 g(n) \le B(n) \le T \le W(n) \le c_2 g(n), \]

so the running time is \(\Theta(g(n)).\)

Conversely, if the running time is \(\Theta(g(n)),\) then in particular the worst-case running time satisfies \(W(n) \le c_2 g(n),\) so \(W(n) = O(g(n)),\) and the best-case running time satisfies \(B(n) \ge c_1 g(n),\) so \(B(n) = \Omega(g(n)).\)

3.1-7

Prove that \(o(g(n)) \cap \omega(g(n))\) is the empty set.

3.1-8

We can extend our notation to the case of two parameters \(n\) and \(m\) that can go to infinity independently at different rates. For a given function \(g(n, m)\), we denote by \(O(g(n, m))\) the set of functions

\[ O(g(n, m)) = \{f(n, m) : \text{there exist positive constants } c, n_0, \text{ and } m_0 \text{ such that } 0 \le f(n, m) \le c\,g(n, m) \text{ for all } n \ge n_0 \text{ or } m \ge m_0\}. \]

Give corresponding definitions for \(\Omega(g(n, m))\) and \(\Theta(g(n, m)).\)