There are five standard notations for describing asymptotic growth. Each one bounds a function's growth rate for sufficiently large inputs; when applied to running time, they are most often used to bound the worst case. They are as follows:

- $O(f(n))$ - Asymptotic upper bound
- Function is at most a constant multiple of $f(n)$ for all sufficiently large $n$

- $\Omega(f(n))$ - Asymptotic lower bound
- Function is at least a constant multiple of $f(n)$ for all sufficiently large $n$

- $\Theta(f(n))$ - Tight bound
- Function is both $O(f(n))$ and $\Omega(f(n))$

- $o(f(n))$ - Strict upper bound
- Function is $O(f(n))$ but not $\Omega(f(n))$: it eventually falls below $c \cdot f(n)$ for every constant $c > 0$

- $\omega(f(n))$ - Strict lower bound
- Function is $\Omega(f(n))$ but not $O(f(n))$: it eventually exceeds $c \cdot f(n)$ for every constant $c > 0$
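
A quick numerical sketch (using a hypothetical $f(n) = 2n^2 + 3n$) can illustrate these definitions: the ratio $f(n)/n^2$ settles between constants (so $f = \Theta(n^2)$), $f(n)/n^3$ tends to $0$ (so $f = o(n^3)$), and $f(n)/(n \log n)$ grows without bound (so $f = \omega(n \log n)$).

```python
import math

def f(n):
    # hypothetical example function: f(n) = 2n^2 + 3n
    return 2 * n**2 + 3 * n

for n in [10, 100, 1000, 10000]:
    theta_ratio = f(n) / n**2            # stays near the constant 2: Theta(n^2)
    little_o_ratio = f(n) / n**3         # tends to 0: o(n^3)
    little_omega_ratio = f(n) / (n * math.log(n))  # grows without bound: omega(n log n)
    print(n, theta_ratio, little_o_ratio, little_omega_ratio)
```

Checking ratios like this is not a proof, but it is a useful sanity check before attempting one.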

### Proof Methods

- Prove by obvious bound
- To show an upper or lower bound on an explicit function, bound every term of the LHS by *at most* the largest-degree term, then read off the constant

- Prove that a bound does not apply (for $\omega$ or $o$)
- Use proof by contradiction: assume the constant $c$ and threshold $n_0$ exist, then derive an impossibility

- You can manipulate both sides of the inequality
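
As a worked example of the obvious-bound method, for a hypothetical $f(n) = 3n^2 + 5n + 2$ we can bound every term by the largest-degree term:

$$3n^2 + 5n + 2 \le 3n^2 + 5n^2 + 2n^2 = 10n^2 \quad \text{for all } n \ge 1,$$

so $f(n) = O(n^2)$ with $c = 10$, $n_0 = 1$. Since also $f(n) \ge 3n^2$ for all $n \ge 0$, we have $f(n) = \Omega(n^2)$, and therefore $f(n) = \Theta(n^2)$.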

### Properties of Growth Functions

- If $f(n) = O(g(n))$, and $g(n) = O(h(n))$, then $f(n) = O(h(n))$
- Upper bound of upper bound still applies

- If $f(n) = \Omega(g(n))$, and $g(n) = \Omega(h(n))$, then $f(n) = \Omega(h(n))$
- Lower bound of lower bound still applies

- If $f(n) = \Theta(g(n))$, and $g(n) = \Theta(h(n))$, then $f(n) = \Theta(h(n))$
- Tight bound of tight bound still applies

- If $f(n) = O(h(n))$ and $g(n) = O(h(n))$, then $f(n)+g(n) = O(h(n))$
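
The sum rule is what lets sequential code fragments be analyzed independently and their bounds combined. A minimal sketch (with hypothetical nested loops, counting the exact number of inner-loop steps):

```python
def count_ops(n):
    # two sequential phases, each with an O(n^2) step count
    ops = 0
    for i in range(n):        # phase 1: n(n-1)/2 steps, which is O(n^2)
        for j in range(i):
            ops += 1
    for i in range(n):        # phase 2: n^2 steps, also O(n^2)
        for j in range(n):
            ops += 1
    return ops                # total: 1.5n^2 - 0.5n, still O(n^2)
```

By the sum rule, the total remains $O(n^2)$: the ratio `count_ops(n) / n**2` stays below a constant (here, below 2) for all $n \ge 1$.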

Here are some bounds of common functions:

- For a polynomial of degree $d$ (with positive leading coefficient), $f(n) = O(n^d)$
- For every $b > 1$ and any real numbers $x, y > 0$, $(\log_b n)^x = O(n^y)$: every polylogarithm grows slower than every polynomial
- For every $r > 1$ and every $d > 0$, $n^d = O(r^n)$: every polynomial grows slower than every exponential
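
These dominance relations only take hold for fairly large $n$, which a quick numerical spot-check (with arbitrarily chosen exponents and bases) makes concrete:

```python
import math

# (log_b n)^x = O(n^y): the polylog is eventually dominated, even by a small power
n = 10**12
polylog_over_root = math.log2(n) ** 3 / n ** 0.5   # (log2 n)^3 vs n^(1/2)
print(polylog_over_root)                            # already well below 1

# n^d = O(r^n): the polynomial is eventually dominated, even for r close to 1
m = 500
poly_over_exp = m**3 / 1.1**m                       # n^3 vs 1.1^n
print(poly_over_exp)                                # vanishingly small
```

Note that for small $n$ the polylog or polynomial can be much larger; the $O$ relation only promises domination beyond some threshold $n_0$.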