Complexity

To analyze the performance of algorithms, we need to define some jargon.

Time complexity

The time complexity of an algorithm, $T(n)$, is a function that denotes the running time of the algorithm as a function of the size of the input. For example, consider an algorithm that adds numbers one by one. Suppose you are smart enough to add any two arbitrary numbers in 1 second. How long will it take for 100 data items? It will take 100 seconds. In this case, $T(n) = n$.
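As a sketch of this counting argument (the function name and cost model are illustrative, not from the text), we can count one "step" per addition and see that the step count grows linearly with the input size:

```python
def add_all(data):
    """Sum the input, counting one step per addition performed."""
    total = 0
    steps = 0
    for x in data:
        total += x   # one addition per element
        steps += 1
    return total, steps

total, steps = add_all(range(100))
# steps == 100 for 100 inputs, matching T(n) = n
```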

Big O notation

Big O notation is a method to guarantee an upper bound on an algorithm's performance. For any time complexity $T(n)$, we say $T(n) = O(g(n))$ if there are some positive constants $M$, $n_0$ such that $T(n) \le M g(n)$ for all $n \ge n_0$. For example, if $T(n) = n^3 + 10n^2 + n + 27$, then $T(n) = O(n^3)$ with $M = 39$, $n_0 = 1$. Notice that $T(n) = n^3 + 10n^2 + n + 27 \le n^3 + 10n^3 + n^3 + 27n^3 = 39n^3$ for $n \ge 1$. In practice, it is enough to find the fastest-growing term in $T(n)$.
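The constants from the example can be sanity-checked numerically; the following sketch verifies $T(n) \le 39 n^3$ over a range of inputs with $n \ge 1$:

```python
def T(n):
    """The example time complexity from the text."""
    return n**3 + 10*n**2 + n + 27

# Check T(n) <= M * n^3 with M = 39, n0 = 1 over a sample range.
assert all(T(n) <= 39 * n**3 for n in range(1, 1000))
```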

Asymptotic approximation

In fact, the bound chosen in big O notation can be worse than necessary. For example, $T(n) = O(n^2)$ whenever $T(n) = O(n)$. Therefore, we say that "an algorithm's time complexity asymptotically approximates $g(n)$" when $\lim\limits_{n \to \infty}\frac{T(n)}{Mg(n)} = 1$ for some positive constant $M$.
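For the earlier example $T(n) = n^3 + 10n^2 + n + 27$, the ratio $T(n) / n^3$ tends to $1$, so $T(n)$ asymptotically approximates $n^3$ with $M = 1$. A quick numerical sketch:

```python
def T(n):
    return n**3 + 10*n**2 + n + 27

# T(n) / (1 * n^3) approaches 1 as n grows.
ratios = [T(n) / n**3 for n in (10, 100, 1000, 10000)]
# The ratios shrink toward 1 as n increases.
```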

Big $\Omega$ notation

Big $\Omega$ notation is the opposite of big O notation: it guarantees a lower bound on an algorithm's performance. For any time complexity $T(n)$, we say $T(n) = \Omega(g(n))$ if there are some positive constants $M$, $n_0$ such that $T(n) \ge M g(n)$ for all $n \ge n_0$.
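Reusing the earlier example, $T(n) = n^3 + 10n^2 + n + 27$ is also $\Omega(n^3)$, this time with $M = 1$, $n_0 = 1$, since every extra term is nonnegative. A minimal check:

```python
def T(n):
    return n**3 + 10*n**2 + n + 27

# With M = 1, n0 = 1: T(n) >= 1 * n^3 for all n >= 1,
# so T(n) = Omega(n^3) as well as O(n^3).
assert all(T(n) >= n**3 for n in range(1, 1000))
```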

Big $\Theta$ notation

Big theta notation is used when an algorithm has the same complexity for both its upper and lower bounds. In other words, we say $T(n) = \Theta(g(n))$ if there are some positive constants $M_1$, $M_2$, $n_0$ such that $M_2 g(n) \le T(n) \le M_1 g(n)$ for all $n \ge n_0$. However, not every algorithm has such a complexity. Some algorithms' performance varies with the input data itself. Therefore, some algorithms have no big $\Theta$ bound on their time complexity.
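Linear search is a common illustration of input-dependent performance (this example is mine, not from the text): on inputs of the same size $n$, it may finish after 1 comparison or after $n$ comparisons, so its running time over all inputs is $O(n)$ and $\Omega(1)$ but not $\Theta$ of any single function:

```python
def linear_search(data, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, x in enumerate(data):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = list(range(1000))
_, best = linear_search(data, 0)     # target at the front: 1 comparison
_, worst = linear_search(data, 999)  # target at the back: 1000 comparisons
```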

Amortized complexity

Amortized complexity is another way to analyze an algorithm. Many algorithms don't have a tight $\Theta$ bound to describe their performance; it really depends on the input. However, using only big O notation can be too pessimistic. Therefore, amortized complexity measures an algorithm's performance by averaging its cost over a sequence of executions. Suppose an algorithm takes $O(1)$ time for $n$ executions and $O(n^2)$ time for $1$ execution. Then the amortized complexity of this algorithm is $O(\frac{1 \times n + n^2}{n + 1}) = O(n)$.
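The averaging in that example can be written out directly; this sketch uses the text's hypothetical cost model ($n$ executions costing 1 step each plus one execution costing $n^2$ steps), not a real data structure:

```python
def amortized_cost(n):
    """Average cost per execution under the text's cost model."""
    cheap = n * 1        # n executions at cost 1 each
    expensive = n ** 2   # one execution at cost n^2
    total = cheap + expensive
    return total / (n + 1)

# (n + n^2) / (n + 1) = n * (1 + n) / (n + 1) = n, i.e. O(n) amortized
```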

Space complexity

Space complexity is another measure for algorithms. It denotes how much memory an algorithm uses. All of the notations above are used to represent space complexity as well.
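As an illustration (my example, not from the text), two ways of reversing a list do the same work in $O(n)$ time but differ in space: one allocates a second list, the other swaps in place.

```python
def reversed_copy(data):
    """Allocates a new list of n elements: O(n) extra space."""
    return list(reversed(data))

def reverse_in_place(data):
    """Swaps elements pairwise: O(1) extra space."""
    i, j = 0, len(data) - 1
    while i < j:
        data[i], data[j] = data[j], data[i]
        i, j = i + 1, j - 1
    return data
```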