When analyzing the complexity of an algorithm in terms of time and space, we can rarely give an exact number for the time or space it requires. Instead, we express the complexity using a standard notation called asymptotic notation.

**Asymptotic notation of an algorithm is a mathematical representation of its complexity.**

**Note:**

In asymptotic notation, we keep only the most significant terms in an algorithm's complexity and ignore the less significant ones.

**For example, consider the complexities of the following two algorithms:**

- Algorithm 1: 5n^2 + 2n +1
- Algorithm 2: 10n^2 + 8n + 3

Generally, when we analyze an algorithm, we consider its complexity for large input sizes (i.e. large values of ‘n’). In the example above, (2n + 1) and (8n + 3) are less significant than (5n^2) and (10n^2), respectively.

Here, for large values of ‘n’, the most significant term dominates the least significant terms. So, for large values of ‘n’, we ignore the least significant terms when representing the overall time required by an algorithm.
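To see this dominance concretely, the following sketch (using the first polynomial from the example above) compares the full cost expression with just its dominant term; the function names are illustrative, not from any particular library:

```python
# Illustrative sketch: as n grows, the lower-order terms (2n + 1)
# contribute a vanishing fraction of the total cost 5n^2 + 2n + 1.

def full_cost(n):
    """Full cost expression from the example: 5n^2 + 2n + 1."""
    return 5 * n**2 + 2 * n + 1

def dominant_term(n):
    """Only the most significant term: 5n^2."""
    return 5 * n**2

for n in [10, 1_000, 100_000]:
    ratio = full_cost(n) / dominant_term(n)
    print(f"n = {n:>7}: full / dominant = {ratio:.6f}")
```

As `n` grows, the ratio approaches 1, which is why the lower-order terms are dropped in asymptotic notation.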

**Types of asymptotic notations:**

There are three main types of asymptotic notation. They are as follows:

- Big-O (O)
- Big-Omega (Ω)
- Big-Theta (Θ)

**Upper Bound - Big-O Notation (O):**

- This notation is used to represent the upper bound of an algorithm in terms of time complexity.
- It represents the maximum amount of time an algorithm can take over all inputs of a given size, i.e. the worst case of the algorithm's time complexity.

**Example:**

Consider an integer array on which we perform a linear search. There are three scenarios:

- If the element to be searched is the last element in the array, the time complexity of the algorithm is n, where ‘n’ is the size of the array.
- If the element to be searched is the very first element in the array, the time complexity is 1.
- If the element to be searched is somewhere in between, the complexity is about ‘n/2’ on average.

Each of these cases has its own notation: the worst case is described by the **upper bound (Big-O)**, the best case by the **lower bound (Big-Omega)**, and the average case by the **tight bound (Big-Theta)**, respectively.

**Lower Bound - Big-Omega Notation (Ω):**

- This notation is used to define the best case of an algorithm, i.e. its lower bound.
- It indicates the minimum time required by an algorithm for any input of a given size.
- In simple words, it means the algorithm will take at least this much time to complete its execution.
- In the example above, the best case occurs when the element to be searched is the very first element; the time complexity is then Ω(1).

**Tight Bound - Big-Theta Notation (Θ):**

This represents the average case, i.e. the range within which the actual execution time of an algorithm will lie.

For example, if the time complexity of some algorithm is represented by the expression 3n^2 + 5n, then in Theta notation the time complexity is Θ(n^2): we ignore the constant coefficient and drop the insignificant part, which is 5n.

Here, in the example above, a complexity of Θ(n^2) means that for any sufficiently large input `n`, the actual running time will stay between `k1 * n^2` and `k2 * n^2`, where k1 and k2 are two positive constants, thereby tightly binding the expression representing the growth of the algorithm.
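A quick numerical check of the tight bound is possible: for the expression 3n^2 + 5n above, the constants k1 = 3, k2 = 4 work once n ≥ 5 (these particular constants are an assumption chosen for illustration; any valid pair would do):

```python
# Sketch: verify k1 * n^2 <= 3n^2 + 5n <= k2 * n^2 for n >= n0.
# k1 = 3, k2 = 4, n0 = 5 are illustrative choices: 3n^2 + 5n <= 4n^2
# holds exactly when 5n <= n^2, i.e. when n >= 5.

def cost(n):
    return 3 * n**2 + 5 * n

k1, k2, n0 = 3, 4, 5

for n in range(n0, 1000):
    assert k1 * n**2 <= cost(n) <= k2 * n**2

print("Theta bound holds for n in [5, 1000)")
```

This is exactly what Θ(n^2) asserts: the running time is sandwiched between two constant multiples of n^2 from some point onward.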

Note:

Although there are three notations for representing complexity, we generally use Big-O notation, i.e. the upper bound (worst case). Stating a bound of O(n) means the running time will never exceed ‘n’; it can only be less than or equal to ‘n’, which is always a correct statement.
