Every algorithm takes some amount of time to execute its instructions and complete a task. This computer time required is called **time complexity**.

For a particular problem there can be multiple solutions; which one is chosen depends on the time and space each algorithm requires. The algorithm that needs the least time and space is preferred.

Let us take a simple example: finding the square of a number. The two most common solutions are:

**Solution-1:**

Run a loop n times, adding n to a running total on each iteration.

**Algorithm:**

Find the square of the number **‘n’**

```
sum = 0;
for (i = 1 to n)
    sum = sum + n;
return sum;
```

**Solution-2:**

Use the mathematical operator * to find the square.

**Algorithm:**

Find the square of the number **‘n’**

`return n * n;`

Thus the first approach runs the loop ‘n’ times, whereas the second approach takes just a single step. The second approach is better, as it takes less time for any input.

**Time complexity quantifies the amount of time taken by an algorithm to run as a function of the length of the input.**

It is most commonly estimated by counting the number of elementary steps performed by an algorithm. In the example above, the first algorithm runs the loop ‘n’ times, so its time complexity is at least ‘n’, and as the value of ‘n’ increases the time taken increases with it. In the second approach the time complexity is constant: it does not depend on the value of ‘n’ and always computes the result in a single step.

The time complexity of an algorithm is most commonly expressed using big-O notation, an asymptotic notation for describing growth rates. More details are given in the section on asymptotic notation.

Since an algorithm’s performance may vary with different input data, we usually use the worst-case time complexity, as it bounds the maximum amount of time the algorithm can take.

**Calculating Time Complexity:**

To better understand this topic, we will look at some examples in this article and as and when we progress through the data structures topics.

The most common asymptotic notation for expressing time complexity is Big-O notation. It drops all constant factors so that the running time can be estimated in relation to N.

**Example-1:**

```
for (i = 0; i < N; i++)
{
    statement;
}
```

For the above example, the running time of the algorithm is directly proportional to the number of times the loop runs, that is, the value of N. Since it depends linearly on N, the time complexity is O(N).

**Example-2:**

```
for (i = 0; i < N; i++)
{
    for (j = 0; j < N; j++)
    {
        statement;
    }
}
```

In the above example, the running time has a quadratic relation to ‘N’: the two nested loops execute the statement N * N times. When N doubles, the running time increases fourfold. Hence the time complexity is O(N^2).

**Example-3(Binary Search):**

```
while (low <= high)
{
    mid = (low + high) / 2;
    if (search_element == arr[mid])
        return arr[mid];
    else if (search_element < arr[mid])
        high = mid - 1;
    else
        low = mid + 1;
}
```

This algorithm is based on the ‘Divide and Conquer’ technique: each iteration discards half of the remaining input and continues the search in the other half. Such an algorithm has logarithmic time complexity, O(log N).

In the best case, the search element may be found in the very first iteration, so the time complexity would be O(1).

**Example-4:(Quick Sort):**

```
void quicksort(int arr[], int left, int right)
{
    if (left < right)
    {
        int pivot = partition(arr, left, right);
        quicksort(arr, left, pivot - 1);
        quicksort(arr, pivot + 1, right);
    }
}
```

In quick sort we again divide the input array in half each time, but all ‘N’ elements are processed at each level of the division. Hence the time complexity is O(N log N): the running time consists of N operations repeated over log N levels of recursion, a combination of linear and logarithmic.

**Relevant articles:**

- Introduction: Data structure and algorithm
- Algorithmic Analysis
- Space Complexity
- Time Complexity
- Asymptotic Notation
