An algorithm is a step-by-step procedure for solving a problem in a finite amount of time.

An algorithm is considered more efficient if it has a smaller running time and occupies less memory. The running time of an algorithm typically grows with the input size. The most common notation for describing time complexity is Big O notation.

Big O Notation:

**Big O notation describes an algorithm's running time as O(f(N)), where f(N) expresses how the running time grows with the input size N.**

Big O notation usually describes the worst-case running time of an algorithm. It removes all constant factors, so that the running time can be estimated in relation to N.

**1. O(1) - constant time**

O(1) describes an algorithm that always executes in the same time (or space) regardless of the size of the input data set. The algorithm requires the same fixed number of steps no matter how large the task is.

In an algorithm, a single

**statement;**

is constant: its running time does not change in relation to N.

e.g.:

```
bool IsFirstElementNull(String[] strings)
{
    if (strings[0] == null)
    {
        return true;
    }
    return false;
}
```

Examples

A. Push and Pop operations for a stack (containing n elements);

B. Insert and Remove operations for a queue.
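As an illustrative sketch (not from the original post), here is a minimal array-backed stack in C; push and pop each perform a fixed number of steps regardless of how many elements the stack holds, which is what makes them O(1):

```c
#include <assert.h>

#define CAPACITY 100

/* Minimal array-backed stack. Both push and pop touch a fixed number
   of memory locations, independent of the number of stored elements,
   so each operation runs in O(1) time. */
typedef struct {
    int data[CAPACITY];
    int top;            /* index of the next free slot */
} Stack;

void push(Stack *s, int value) {
    assert(s->top < CAPACITY);  /* caller must respect capacity */
    s->data[s->top++] = value;
}

int pop(Stack *s) {
    assert(s->top > 0);         /* caller must not pop an empty stack */
    return s->data[--s->top];
}
```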

**2. O(N) - linear time**

O(N) describes an algorithm whose performance will grow linearly and in direct proportion to the size of the input data set.

```
for ( i = 0; i < N; i++ )
    statement;
```

The running time of the loop is directly proportional to N. Big O notation always assumes the upper limit, where the algorithm performs the maximum number of iterations.

Examples

A. Traversal of a list (a linked list or an array) with n elements;

B. Finding the maximum or minimum element in a list, or sequential search in an unsorted list of n elements;

C. Traversal of a tree with n nodes;

D. Calculating iteratively n-factorial; finding iteratively the nth Fibonacci number.
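As a sketch of example B, finding the maximum element can be written in C as follows (`find_max` is a hypothetical helper, not from the original post); it inspects every element exactly once, so the work grows linearly with n:

```c
/* Finding the maximum requires examining each of the n elements
   exactly once, so the running time is O(N). */
int find_max(const int *a, int n) {
    int max = a[0];
    for (int i = 1; i < n; i++) {
        if (a[i] > max)
            max = a[i];
    }
    return max;
}
```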

**3. O(n²) - quadratic time**

O(n²) represents an algorithm whose performance is directly proportional to the square of the size of the input data set.

```
for ( i = 0; i < N; i++ ) {
    for ( j = 0; j < N; j++ )
        statement;
}
```

The running time of the two loops is proportional to the square of N. When N doubles, the running time increases fourfold.

Examples:

A. Some more simplistic sorting algorithms, for instance a selection sort of n elements;

B. Comparing two two-dimensional arrays of size n by n;

C. Finding duplicates in an unsorted list of n elements (implemented with two nested loops).
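As a sketch of example C, a duplicate check with two nested loops can be written in C (`has_duplicates` is a hypothetical helper, not from the original post); it compares roughly n·(n-1)/2 pairs of elements, hence O(n²):

```c
#include <stdbool.h>

/* Checking every pair (i, j) with i < j for equality examines about
   n*(n-1)/2 pairs, so the running time grows as O(n^2). */
bool has_duplicates(const int *a, int n) {
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            if (a[i] == a[j])
                return true;
        }
    }
    return false;
}
```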

**4. O(log n) - logarithmic time**

Binary search is a technique used to search sorted data sets. It works by selecting the middle element of the data set, essentially the median, and compares it against a target value. If the values match it will return success. If the target value is higher than the value of the probe element it will take the upper half of the data set and perform the same operation against it. Likewise, if the target value is lower than the value of the probe element it will perform the operation against the lower half. It will continue to halve the data set with each iteration until the value has been found or until it can no longer split the data set.

```
while ( low <= high ) {
    mid = ( low + high ) / 2;
    if ( target < list[mid] )
        high = mid - 1;
    else if ( target > list[mid] )
        low = mid + 1;
    else
        break;
}
```

The running time of the algorithm is proportional to the number of times N can be divided by 2. This is because the algorithm divides the working area in half with each iteration.

Examples:

A. Binary search in a sorted list of n elements;

B. Insert and Find operations for a binary search tree with n nodes;

C. Insert and Remove operations for a heap with n nodes.
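The binary search loop above can be filled out into a complete, runnable C function; the surrounding declarations are assumptions, since the post shows only the loop itself:

```c
/* Returns the index of target in the sorted array list[0..n-1],
   or -1 if it is absent. Each iteration halves the search range,
   so at most about log2(n) iterations run: O(log n). */
int binary_search(const int *list, int n, int target) {
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;  /* avoids overflow of low + high */
        if (target < list[mid])
            high = mid - 1;
        else if (target > list[mid])
            low = mid + 1;
        else
            return mid;
    }
    return -1;
}
```

Note that computing the midpoint as `low + (high - low) / 2` sidesteps the integer overflow that `(low + high) / 2` can cause on very large arrays.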

**5. O(n log n) - "n log n" time**

```
void quicksort ( int list[], int left, int right )
{
    if ( left < right ) {  /* base case: 0 or 1 elements, nothing to sort */
        int pivot = partition ( list, left, right );
        quicksort ( list, left, pivot - 1 );
        quicksort ( list, pivot + 1, right );
    }
}
```

The recursion is about log n levels deep, and each level does a linear amount of partitioning work, so the algorithm is a combination of linear and logarithmic behavior: O(n log n).

Examples:

A. More advanced sorting algorithms - quicksort, mergesort
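The quicksort above leaves `partition` undefined; a complete runnable sketch in C, assuming a simple Lomuto partition scheme (the post does not specify which partition scheme is used):

```c
static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Lomuto partition: uses the last element as the pivot, moves smaller
   elements to the left, and returns the pivot's final position. */
static int partition(int list[], int left, int right) {
    int pivot = list[right];
    int i = left;
    for (int j = left; j < right; j++) {
        if (list[j] < pivot)
            swap(&list[i++], &list[j]);
    }
    swap(&list[i], &list[right]);
    return i;
}

/* On average the recursion is log n levels deep and each level does
   O(n) partitioning work, giving O(n log n) overall. */
void quicksort(int list[], int left, int right) {
    if (left < right) {  /* base case: 0 or 1 elements */
        int pivot = partition(list, left, right);
        quicksort(list, left, pivot - 1);
        quicksort(list, pivot + 1, right);
    }
}
```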

**6. O(aⁿ) (a > 1) - exponential time**

If a = 2, O(2ⁿ) denotes an algorithm whose running time doubles with each additional element in the input data set. The execution time of an O(2ⁿ) function quickly becomes very large.

Examples:

A. Recursive Fibonacci implementation

B. Towers of Hanoi

C. Generating all permutations of n symbols
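The naive recursive Fibonacci from example A can be sketched in C; each call spawns two further calls, so the number of calls grows exponentially with n:

```c
/* Naive recursive Fibonacci. Each call makes two more recursive calls,
   so the call count roughly doubles as n grows by one -- exponential
   running time (O(2^n) as an upper bound). A simple iterative loop
   computes the same value in O(n). */
long fib(int n) {
    if (n < 2)
        return n;
    return fib(n - 1) + fib(n - 2);
}
```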

Order of asymptotic growth of the functions:

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(aⁿ)

**Big-O when a function is the sum of several terms:**

If a function (which describes the order of growth of an algorithm) is a sum of several terms, its order of growth is determined by the fastest-growing term. In particular, for a polynomial

p(n) = aₖnᵏ + aₖ₋₁nᵏ⁻¹ + … + a₁n + a₀

the growth is of order nᵏ:

p(n) = O(nᵏ)

For example, 3n² + 10n + 5 = O(n²): the n² term dominates for large n.

In general, doing something with every item in one dimension is linear, doing something with every item in two dimensions is quadratic, and dividing the working area in half at each step is logarithmic. There are other Big O measures, such as cubic, exponential, and square root, but they are not nearly as common. The best time in the list above is obviously constant time, and the worst is exponential time, which, as we have seen, quickly overwhelms even the fastest computers, even for relatively small n. Polynomial growth (linear, quadratic, cubic, etc.) is considered manageable compared with exponential growth.
