Complexity

Complexity:

The complexity of an algorithm is the amount of time or space it requires to run to completion, expressed as a function of the input size.

TIME and SPACE are the two major measures of an algorithm's efficiency.

TIME: Measured by counting the number of key operations (for example, comparisons in a sorting algorithm).

SPACE: Measured by counting the maximum amount of memory the algorithm needs.

TIME-SPACE TRADEOFF:

By increasing the amount of space for storing the data, one may be able to reduce the time needed for processing the data, or vice versa.
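A classic instance of this tradeoff is memoization, i.e. caching already-computed results. The C sketch below (my own illustration, not from these notes) compares a naive recursive Fibonacci with a memoized version that spends O(n) extra space to bring the running time down from exponential to linear:

```c
/* Naive Fibonacci: constant extra space, but exponential time
 * (the same subproblems are recomputed over and over). */
long fib_naive(int n) {
    if (n < 2) return n;
    return fib_naive(n - 1) + fib_naive(n - 2);
}

/* Memoized Fibonacci: O(n) extra space for the cache buys O(n) time,
 * because each subproblem is computed at most once. */
long fib_memo(int n) {
    static long cache[93];   /* 0 means "not computed yet"; fib(92) still fits in a 64-bit long */
    if (n < 2) return n;
    if (cache[n] != 0) return cache[n];
    return cache[n] = fib_memo(n - 1) + fib_memo(n - 2);
}
```

Both return the same values; the memoized version simply remembers its answers instead of recomputing them.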

Cases of Complexity:

  1. Worst Case: The maximum running time of the algorithm over all possible inputs of a given size.
  2. Average Case: The expected (average) running time over all inputs of a given size.
  3. Best Case: The minimum running time over all possible inputs of a given size.
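Linear search makes these three cases concrete (an illustrative sketch, not part of the original notes): the key may sit at the front (best case), anywhere on average, or at the end or absent entirely (worst case).

```c
#include <stddef.h>

/* Linear search that also reports how many comparisons it made.
 * Best case:    key at index 0           -> 1 comparison.
 * Worst case:   key at the end or absent -> n comparisons.
 * Average case: key equally likely at any index -> about n/2 comparisons. */
int linear_search(const int *a, size_t n, int key, size_t *comparisons) {
    *comparisons = 0;
    for (size_t i = 0; i < n; i++) {
        (*comparisons)++;
        if (a[i] == key) return (int)i;
    }
    return -1;   /* not found */
}
```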

Standard Functions of Complexity:

  1. log₂ n
  2. n
  3. n log₂ n
  4. n²
  5. n³
  6. 2ⁿ

(Arranged in increasing order of growth)

Asymptotic Notations:

Asymptotic notations are mathematical tools for analysing an algorithm's running time by describing its behaviour as the input size grows. This behaviour is also known as the algorithm's growth rate.

BIG-OH (O) Notation:

Big-oh defines an upper bound on a function (here, an algorithm's running time).

FORMULA: f(n) = O(g(n)) if f(n) <= C g(n) for all n >= n0; C > 0 and n0 are constants.

COMMONLY ASSOCIATED WITH THE WORST CASE OF AN ALGORITHM
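A worked instance of the definition (f chosen purely for illustration): f(n) = 3n + 2 is O(n), witnessed by C = 4 and n0 = 2, because 3n + 2 <= 4n exactly when n >= 2. The helper below spot-checks the inequality over a finite range:

```c
/* Spot-check f(n) = 3n + 2 <= C * n for every n in [n0, limit].
 * A finite check cannot prove the bound for all n, but here the algebra
 * does: 3n + 2 <= 4n  <=>  n >= 2. */
int big_oh_witness_holds(long C, long n0, long limit) {
    for (long n = n0; n <= limit; n++)
        if (3 * n + 2 > C * n) return 0;
    return 1;
}
```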

 

Big-Omega (Ω) Notation:

Big-omega defines a lower bound on a function (here, an algorithm's running time).

FORMULA: f(n) = Ω(g(n)) if f(n) >= C g(n) for all n >= n0; C > 0 and n0 are constants.

COMMONLY ASSOCIATED WITH THE BEST CASE OF AN ALGORITHM

 

Big-Theta (Ө) Notation:

Big-theta defines a tight bound on a function (here, an algorithm's running time).

FORMULA: f(n) = Θ(g(n)) if C1 g(n) <= f(n) <= C2 g(n) for all n >= n0; C1, C2 and n0 are constants.

f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n))

COMMONLY ASSOCIATED WITH THE AVERAGE CASE OF AN ALGORITHM
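A worked instance (again, f chosen only for illustration): f(n) = 2n² + n is Θ(n²), witnessed by C1 = 2, C2 = 3 and n0 = 1, since 2n² <= 2n² + n <= 3n² whenever n <= n². A finite spot-check:

```c
/* Spot-check C1 * n^2 <= 2n^2 + n <= C2 * n^2 with C1 = 2, C2 = 3
 * for every n in [n0, limit]. The upper half needs n <= n^2, i.e. n >= 1. */
int theta_witness_holds(long n0, long limit) {
    for (long n = n0; n <= limit; n++) {
        long f = 2 * n * n + n;
        if (!(2 * n * n <= f && f <= 3 * n * n)) return 0;
    }
    return 1;
}
```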

 

Little-Oh (o) Notation:

Little-oh defines the non-tight analogue of Big-oh: a strict upper bound.

FORMULA: f(n) = o(g(n)) if, for every constant c > 0, there exists n0 such that f(n) < c g(n) for all n >= n0.

Used for comparing running times.

 

A PROBLEM ON FINDING COMPLEXITY:

Problem 1:

function ()
{
    if (condition)
    {
        for (i = 0; i < n; i++)
        {
            // simple statements
        }
    }
    else
    {
        for (j = 1; j < n; j++)
            for (k = n; k > 0; k--)
            {
                // simple statements
            }
    }
}

 

Complexity:

  1. Look at the first for loop (in the if branch). Its counter increases serially (i = 0, 1, 2, 3, ...) up to n, so that loop contributes n.
  2. The second for loop (j = 1 to n-1) also increases serially, so it too contributes n.
  3. The third loop (k = n down to 1) also contributes n, but it is nested inside the second loop, so together they contribute n².

THEREFORE COMPLEXITY:

n + n², keeping only the highest power of n (the two branches never both run, so the worst case is the larger one).

ANSWER: O(n²)
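The count can be double-checked by instrumenting each branch of the pseudocode (a direct transcription, with n as a parameter):

```c
/* The if branch: a single loop over i = 0 .. n-1 -> n simple statements. */
long count_if_branch(long n) {
    long count = 0;
    for (long i = 0; i < n; i++) count++;
    return count;
}

/* The else branch: j = 1 .. n-1 outer iterations, each running the inner
 * loop k = n .. 1 (n iterations) -> (n - 1) * n, i.e. order n^2. */
long count_else_branch(long n) {
    long count = 0;
    for (long j = 1; j < n; j++)
        for (long k = n; k > 0; k--) count++;
    return count;
}
```

At n = 100 the if branch executes 100 statements while the else branch executes 9 900, matching the n versus n² analysis.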

 

Author: Udipto Goswami

 
