__Complexity:__

The term complexity is defined as the **time or space requirement** of an algorithm to execute completely, expressed in terms of the input size.

**TIME** and **SPACE** are the two major measures of an algorithm's efficiency.

TIME: Measured by counting the number of **key operations**.

SPACE: Measured by counting the **maximum amount of memory** needed by the algorithm.

__TIME-SPACE TRADEOFF:__

By increasing the amount of space for storing the data, one may be able to reduce the time needed for processing the data, or vice versa.

__Cases of Complexity:__

- **Worst Case:** The maximum value of the function (here, the algorithm's running time) over all possible inputs.
- **Average Case:** The expected or average value of the function.
- **Best Case:** The minimum possible value of the function.

Standard Functions of Complexity:

- log_{2}n
- n
- nlogn
- n^{2}
- n^{3}
- 2^{n}

(Arranged in increasing order)

__Asymptotic Notations:__

These are mathematical notations that **allow us to analyse an algorithm's running time** by describing its behaviour as the input size increases. This behaviour is **also known as the algorithm's growth rate.**

__BIG-OH (O) Notation:__

Big-oh defines the **upper bound** of a particular function (here, an algorithm's running time).

**FORMULA:** f(n) = O(g(n)) if f(n) <= C g(n) for n >= n_{0}; C and n_{0} are constants.

ALSO CALLED THE **WORST CASE** OF AN ALGORITHM

__Big-Omega (Ω) Notation:__

Big-omega defines the **lower bound** of a particular function (here, an algorithm's running time).

**FORMULA:** f(n) = Ω(g(n)) if f(n) >= C g(n) for n >= n_{0}; C and n_{0} are constants.

ALSO CALLED THE **BEST CASE** OF AN ALGORITHM

__Big-Theta (Ө) Notation:__

Big-theta defines the **tight bound** of a particular function (here, an algorithm's running time).

**FORMULA:** f(n) = Ө(g(n)) if C_{1} g(n) <= f(n) <= C_{2} g(n) for n >= n_{0}; C_{1}, C_{2} and n_{0} are constants.

f(n) = Ө(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n))

ALSO CALLED THE **AVERAGE CASE** OF AN ALGORITHM

__Little-Oh (o) Notation:__

Little-oh defines the **non-tight analogue** of **Big-oh**.

**FORMULA:** f(n) = o(g(n)) if, for every constant c > 0, there exists n_{0} such that f(n) <= c g(n) for n >= n_{0}

Used for comparisons of running times.

**PROBLEM FOR FINDING COMPLEXITY:**

**Problem 1:**

```c
function()
{
    if (condition)
    {
        for (i = 0; i < n; i++)
        {
            // simple statements
        }
    }
    else
    {
        for (j = 1; j < n; j++)
            for (k = n; k > 0; k--)
            {
                // simple statements
            }
    }
}
```

**Complexity:**

- Check the first **for** loop. Its counter increases serially (i = 0, 1, 2, 3, …), so it runs n times; write **n** for that loop.
- Check the other **for** loop in the else branch. It also increases serially, so it contributes **n**.
- The third loop also makes **n** iterations, but since it is nested inside the second loop, the pair together contributes **n^{2}**.

**THEREFORE COMPLEXITY:**

**n + n^{2} (considering only the highest power of n).**

**ANSWER: n^{2}**

Author: Udipto Goswami