Ω notation provides an asymptotic lower bound on a function. A function must satisfy the condition below in order to be in Ω(g(n)). Ω(g(n)) bounds the running time of an algorithm from below, within a constant factor, and is therefore used for best-case analysis.

Ω(g(n)) = {f(n): there exist positive constants c and n_0 such that 0 <= c * g(n) <= f(n) for all n >= n_0}. It is used to bound the best-case running time of an algorithm; g(n) is an asymptotic lower bound for f(n).

Example 1

- Prove that if f(n) = 15n^3 + n^2 + 4, then f(n) belongs to Ω(n^3).
- For omega notation, c * g(n) should be less than or equal to f(n) for all n >= n_0.
- Here f(n) = 15n^3 + n^2 + 4 and g(n) = n^3.
- Let c = 15.
- 15n^3 <= 15n^3 + n^2 + 4 holds for all n >= 1, so n_0 = 1.
- Thus f(n) belongs to the Ω(n^3) class.
- Ω is the lower bound.
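The inequality above can be spot-checked numerically; this is evidence over a finite range (the limit 1000 is an arbitrary choice), not a proof:

```python
def f(n):
    # f(n) = 15n^3 + n^2 + 4 from the example
    return 15 * n**3 + n**2 + 4

def lower_bound(n, c=15):
    # c * g(n) with g(n) = n^3 and the witness constant c = 15
    return c * n**3

# The defining inequality 0 <= c*g(n) <= f(n) holds from n0 = 1 onward.
assert all(0 <= lower_bound(n) <= f(n) for n in range(1, 1000))
```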

Example 2

- Show that 2n + 7 is Ω(n).
- f(n) = 2n + 7
- g(n) = n
- We need c * g(n) <= f(n) for all n >= n_0.
- Let c = 2: c * g(n) = 2n <= 2n + 7 holds for every n >= 1.
- Thus for c = 2 and n_0 = 1, f(n) belongs to Ω(n).
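The claim can be spot-checked the same way; here g(n) = n with the witnesses c = 2 and n_0 = 1 (one valid choice; the range limit is arbitrary):

```python
def f(n):
    # f(n) = 2n + 7
    return 2 * n + 7

# With g(n) = n and c = 2: c*g(n) = 2n <= 2n + 7 for every n >= 1,
# which is exactly the Omega(n) condition with n0 = 1.
assert all(2 * n <= f(n) for n in range(1, 1000))
```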

- Big-O gives the **rate of growth** of the step-count function **f(n)** in terms of a simple function **g(n)** and defines the **asymptotic** upper bound on the function (Fig 1).
- To simplify the estimate of running time, **constants** and **lower-order terms** are ignored.
- A function must satisfy the following condition to be in the Big-O class: given functions **f(n) and g(n)**, we say **f(n)** is **O(g(n))** if and only if **f(n) <= c * g(n)** for all **n >= n_0**, where **c** and **n_0** are **positive** constants.
- The function **n^2** does not belong to **O(n)**; rather it belongs to **O(n^2)**, as it fails the inequality **f(n) <= c * g(n)**: **n^2** cannot stay below **c * n** for all n when **c** is a constant.
- **O(...)** is the **worst case** (upper bound) **notation** for an algorithm's **complexity**.
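The definition translates directly into a small checker. This is a sketch with names of my own choosing; a finite scan can refute a witness pair but can never prove one:

```python
def satisfies_big_o(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all n in [n0, n_max) -- finite evidence only."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

# n^2 is not O(n): no constant works (here we only refute c = 10).
assert not satisfies_big_o(lambda n: n**2, lambda n: n, c=10, n0=1)
# 3n + 2 is O(n) with witnesses c = 4, n0 = 2.
assert satisfies_big_o(lambda n: 3 * n + 2, lambda n: n, c=4, n0=2)
```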

**Example 1**

- Consider T(n) = **3n + 2**, so f(n) = **3n + 2** and **g(n) = n**.
- We need to find the first value of c for which the inequality f(n) <= c * g(n) can hold.
- Let **c = 1**: 3n + 2 <= n is false for every positive n, so we cannot choose c = 1.
- Let **c = 2**: 3n + 2 <= 2n is again false, so we cannot choose c = 2.
- Let **c = 3**: 3n + 2 <= 3n is false, so we cannot choose c = 3 either.
- Let **c = 4**: 3n + 2 <= 4n holds for **n >= 2**, so **n_0 = 2** and **c = 4**. This is the point where f(n) and c * g(n) meet.
- Hence **f(n) <= c * g(n)**, and f(n) belongs to the **Big-O** class.
- The highest power of n is 1, so **f(n)** belongs to **O(n)**.
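The search for a workable constant can be spot-checked numerically (finite evidence only; range limits are arbitrary):

```python
f = lambda n: 3 * n + 2   # T(n) = 3n + 2

# c = 3 never works: 3n + 2 <= 3n is false for every positive n.
assert not any(f(n) <= 3 * n for n in range(1, 100))
# c = 4 works from n0 = 2 onward: 3n + 2 <= 4n  <=>  n >= 2.
assert f(1) > 4 * 1                       # still fails below n0
assert all(f(n) <= 4 * n for n in range(2, 1000))
```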

**Example 2**

- Consider T(n) = **2n^2 + 4**, so f(n) = **2n^2 + 4** and **g(n) = n^2**.
- We need 2n^2 + 4 <= c * n^2.
- The inequality is false for c = 1 and c = 2.
- For c = 3: 2n^2 + 4 <= 3n^2 gives 4 <= n^2.
- This yields n = 2 and n = -2; the negative value is neglected.
- So **n >= 2**, i.e. **n_0 = 2** and **c = 3**.
- Thus **f(n) belongs to O(n^2)**.
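A numeric spot-check of the derivation (the range limit is arbitrary):

```python
f = lambda n: 2 * n**2 + 4   # T(n) = 2n^2 + 4

# 2n^2 + 4 <= 3n^2 reduces to 4 <= n^2, i.e. n >= 2, so n0 = 2.
assert f(1) > 3 * 1**2       # inequality still fails at n = 1
assert all(f(n) <= 3 * n**2 for n in range(2, 1000))
```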

**Example 3**

- T(n) = **3n^3 + 20n^2 + 5**.
- We need 3n^3 + 20n^2 + 5 <= c * n^3.
- This is true for **c = 4** and **n_0 = 21**.
- Thus **3n^3 + 20n^2 + 5** belongs to **O(n^3)**.
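The claimed witnesses can be spot-checked; note the inequality still fails at n = 20, which is why n_0 = 21:

```python
f = lambda n: 3 * n**3 + 20 * n**2 + 5

# With c = 4, the inequality reduces to 20n^2 + 5 <= n^3,
# which first holds at n = 21, so n0 = 21.
assert f(20) > 4 * 20**3     # 32005 > 32000: fails at n = 20
assert all(f(n) <= 4 * n**3 for n in range(21, 1000))
```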

**Example 4**

- T(n) = 3 log n + 5
- g(n) = log n
- f(n) = 3 log n + 5
- We need f(n) <= c * g(n) for **c > 0** and **n >= n_0**.
- 3 log n + 5 <= c * log n holds for **c = 8** and **n_0 = 2**.
- Thus **3 log n + 5** belongs to **O(log n)**.
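A spot-check, assuming base-2 logarithms (the text does not fix the base):

```python
import math

f = lambda n: 3 * math.log2(n) + 5   # T(n) = 3 log n + 5

# 3 log n + 5 <= 8 log n reduces to 5 <= 5 log n, i.e. log n >= 1,
# which holds for all n >= 2, so c = 8 and n0 = 2.
assert all(f(n) <= 8 * math.log2(n) for n in range(2, 1000))
```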

The various Big-O complexities are given in Fig 2 and Fig 3.


Before defining the term complexity, let us discuss a few real-life scenarios. Take the example of a railway reservation counter. People go there to book their tickets. The time to book a ticket depends on how many service windows are available, the queue size, and the time taken by each representative. A person will either join a lane that is very short or stand in a queue where the representative is very fast.

Take another example: the seating plan for 40 students. The first thing to consider is the number of chairs needed, and the second is the seating order. If the students sit roll-number-wise, it is easy for the teacher to take attendance and the time required is smaller. Thus every process in the real world depends on how much **time** it takes to execute and how much **space** it consumes. **Complexity** is defined as the cost of executing a **process**, and it depends on **space** as well as **time**.

It is of two types:

- Space Complexity
- Time Complexity

**Basic Operation**

Basic operation means the main operation that is **required** to solve a particular problem. For example, the basic operation in searching is comparison. The complexity depends on the basic operation.

The **cost** is usually determined from the running time, i.e. the **time complexity**, and it depends on the following factors:

- Size of Input Data
- Hardware
- Operating System
- Programming Language used

The amount of computer **memory** required to solve a given problem of a particular **size** is called the **space complexity**. Space complexity depends on two components:

- **Fixed Part** – needed for instruction space (i.e. byte code), variable space, constant space, etc.
- **Variable Part** – depends on the instance of the input and output data.
- **Space (S) = Fixed Part + Variable Part**

**Example 1**

*Algorithm sum(x,y)*

*{*

*return x+y;*

*}*

**Two computer words** will be required to store the variables **x and y**, so

**Space Complexity (S) = 2.**

**Example 2**

*Algorithm sum(a[], n)*

*{*

*sum = 0;*

*for(i = 0; i < n; i++)*

*{*

*sum = sum + a[i];*

*}*

*return sum;*

*}*

Space complexity is calculated as follows:

- **One computer word** is required to store **n**.
- **n space** is required to store the **array a[]**.
- **One space** is needed for the variable **sum** and **one** for **i**.
- **Total Space (S) = 1 + n + 1 + 1 = n + 3**
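A Python rendering of the algorithm above (`total` stands in for `sum`, which is a Python built-in name):

```python
def array_sum(a, n):
    # Space: one word for n, n words for a[], one for total, one for i -> S = n + 3.
    total = 0
    for i in range(n):        # i indexes the n array cells
        total = total + a[i]
    return total
```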

The **time** required to solve a given problem of a particular **size** is known as the time complexity. It depends on two components:

- **Fixed Part** – compile time.
- **Variable Part** – run time, dependent on the problem instance. Usually only the run time is considered and the compile time is ignored.

- One approach is to use a **stop watch**; the time is obtained in **seconds or milliseconds**.
- **Step Count** – count the number of program steps. **Comments** are not evaluated, so they are **not** counted as program **steps**.
- In a **while** loop, the **steps** equal the **number** of times the loop body is **executed**.
- In a **for** loop, the steps equal the number of times the condition **expression** is checked.
- A single expression is counted as a **single** step. For example, **a+f+r+w+w/q-d-f** is **one step**.

**Rate of Growth** (Asymptotic Notations)

**Example of step count**

*Algorithm sum(a[], 5)*

*{*

*sum = 0;*

*for(i = 0; i < 5; i++)*

*{*

*sum = sum + a[i];*

*}*

*return sum;*

*}*
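The step-count idea can be made concrete by instrumenting the loop. This sketch uses counting choices of my own (each assignment and each condition check is one step; the return is not counted), so the exact tally is illustrative rather than the text's:

```python
def array_sum_steps(a):
    steps = 0
    total = 0; steps += 1                    # sum = 0            -> 1 step
    n = len(a)
    for i in range(n + 1):                   # condition checked n + 1 times
        steps += 1
        if i < n:
            total = total + a[i]; steps += 1  # body executes n times
    return total, steps                      # steps = 1 + (n + 1) + n = 2n + 2
```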

Step count is a **difficult** approach when we need to **compare** results. For example, suppose we use two techniques to solve one problem, **T1(n)** is the running time of the first technique and **T2(n)** of the second, with **T1(n) = n + 1** and **T2(n) = n^2 + 1**. Comparing raw step-count expressions like these is awkward, so for comparisons asymptotic notations are used.

These notations give the **rate of growth**. Instead of dealing with exact expressions we deal with asymptotic behavior. Asymptotic means to approach a value or curve arbitrarily closely. The focus is on the dominant term of the expression. If we have an expression like **T(n) = 3n^3 + 5**, the constant 5 and the coefficient 3 are ignored and the growth is characterized by n^3.

- Big-O (O) – refer to Big-O notation
- Big-Theta (Θ)
- Big-Omega (Ω)

- Best Case
- Average Case
- Worst Case

It is the **minimum** time required to solve the given problem of a particular size. For example, in **linear search**, if the element is found at the first location then it is the best-case scenario and the complexity is **O(1)**, as the running time is minimum.

It is the **average** cost and time required for a given problem of a particular size. For example, in **linear search** with n elements, if the item is found at about the n/2-th location, it is the average-case scenario; about n/2 comparisons are needed, which is still **O(n)**.

The **maximum** time required to solve a given problem of a particular size. For example, in linear search the worst case is that the item is at the last location or is not present at all. The complexity is **O(n)**.
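The three cases can all be seen on one linear search. This sketch (the helper name is mine) also returns the number of comparisons, which makes the best, average, and worst cases visible:

```python
def linear_search(arr, key):
    """Return (index, comparisons); index is -1 when key is absent."""
    for i, x in enumerate(arr):
        if x == key:
            return i, i + 1        # found after i + 1 comparisons
    return -1, len(arr)            # item absent: all n comparisons used

data = [7, 3, 9, 1, 5]
# Best case:    key at the first position -> 1 comparison, O(1).
# Average case: key near the middle       -> about n/2 comparisons.
# Worst case:   key absent                -> n comparisons, O(n).
assert linear_search(data, 7) == (0, 1)
assert linear_search(data, 9) == (2, 3)
assert linear_search(data, 4) == (-1, 5)
```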