Big Omega (Ω) Notation

Ω notation provides an asymptotic lower bound on a function. A function must satisfy certain conditions in order to belong to Ω(g(n)). Informally, if f(n) is in Ω(g(n)), then g(n) bounds f(n) from below within a constant factor, which makes Ω suitable for describing the best-case running time of an algorithm.

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }. It is used to bound the best-case running time of an algorithm; g(n) is an asymptotic lower bound for f(n).
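The definition can be sanity-checked numerically. Below is a minimal Python sketch (the function name and the scan limit are my own choices; a finite scan is only evidence, not a proof):

```python
# Numerically check the Omega condition: 0 <= c*g(n) <= f(n) for all n >= n0.
# A finite scan over [n0, n_max] is a sanity check, not a proof.
def satisfies_omega(f, g, c, n0, n_max=1000):
    return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max + 1))

# Example: f(n) = n^2 + 3n is in Omega(n^2) with c = 1, n0 = 1.
f = lambda n: n**2 + 3 * n
g = lambda n: n**2
print(satisfies_omega(f, g, c=1, n0=1))  # True
```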



Example 1

  • Prove that if f(n) = 15n³ + n² + 1, then f(n) belongs to Ω(n³).
  • For omega notation, c·g(n) should be less than or equal to f(n) for all n ≥ n0.
  • Here f(n) = 15n³ + n² + 1
  • g(n) = n³
  • Let c = 15
    • 15n³ ≤ 15n³ + n² + 1
    • This holds for every n ≥ 1, so n0 = 1
  • Thus f(n) belongs to the Ω(n³) class.
  • Ω gives the lower bound.
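The choice c = 15, n0 = 1 above can be checked with a quick numeric scan (evidence over a finite range, not a proof):

```python
# Verify 15*n^3 <= 15*n^3 + n^2 + 1 for n = 1 .. 1000 (c = 15, n0 = 1).
ok = all(15 * n**3 <= 15 * n**3 + n**2 + 1 for n in range(1, 1001))
print(ok)  # True
```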

Example 2

  • Show that 2n + 7 is Ω(n).
  • f(n) = 2n + 7
  • g(n) = n
  • We need c·g(n) ≤ f(n) for all n ≥ n0
    • c·n ≤ 2n + 7
    • Let c = 2
    • 2n ≤ 2n + 7, which holds for every n ≥ 1
  • Thus for c = 2 and n0 = 1, f(n) belongs to Ω(n).
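One valid choice of constants for this bound is c = 2 with g(n) = n (constants for Ω are never unique); a quick numeric scan:

```python
# Verify c*g(n) <= f(n) with f(n) = 2n + 7, g(n) = n, c = 2, over n = 1 .. 1000.
ok = all(2 * n <= 2 * n + 7 for n in range(1, 1001))
print(ok)  # True
```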



By Cusp2207 on March 4, 2016 | Algorithms, Computer Science | A comment?

Big O Notation


  • It gives the rate of growth of the step-count function f(n) in terms of a simple function g(n) and defines an asymptotic upper bound on the function (Fig 1).
  • To simplify the estimate of running time, constants and lower-order terms are ignored.
  • A function must satisfy the following condition to be in the Big-O class. Given functions f(n) and g(n), we say f(n) is O(g(n)) if and only if
    • f(n) ≤ c·g(n) for all n ≥ n0,
    • where c and n0 are positive constants.
  • The function n² does not belong to O(n); rather, it belongs to O(n²), because it fails to satisfy the inequality f(n) ≤ c·g(n): n² cannot stay below c·n for all large n when c is a constant.
  • O(..) is the worst-case (upper-bound) notation for an algorithm's complexity.

Fig 1: Big O Notation
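The Big-O condition can be checked numerically in the same spirit as the definition (a sketch; the function name and scan limit are my own, and a finite scan is not a proof):

```python
# Numerically check the Big-O condition: f(n) <= c*g(n) for all n >= n0.
def satisfies_big_o(f, g, c, n0, n_max=1000):
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Example: f(n) = 5n + 3 is O(n) with c = 6, n0 = 3 (since 5n + 3 <= 6n for n >= 3).
print(satisfies_big_o(lambda n: 5 * n + 3, lambda n: n, c=6, n0=3))  # True
```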


Example 1

  • Consider T(n) = 3n + 2
  • g(n) = n
  • f(n) = 3n + 2
  • We need to find the first value of c for which the inequality can hold. Let c = 1
  • f(n) ≤ c·g(n)
  • 3n + 2 ≤ c·n
  • 3n + 2 ≤ 1·n
  • The inequality fails for every positive n, so we cannot choose c = 1.
  • Let c = 2
  • 3n + 2 ≤ 2n also fails, so we cannot choose c = 2.
  • c = 3
  • 3n + 2 ≤ 3n fails as well; we cannot choose c = 3.
  • c = 4
  • 3n + 2 ≤ 4n
  • holds for n ≥ 2
  • So n0 = 2 and c = 4. This is the point where c·g(n) overtakes f(n).
  • Hence f(n) ≤ c·g(n) for all n ≥ n0, and thus f(n) belongs to the Big-O class.
  • The highest power of n is 1.
  • So f(n) belongs to O(n).
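The constants derived above can be checked directly (a finite scan as evidence):

```python
# Verify 3n + 2 <= 4n for all n >= 2 (c = 4, n0 = 2), and that n = 1 fails.
assert all(3 * n + 2 <= 4 * n for n in range(2, 1001))
assert not (3 * 1 + 2 <= 4 * 1)
print("3n + 2 is O(n) with c = 4, n0 = 2")
```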

Example 2

  • Consider T(n) = 2n² + 4
  • g(n) = n²
  • f(n) = 2n² + 4
  • 2n² + 4 ≤ c·n²
  • The inequality is false for all n when c = 1 or c = 2.
  • c = 3

2n² + 4 ≤ 3n²

gives n² ≥ 4, i.e. n = 2 or n = −2; the negative value is discarded.

  • n ≥ 2, so n0 = 2 and c = 3
  • Thus f(n) belongs to O(n²).
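Since the dominant term is quadratic, the bounding function must be quadratic as well; a quick scan confirms the constants c = 3, n0 = 2:

```python
# Verify 2n^2 + 4 <= 3n^2 for all n >= 2 (c = 3, n0 = 2).
assert all(2 * n**2 + 4 <= 3 * n**2 for n in range(2, 1001))
print("2n^2 + 4 is O(n^2) with c = 3, n0 = 2")
```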

Example 3

  • T(n) = 3n³ + 20n² + 5
  • 3n³ + 20n² + 5 ≤ c·n³
  • is true for c = 4 and all n ≥ 21, so n0 = 21
  • Thus 3n³ + 20n² + 5 belongs to O(n³).
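The threshold n0 = 21 is tight for c = 4, which a numeric scan makes visible (n = 20 still violates the inequality):

```python
# Verify 3n^3 + 20n^2 + 5 <= 4n^3 for all n >= 21, and that n = 20 fails.
assert all(3 * n**3 + 20 * n**2 + 5 <= 4 * n**3 for n in range(21, 1001))
assert not (3 * 20**3 + 20 * 20**2 + 5 <= 4 * 20**3)
print("3n^3 + 20n^2 + 5 is O(n^3) with c = 4, n0 = 21")
```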

Example 4

  • T(n) = 3 log n + 5
  • g(n) = log n
  • f(n) = 3 log n + 5
  • f(n) ≤ c·g(n), with c > 0 and n ≥ n0
  • 3 log n + 5 ≤ c·log n
  • With base-2 logarithms, c = 8 and n0 = 2 work (at n = 2, the two sides are equal: 3 + 5 = 8).
  • Thus 3 log n + 5 belongs to O(log n).
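A numeric scan for the logarithmic case, assuming base-2 logarithms as above (the constants would change with the base, but the O(log n) conclusion would not):

```python
import math

# Verify 3*log2(n) + 5 <= 8*log2(n) for all n >= 2 (c = 8, n0 = 2).
# At n = 2 the inequality holds with equality: 3 + 5 = 8.
assert all(3 * math.log2(n) + 5 <= 8 * math.log2(n) for n in range(2, 1001))
print("3 log n + 5 is O(log n) with c = 8, n0 = 2")
```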

Complexities of algorithms

The various Big O complexities are given in Fig 2 and Fig 3.


Fig 2: Complexities of sorting algorithms


Fig 3: Complexities of sorting algorithms




By Cusp2207 on July 26, 2014 | Algorithms, Computer Science | A comment?

Time and Space Complexity

Before defining the actual term complexity, let us discuss a few real-life scenarios. Take the example of a railway reservation counter. People go there to book tickets, and the time to book a ticket depends on how many service windows are open, the queue size, and the time taken by each representative. A person will either join a lane that is very short or stand in a queue where the representative is very fast. Take another example: the seating plan for 40 students. The first thing to consider is the number of chairs needed, and the second is the seating order. If the students sit in an orderly manner, it is easy for the teacher to take attendance and less time is required. Thus every process in the real world depends on how much time it takes to execute and how much space it consumes. Complexity is defined in terms of the resources needed to execute a process; it depends on space as well as time.

It is of two types:

  • Space Complexity
  • Time Complexity

Basic Operation

Basic operation means the main operation required to solve a particular problem. For example, the basic operation in searching is comparison. The complexity depends on the basic operation.

The focus for determining cost is on running time, i.e. time complexity, which depends on the following factors:

  • Size of Input Data
  • Hardware
  • Operating System
  • Programming Language used


Space Complexity

The amount of computer memory required to solve a given problem of a particular size is called its space complexity. Space complexity depends on two components:

  • Fixed part – needed for the instruction space (i.e. byte code), variable space, constant space, etc.
  • Variable part – space for the instances of the input and output data.
  • Space(S) = fixed part + variable part

Example 1

Algorithm sum(x, y)
{
    return x + y;
}

Two computer words are required to store the variables x and y, so

Space Complexity (S) = 2.

Example 2

Algorithm sum(a[], n)
{
    sum = 0;
    for i = 0 to n - 1 do
        sum = sum + a[i];
    return sum;
}


Space complexity is calculated as follows:

  • One computer word is required to store n.
  • n words are required to store the array a[].
  • One word is required for the variable sum and one for i.
  • Total Space(S) = 1 + n + 1 + 1 = n + 3
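The pseudocode above can be sketched in Python; the word counts in the comments follow the tally in the bullets (the variable is renamed `total` only because `sum` is a Python builtin):

```python
def array_sum(a, n):
    # Storage: n -> 1 word, a[] -> n words, total -> 1 word, i -> 1 word.
    # Total space S = n + 3 words.
    total = 0
    for i in range(n):
        total = total + a[i]
    return total

print(array_sum([10, 20, 30], 3))  # 60
```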

Time Complexity

The time required to execute the given problem of a particular size is known as its time complexity. It depends on two components:

  • Fixed part – compile time.
  • Variable part – run time, which depends on the problem instance. Usually only the run time is considered and the compile time is ignored.


Ways to measure time complexity

  • Use a stopwatch; the time is obtained in seconds or milliseconds.
  • Step count – count the number of program steps.
    • Comments are not executed, so they are not counted as program steps.
    • In a while loop, the steps equal the number of times the loop is executed.
    • In a for loop, the steps equal the number of times the loop condition is checked.
    • A single expression is considered a single step. For example, a+f+r+w+w/q-d-f is one step.
  • Rate of growth (asymptotic notations)


Example of step count

Algorithm sum(a[], 5)
{
    sum = 0;
    for i = 0 to 4 do
        sum = sum + a[i];
    return sum;
}




Fig 1 : Step Count
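The step count can be reproduced by instrumenting the algorithm at run time (a sketch; the counting convention matches the bullets above: the for-loop condition is checked n + 1 times and the body runs n times, giving 2n + 3 steps in total):

```python
def array_sum_step_count(a):
    steps = 0
    total = 0
    steps += 1           # sum = 0 -> 1 step
    n = len(a)
    steps += n + 1       # for-loop condition is checked n + 1 times
    for i in range(n):
        total += a[i]
        steps += 1       # loop body executes n times
    steps += 1           # return sum -> 1 step
    return total, steps

# For n = 5 the count is 2*5 + 3 = 13 steps.
print(array_sum_step_count([1, 2, 3, 4, 5]))  # (15, 13)
```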

Step count is a difficult approach when we need to compare results. For example, suppose we use two techniques to solve one problem, where T1(n) is the running time of the first technique and T2(n) is the running time of the second. Say T1(n) = n + 1 and T2(n) = n² + 1. Raw step counts make it awkward to decide which is the better solution, so for comparisons the rate of growth, i.e. the asymptotic notation of the time and space complexity functions, is much more convenient to use.


Asymptotic Notations

These notations give the rate of growth. Instead of dealing with exact expressions, we deal with asymptotic behavior. Asymptotic means approaching a value or curve arbitrarily closely. The focus is on the dominant term of the expression. If we have expressions like T(n) = 3n³ + 5 and T(n) = 7n³ + 10n² + 5, we consider only the highest power, i.e. n³; the analysis does not really change, as both expressions grow like n³. On the contrary, we need to keep the results separate if the expressions are, say, T(n) = 3n + 5 and T(n) = 10n² + 5, because the highest powers differ. The three asymptotic notations are Big O (upper bound), Big Omega (lower bound), and Big Theta (tight bound).

Cases in complexity

  • Best Case
  • Average Case
  • Worst Case

Best Case

It is the minimum time required to solve a given problem of a particular size. For linear search, if the element is found at the first location, that is the best-case scenario and the complexity is O(1), as the running time is minimum.

Average Case

It is the average cost in time for a given problem of a particular size. For example, in linear search with n elements, the item is found on average around the n/2-th location; the average case therefore takes about n/2 comparisons, which is O(n).

Worst Case

The maximum time required to solve a given problem of a particular size. For example, in linear search the worst case is that the item is at the last location or is not present at all. The complexity is O(n).
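The three cases can be seen directly by counting comparisons in a linear search (a sketch; the comparison counts in the comments assume a list of n = 10 elements):

```python
def linear_search(a, target):
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == target:
            return i, comparisons  # found: index and comparisons used
    return -1, comparisons         # not found: n comparisons used

a = list(range(10))
print(linear_search(a, 0))   # best case: (0, 1) -> O(1)
print(linear_search(a, 5))   # average-like case: (5, 6) -> about n/2
print(linear_search(a, 42))  # worst case: (-1, 10) -> O(n)
```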

By Cusp2207 on July 25, 2014 | Algorithms, Computer Science | A comment?