Before defining the term complexity formally, let us discuss a few real-life scenarios. Take the example of a railway reservation counter, where people go to book their tickets. The time to book a ticket depends on how many service windows are open, the length of the queue, and the time taken by each representative. A person will either join a queue that is very short or stand in a queue whose representative is very fast. Take another example: the seating plan for 40 students. The first thing to consider is the number of chairs needed; the second is the order of seating. If the students sit in roll-number order, it is easy for the teacher to take attendance and less time is required. Thus every process in the real world depends on how much time it takes to execute and how much space it consumes. Complexity is defined as the cost of executing a process, measured in terms of both time and space.
It is of two types:
- Space Complexity
- Time Complexity
A basic operation is the main operation required to solve a particular problem; for example, the basic operation in searching is comparison. The complexity depends on the basic operation.
The cost is usually determined in terms of running time, i.e. time complexity, which depends on the following factors:
- Size of Input Data
- Operating System
- Programming Language used
The amount of computer memory required to solve a given problem of a particular size is called its space complexity. Space complexity has two components:
- Fixed Part - needed for instruction space (i.e. byte code), simple variable space, constant space, etc.
- Variable Part - depends on the instance of the input and output data.
- Space(S) = Fixed Part + Variable Part
For example, in an algorithm that works only with two variables x and y, two computer words are required to store them, so
Space Complexity (S) = 2.
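The algorithm behind this S = 2 example is not shown in the text; a minimal sketch, assuming it is something like adding two numbers (the function name `add` is mine):

```python
def add(x, y):
    # The only storage this algorithm ever uses is the two scalar
    # inputs x and y: one computer word each, so S = 2 regardless
    # of the values passed in.
    return x + y

print(add(2, 3))  # -> 5
```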
Algorithm sum(a, n)
    sum = 0
    for i = 1 to n do
        sum = sum + a[i]
The space complexity is calculated as follows:
- One computer word is required to store n.
- n computer words are required to store the array a.
- One word each is required for the variables sum and i.
- Total Space (S) = 1 + n + 1 + 1 = n + 3
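The same algorithm can be sketched in Python, with the word-count accounting from the list above attached as comments (variable names are mine):

```python
def array_sum(a):
    n = len(a)                # 1 word to store n
    total = 0                 # 1 word for the running sum ("sum" in the text)
    for i in range(n):        # 1 word for the loop index i
        total = total + a[i]  # the array a itself occupies n words
    return total

# Total space S(n) = n + 1 + 1 + 1 = n + 3 computer words
print(array_sum([1, 2, 3, 4]))  # -> 10
```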
The time required to execute a given problem of a particular size is known as its time complexity. It depends on two components:
- Fixed Part - compile time.
- Variable Part - run time, which depends on the problem instance. Usually only the run time is considered and the compile time is ignored.
Ways to measure time complexity:
- Execution timing - use a stopwatch; the time is obtained in seconds or milliseconds.
- Step Count - count the number of program steps.
  - Comments are not evaluated, so they are not counted as program steps.
  - In a while loop, the steps equal the number of times the loop executes.
  - In a for loop, the steps equal the number of times the condition expression is checked.
  - A single expression is counted as a single step; for example, a+f+r+w+w/q-d-f is one step.
- Rate of Growth (Asymptotic Notations)
Example of step count
Algorithm sum(a, 5)          // total steps = 1 + 6 + 5 = 12
    sum = 0                  // 1 step
    for i = 1 to 5 do        // condition checked 6 times = 6 steps
        sum = sum + a[i]     // executed 5 times = 5 steps
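The step count above can be reproduced by instrumenting the loop in code. This is a minimal sketch; the counter variable `steps` is mine and is not part of the algorithm being measured:

```python
def array_sum_steps(a, n):
    """Sum the first n elements of a while counting program steps."""
    steps = 0
    total = 0
    steps += 1        # initialising sum: 1 step
    i = 0
    while True:
        steps += 1    # for-loop condition check: n + 1 steps in total
        if i >= n:
            break
        total = total + a[i]
        steps += 1    # assignment inside the loop: n steps in total
        i += 1
    return total, steps

# For n = 5: steps = 1 + 6 + 5 = 12
print(array_sum_steps([1, 2, 3, 4, 5], 5))  # -> (15, 12)
```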
The step count is a difficult approach when we need to compare results. For example, suppose we use two techniques to solve one problem, where T1(n) is the running time of the first technique and T2(n) is the running time of the second. Say T1(n) = n + 1 and T2(n) = n² + 1. From the raw counts alone it is hard to decide which one is the better solution, so for comparison the 'rate of growth', i.e. the asymptotic notation of the time and space complexity functions, is much more convenient to use.
These notations give the rate of growth. Instead of dealing with exact expressions we deal with asymptotic behavior. Asymptotic means to approach a value or curve arbitrarily closely; the focus is on the dominant term of the expression. If we have expressions like T(n) = 3n³ + 5 and T(n) = 7n³ + 10n² + 5, we consider only the highest power, i.e. n³. The analysis does not really change, as both expressions grow like n³. On the contrary, we need to separate our results if the expressions are T(n) = 3n³ + 5 and T(n) = 10n² + 5, since the highest powers differ. The three asymptotic notations are:
- Big-O (O) - upper bound
- Big-Omega (Ω) - lower bound
- Big-Theta (Θ) - tight bound
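The claim that only the highest power matters can be checked numerically. A small sketch, using one of the example expressions from the text:

```python
def T(n):
    # Example expression from the text: T(n) = 3n^3 + 5
    return 3 * n**3 + 5

# The ratio T(n) / n^3 approaches the constant 3 as n grows, so
# T(n) grows like n^3 up to a constant factor: lower-order terms
# and constants vanish asymptotically.
for n in (10, 100, 1000):
    print(n, T(n) / n**3)
```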
Cases in complexity
- Best Case
- Average Case
- Worst Case
It is the minimum time required to solve a given problem of a particular size. For example, in linear search, if the element is found at the first location, that is the best-case scenario and the complexity will be O(1), as the running time is minimum.
It is the average cost and time required for a given problem of a particular size. For example, in linear search with n elements, if the item is found at location n/2, that is the average-case scenario and the complexity will be O(n/2).
It is the maximum time required to solve a given problem of a particular size. For example, in linear search the worst case is that the item is present at the last location or is not present at all; the complexity will be O(n).
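The three cases of linear search can be demonstrated by counting the basic operation (comparison) directly. A minimal sketch; the function name and the returned comparison count are mine:

```python
def linear_search(arr, target):
    """Return (index, comparisons), or (-1, comparisons) if target is absent."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1          # the basic operation: one comparison
        if value == target:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 1, 5]
print(linear_search(data, 7))   # best case:  found at index 0, 1 comparison -> O(1)
print(linear_search(data, 5))   # worst case: found at the end, 5 comparisons -> O(n)
print(linear_search(data, 2))   # worst case: absent, 5 comparisons -> O(n)
```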