In asymptotic analysis, we evaluate the performance of an algorithm in terms of input size, not the actual running time.
For example, if we want to search for an item in a sorted array, we can do it in two ways:
- Linear Search - takes linear time.
- Binary Search - takes logarithmic time (a small sketch follows below).
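Here is a minimal binary search sketch for comparison with the linear search listing later in this post. It assumes the array is already sorted; the function name and the convention of returning -1 for a missing element are just illustrative choices.

```python
def binary_search(sorted_list, search_element):
    """Return the index of search_element in sorted_list, or -1 if absent.

    Each step halves the remaining range, so it takes O(log n) comparisons.
    """
    low, high = 0, len(sorted_list) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_list[mid] == search_element:
            return mid
        elif sorted_list[mid] < search_element:
            low = mid + 1
        else:
            high = mid - 1
    return -1


if __name__ == "__main__":
    array = [2, 3, 4, 7, 8, 9]          # binary search needs a sorted array
    print(binary_search(array, 9))      # 5
```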
So, does asymptotic analysis always give a perfect result?
No, but it is the best approach available for evaluating an algorithm. Let's say two algorithms take:
- 2000nLogn
- 5nLogn
In asymptotic analysis, we always ignore constants, so both of the above become:
- nLogn
- nLogn
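As a quick, purely illustrative check of why the constant can be ignored, the snippet below prints the ratio of the two running-time functions for growing n. The ratio stays fixed at 400, so both functions grow at the same rate.

```python
import math

# 2000nLogn and 5nLogn differ only by a constant factor,
# so asymptotically both are treated as nLogn.
for n in (10, 1_000, 100_000):
    f = 2000 * n * math.log2(n)
    g = 5 * n * math.log2(n)
    print(f"n = {n}: ratio = {f / g:.1f}")   # ratio stays at 400.0
```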
Cases: Worst, Best, and Average
There are three ways to evaluate an algorithm:
- Worst case (used in most cases)
- Average case (used sometimes)
- Best case (usually not used to evaluate an algorithm)
```python
'''
Linear search example
Time complexity: O(n)
'''
def search_item(array_list, search_element):
    found = False
    for i in range(len(array_list)):
        if array_list[i] == search_element:
            found = True
            print(f"Element {search_element} found at position {i + 1}")
            break
    if not found:
        print(f"{search_element} is not found in the array list")


if __name__ == "__main__":
    array = [4, 2, 8, 9, 3, 7]
    search_element = 9
    search_item(array, search_element)

'''
output:
Element 9 found at position 4
'''
```
Worst Case Analysis:
In this case, we calculate the upper bound of the running time: the input that forces the maximum number of operations. If anyone asks, "What is the maximum time this algorithm can take to execute?", we answer with worst case analysis.
In linear search, the worst case happens when the searched item is not present in the list/array at all.
Then every element has to be compared with the searched item, so the running time is proportional to the number of elements.
Here, the time complexity is ϴ(n) for n elements.
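A small illustrative sketch of this: a linear search that counts its comparisons. Searching for a value that is not in the list (5 here, an arbitrary choice) forces a comparison with every one of the n elements.

```python
def count_comparisons(array_list, search_element):
    """Linear search that returns how many comparisons it made."""
    comparisons = 0
    for value in array_list:
        comparisons += 1
        if value == search_element:
            break
    return comparisons


array = [4, 2, 8, 9, 3, 7]
# Worst case: 5 is not in the list, so all 6 elements are compared.
print(count_comparisons(array, 5))   # 6
```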
Average Case Analysis:
In this case, we take all possible inputs, calculate the running time for each, sum those values, and divide by the total number of inputs. This is not easy to do in general, because we have to know the mathematical distribution of the input values, so it is rarely done.
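For linear search, though, the average case can be worked out under a simple assumption: if the searched item is equally likely to be at any of the n positions, the expected number of comparisons is (1 + 2 + ... + n) / n = (n + 1) / 2, which is still ϴ(n). A quick check of that arithmetic:

```python
# Average comparisons for linear search when the target is equally likely
# to be at any of the n positions: (1 + 2 + ... + n) / n = (n + 1) / 2.
n = 6
average = sum(range(1, n + 1)) / n
print(average)        # 3.5
print((n + 1) / 2)    # 3.5, matches the closed form
```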
Best Case Analysis:
We calculate the lower bound of the running time: the input that needs the minimum number of operations.
In linear search, the best case happens when the searched item is at the first position of the list/array.
Here, the time complexity is ϴ(1).
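A tiny illustrative check: when the searched item is at the first position, a linear search stops after a single comparison, no matter how long the list is.

```python
def comparisons_made(array_list, search_element):
    """Return the number of comparisons a linear search performs."""
    for count, value in enumerate(array_list, start=1):
        if value == search_element:
            return count
    return len(array_list)


# Best case: the searched item sits at the first position,
# so one comparison is enough, even for a list of a million elements.
print(comparisons_made([4, 2, 8, 9, 3, 7], 4))          # 1
print(comparisons_made(list(range(1, 1_000_001)), 1))   # 1
```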
Since the lower bound of an algorithm does not give much useful information, programmers usually don't use it to evaluate an algorithm.
Most of the time, worst case analysis is used.
For some algorithms, all cases are asymptotically the same. Merge Sort is a good example: it performs ϴ(nLogn) operations in every case.
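A minimal Merge Sort sketch (illustrative, not tuned) shows why: it always splits the list in half and merges the halves back, regardless of whether the input is already sorted, reversed, or random, so the amount of work is ϴ(nLogn) in every case.

```python
def merge_sort(items):
    """Sort a list by recursively splitting it and merging the halves.

    The split/merge work is the same for every input, so best, average,
    and worst case are all ϴ(nLogn).
    """
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


if __name__ == "__main__":
    print(merge_sort([4, 2, 8, 9, 3, 7]))   # [2, 3, 4, 7, 8, 9]
```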