All About Algorithms, Part 1

Urvashi
4 min read · Jun 29, 2021

Hello everyone…I'm very excited to direct and produce my own original blog post 😃. So let's get started on an adventurous run-through of my thoughts.

First, introducing myself 👐 I am a 2nd-year B.Tech CSE student with a secret passion for teaching. Through this series of blogs I want to share, in very simple language, the experience and knowledge I am gathering on my journey, because experience is something a person spends a whole life acquiring, and it is a great pleasure to share it when it can solve the problems of many. So get set and go….

You know, the best part of being in the field of computer science is exploration and analysis. Let's explore our first level: “ALGORITHMS”. An algorithm is a step-by-step description of how a problem is solved, written in simple language; in other words, a plain-English version of what the code will do.
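
For example, here is the same idea, checking whether a number appears in a list, written first as plain-language steps and then as code (a minimal Python sketch; the function name and the sample list are just made up for illustration):

```python
# The algorithm in plain language:
# 1. Look at each element of the list one by one.
# 2. If the current element equals the target, answer "yes".
# 3. If we reach the end without finding it, answer "no".

def contains(items, target):
    for item in items:        # step 1: examine each element in turn
        if item == target:    # step 2: found it
            return True
    return False              # step 3: reached the end without a match

print(contains([4, 8, 15, 16], 15))  # True
```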

A problem can have multiple solutions, but which one should we choose? ANY GUESSES? The one that fulfills our requirements, i.e. the best choice is not always the one that takes the least time or space; there are more things to take into consideration. Before checking those out, let's look at what time and space complexity are.

Time complexity describes the total running time of an algorithm as a function of its input size. So does that mean different computers will have different complexities!! The answer is no, because time complexity depends on the way the algorithm is implemented rather than on the hardware. Hardware does matter for the actual computation, but the analysis of an algorithm is done purely in terms of the number of inputs and the kind of steps the algorithm performs.

This function of n is called the order of growth, because we are concerned with the behaviour of the algorithm at large values of n, i.e. its asymptotic behaviour as n approaches infinity, where only the highest power of the function matters. An order of growth is therefore a set of functions that show the same asymptotic behaviour; for example, 2n, 10n, or any function of the form a·n + c asymptotically behaves like n. Time complexity is denoted by using these functions as the argument of the notations Θ (theta), O (big O) and Ω (omega).
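
One quick way to see why only the highest power matters is a small numerical check (a toy Python sketch; the function f here is just an example I made up):

```python
# For f(n) = n^2 + 100n, the n^2 term dominates as n grows,
# so asymptotically f(n) behaves like n^2.
def f(n):
    return n ** 2 + 100 * n

for n in (10, 100, 10_000):
    # the ratio f(n) / n^2 approaches 1: the lower-order term stops mattering
    print(n, f(n) / n ** 2)
```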

Observe the general time complexities and how they relate to these functions. Source: https://www.hackerearth.com/practice/notes/sorting-and-searching-algorithms-time-complexities-cheat-sheet/

Worst Case Complexity: Represented using O (Big O), which gives us an upper bound on the function. This is what really matters in the real world, since our aim is always to tackle the worst case efficiently. In general we usually state time complexity in Big O terms because, logically, if we know the upper bound we also get some idea of where the lower bound can lie…

Average Case Complexity: Represented using Θ(g(n)), which gives a tight bound on the function, i.e. c1·g(n) ≤ f(n) ≤ c2·g(n), where c1 and c2 are positive constants. Here f(n) is the function our algorithm actually follows and g(n) is the function that bounds it. So if we say the time complexity is f(n) = Θ(g(n)), you can visualise f(n) being sandwiched between two scaled copies of g(n). That gives an easy picture of the time the algorithm takes, doesn't it?
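
As a concrete made-up instance, take f(n) = 2n + 3 and g(n) = n. Choosing c1 = 2 and c2 = 5 gives 2n ≤ 2n + 3 ≤ 5n for every n ≥ 1, so f(n) = Θ(n). A tiny Python check of that claim:

```python
# f(n) = 2n + 3 is sandwiched between 2*g(n) and 5*g(n) with g(n) = n
# for all n >= 1, so f(n) = Theta(n).
f = lambda n: 2 * n + 3
g = lambda n: n

assert all(2 * g(n) <= f(n) <= 5 * g(n) for n in range(1, 10_000))
print("2n <= 2n + 3 <= 5n holds for n = 1 .. 9999")
```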

Best Case Complexity: Represented using Ω(g(n)), which gives a lower bound on the function, i.e. c1·g(n) ≤ f(n) for some positive constant c1. Just think: we should be concerned with the worst case complexity, since it gives the upper bound, so what's the need of calculating the best one?? 🤔 It just gives an idea of how efficient our algorithm can be, at best, for some instance of the input…

CAUTION: The best case is not about a smaller value of n; it is about the type of input for a particular problem that yields the correct result in the least time, and that least time is the best case. The same reasoning applies to the worst case complexity.
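
Linear search is a handy illustration of this point (my own sketch, not taken from any library): for the same input size, the array where the target sits at the front is a best-case input (about 1 step, Ω(1)), while the array where it sits at the end, or is missing entirely, is a worst-case input (about n steps, O(n)):

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

data = list(range(1, 1001))     # the input SIZE is the same in both calls below
linear_search(data, 1)          # best case: target is at the first position, ~1 step
linear_search(data, 1000)       # worst case: target is at the last position, ~n steps
```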

Space Complexity: From the name it's clear that it relates to storage space. The notations are the same as for time complexity, but instead of how running time is consumed, we consider how space consumption grows with input size.

Let's come back to our analysis. So let's start with sorting algorithms!! The major ones are bubble, insertion, selection, merge, quick and heap sort. They all have different time complexities, but which is the best? All of them are, and each is used according to our need. For example, selection sort takes O(n²) time in every case, but it is useful when swapping is expensive, because it does not go around swapping elements all the time, and it uses only a temporary variable as extra space, so its space complexity is roughly O(1), you can say.
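
A minimal selection sort sketch in Python (my own illustrative version, not a library routine) shows both points: it always performs on the order of n² comparisons, but it makes at most n - 1 swaps and uses only O(1) extra space:

```python
def selection_sort(a):
    """Sort the list a in place: O(n^2) comparisons, at most n - 1 swaps, O(1) extra space."""
    n = len(a)
    for i in range(n - 1):
        # find the index of the smallest remaining element
        smallest = i
        for j in range(i + 1, n):
            if a[j] < a[smallest]:
                smallest = j
        # at most one swap per pass, no matter how unsorted the data is
        if smallest != i:
            a[i], a[smallest] = a[smallest], a[i]
    return a

print(selection_sort([29, 10, 14, 37, 13]))  # [10, 13, 14, 29, 37]
```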

QuickSort real-time graph using JFLAP, with input size up to 100,000.

So I hope you guys liked this introduction to algorithms and their analysis and found it interesting and productive. For any suggestions from your side, please do comment. In the follow-up blogs there will be more applications, like sorting algorithms, their optimal solutions and many more important algorithms, so stay tuned….

RECOMMENDATIONS: For deeper theory, refer to Introduction to Algorithms by Cormen, Leiserson, Rivest and Stein (CLRS).
