BIG-O NOTATIONS

Big-O notation is used to represent the space or time complexity of an algorithm. There are other notations, such as Theta and Omega, but we'll restrict the discussion to the most frequently used one: Big-O.

Here are some common complexity classes and their Big-O notation that you should keep in mind.

Complexity       | Big-O notation
-----------------|---------------
Constant time    | O(1)
Linear time      | O(n)
Quadratic time   | O(n^2)
Logarithmic time | O(log n)
   
- n is the input size
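
To make the table concrete, here is a minimal sketch in Python of one routine for each complexity class. The function names and the plain list input are illustrative choices, not part of any particular library.

def get_first(items):
    # O(1): one operation, no matter how long the list is
    return items[0]

def find_max(items):
    # O(n): looks at every element exactly once
    largest = items[0]
    for value in items:
        if value > largest:
            largest = value
    return largest

def has_duplicate(items):
    # O(n^2): compares every pair of elements
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def binary_search(sorted_items, target):
    # O(log n): halves the search range on every step
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1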

BIG-O NOTATIONS – GRAPHICAL – YOU ALREADY KNOW IT

As a reminder, you are already familiar with the terms constant, linear, quadratic, and so on. In college you learned how to draw graphs such as a constant line, a linear equation, and a parabola on the x-axis and y-axis.

For example, the constant line y = c, the linear equation y = mx, the quadratic equation y = x^2, and so on.

Big-O notation represents the same idea. For example, f(n) = n + 1 represents linear time, O(n), and f(n) = 5 represents constant time, O(1).

Interesting read: the time complexity of for loops with O(1), O(n), and O(log n), sketched below.
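
Here is a rough sketch of those three loop shapes; the variable names n, i, and total are just illustrative.

n = 1_000_000

# O(1): the body runs a fixed number of times, no matter how big n is
for i in range(10):
    pass

# O(n): the body runs n times
total = 0
for i in range(n):
    total += i

# O(log n): i doubles every iteration, so the loop runs about log2(n) times
i = 1
while i < n:
    i *= 2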

For an algorithm's complexity, we can draw a graph with the input size n (the number of inputs to the algorithm) on the x-axis and a measure of running time (the number of operations the code performs) on the y-axis.

Here are some graphical representations of time complexities.

Look at them and ask yourself whether you are familiar with the graphs shown below. They plot the number of operations N versus the input size n for constant time, linear time, quadratic time, and other functions.

This image is from Wikipedia, where you can see Big-O notation for more functions.
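
Here is a minimal sketch of how such a graph could be built by hand: count loop iterations (as a stand-in for running time) for a constant, a linear, and a quadratic routine at a few input sizes, then plot input size against the counts. The function names here are hypothetical.

def count_constant(n):
    return 1  # the same amount of work for any n

def count_linear(n):
    ops = 0
    for _ in range(n):       # one unit of work per element
        ops += 1
    return ops

def count_quadratic(n):
    ops = 0
    for _ in range(n):
        for _ in range(n):   # n units of work for each of the n elements
            ops += 1
    return ops

for n in (1, 10, 100, 1000):
    print(n, count_constant(n), count_linear(n), count_quadratic(n))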

BIG-O NOTATION ORDER

As a reminder, you should understand which Big-O notations are greater or lesser, and which are best.

“The smaller the Big-O, the better the running time.”

So, here is a small comparison list from GOOD to BAD running time:

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!)
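
A quick way to convince yourself of this ordering is to evaluate each growth rate at a few input sizes and watch how fast each column grows. A small Python sketch, caring only about relative growth:

import math

for n in (2, 8, 16, 32):
    print(n,
          round(math.log2(n), 1),      # O(log n)
          n,                           # O(n)
          round(n * math.log2(n), 1),  # O(n log n)
          n ** 2,                      # O(n^2)
          2 ** n,                      # O(2^n)
          math.factorial(n))           # O(n!)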
