Complexity or Asymptotic Behavior
Describes the effort an algorithm requires given an input of size N.
This is used to analyze an algorithm's efficiency as the input size N grows toward infinity. (Growth Rate)
How drastically the time grows as the input grows.
In the real world
Memory growth is not computationally free. In garbage-collected languages like Go or JavaScript you pay even heavier penalties, because memory can be kept around longer, grow faster, and cause complete halts of the program for cleanup (garbage collection).
But these factors are not considered in complexity calculations.
Rules:
Ignore constants:
Constants are values that do not scale with the input.
Constant values are ignored.
Ex.: O(2N) = O(N); the factor of 2 is ignored.
Ignore lower order terms:
Ex.: O(N² + N) = O(N²); the lower-order N term is ignored.
Ex.: O(N² + 100N + 5) = O(N²); the 100N and the constant 5 are also ignored.
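A minimal sketch of these rules in TypeScript (the function name and inputs are illustrative, not from the original notes): the function below does roughly 2N + 3 units of work, yet it is still classified as O(N), because the constant factor and the constant terms are dropped.

```typescript
// Roughly 2N + 3 "units" of work, but still O(N): the factor of 2 and the
// constant terms do not scale with the input.
function sumAndMax(nums: number[]): { sum: number; max: number } {
    let sum = 0;                                  // 1 unit
    let max = -Infinity;                          // 1 unit
    for (const n of nums) sum += n;               // N units
    for (const n of nums) max = Math.max(max, n); // N more units
    return { sum, max };                          // 1 unit
}
```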
Big-O
When evaluating code, you must always consider the costliest path (the worst case).
Also, Big-O states that a given function will have a complexity of NO MORE THAN some asymptotic bound.
It can be less.
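A small hypothetical example of picking the costliest path: the early return makes the best case constant time, but Big-O is taken from the worst case, so this function counts as O(N).

```typescript
// Best case: the target sits at index 0 and we return immediately.
// Costliest path: the target is absent and we scan all N elements -> O(N).
function contains(haystack: number[], needle: number): boolean {
    for (const value of haystack) {
        if (value === needle) return true; // lucky early exit
    }
    return false;
}
```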

O(1)
Constant - Excellent
Given an input of size N, the cost will always be a constant C, no matter how big N is.
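For illustration, a constant-time lookup (hypothetical function, assuming a non-empty array):

```typescript
// O(1): two index lookups, regardless of how many items the array holds.
// Assumes the array is non-empty.
function firstAndLast(items: number[]): [number, number] {
    return [items[0], items[items.length - 1]];
}
```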
O(log N)
Logarithmic - Excellent
When you repeatedly divide the data and analyze only the part you need.
It is much closer to Constant Time than to Linear Time.
Ex.:
Searching for a name in a sorted list: you don't have to look through the entire list, you can start from the names whose first letter matches the one you want. Ex.: Binary Search.
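A sketch of Binary Search, assuming the input array is already sorted in ascending order:

```typescript
// Each step discards half of the remaining elements, so at most about
// log2(N) steps are needed -> O(log N). Returns the index, or -1 if absent.
function binarySearch(sorted: number[], target: number): number {
    let lo = 0;
    let hi = sorted.length - 1;
    while (lo <= hi) {
        const mid = Math.floor((lo + hi) / 2);
        if (sorted[mid] === target) return mid;
        if (sorted[mid] < target) lo = mid + 1; // target can only be in the right half
        else hi = mid - 1;                      // target can only be in the left half
    }
    return -1;
}
```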
O(N)
Linear - Good
The cost grows linearly in proportion to the input size N.
Simple recursions over the input end up costing the same as regular loops.
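A small illustration (hypothetical functions) of a loop and a recursion doing the same O(N) work:

```typescript
// Both versions touch each element exactly once, so both are O(N);
// the recursion is just a loop in disguise.
function sumLoop(nums: number[]): number {
    let total = 0;
    for (const n of nums) total += n;
    return total;
}

function sumRecursive(nums: number[], i = 0): number {
    if (i >= nums.length) return 0;             // base case: past the end
    return nums[i] + sumRecursive(nums, i + 1); // one call per element
}
```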
O(N log N)
Linearithmic or Quasilinear - Bad
As if an O(log N) operation were executed N times.
Ex.:
Some sorting algorithms, like Merge Sort.
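A minimal Merge Sort sketch, to show where the N and the log N come from:

```typescript
// The array is halved about log2(N) times, and every level of halving does an
// O(N) merge, so the total work is O(N log N).
function mergeSort(nums: number[]): number[] {
    if (nums.length <= 1) return nums;
    const mid = Math.floor(nums.length / 2);
    const left = mergeSort(nums.slice(0, mid));
    const right = mergeSort(nums.slice(mid));

    // Merge the two sorted halves in linear time.
    const merged: number[] = [];
    let i = 0;
    let j = 0;
    while (i < left.length && j < right.length) {
        merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
    }
    return merged.concat(left.slice(i), right.slice(j));
}
```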
O(N²)
Quadratic - Horrible
The cost rises quadratically as N grows.
Ex.:
Nested loops
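A typical nested-loop example (hypothetical function): for each of the N elements we look at up to N other elements, so the work grows with N².

```typescript
// O(N²): the inner loop runs up to N times for each of the N outer iterations.
function hasDuplicate(nums: number[]): boolean {
    for (let i = 0; i < nums.length; i++) {
        for (let j = i + 1; j < nums.length; j++) {
            if (nums[i] === nums[j]) return true;
        }
    }
    return false;
}
```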
O(2^N)
Exponential - Horrible
The cost rises exponentially as N grows (it doubles with every element added).
Ex.:
Tower of Hanoi.
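A sketch of the Tower of Hanoi recursion; moving N disks needs 2^N - 1 moves:

```typescript
// Moving N disks means moving N-1 disks out of the way, moving the largest disk,
// then moving the N-1 disks back on top: 2^N - 1 moves in total -> O(2^N).
function hanoi(n: number, from: string, to: string, via: string, moves: string[] = []): string[] {
    if (n === 0) return moves;
    hanoi(n - 1, from, via, to, moves);        // clear the smaller disks
    moves.push(`disk ${n}: ${from} -> ${to}`); // move the largest disk
    hanoi(n - 1, via, to, from, moves);        // restack the smaller disks
    return moves;
}

// hanoi(3, "A", "C", "B") returns 2^3 - 1 = 7 moves.
```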
O(N!)
Factorial - Horrible
Ex.:
Travelling Salesman Problem (brute-force solution).
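A brute-force sketch (hypothetical shortestTour function, assuming a square matrix of pairwise distances): it tries every ordering of the cities, which is what makes it factorial.

```typescript
// Fixing city 0 as the start, every permutation of the remaining cities is
// tried, so (N-1)! tours are checked -> factorial growth.
function shortestTour(distances: number[][]): number {
    const n = distances.length;
    const rest = Array.from({ length: n - 1 }, (_, i) => i + 1);
    let best = Infinity;

    const permute = (remaining: number[], last: number, length: number) => {
        if (remaining.length === 0) {
            best = Math.min(best, length + distances[last][0]); // close the tour
            return;
        }
        for (let i = 0; i < remaining.length; i++) {
            const next = remaining[i];
            permute(
                remaining.filter((_, j) => j !== i), // cities not yet visited
                next,
                length + distances[last][next],
            );
        }
    };

    permute(rest, 0, 0);
    return best;
}
```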
Big-Omega
Big-Omega states that a given function will have a complexity of AT LEAST some asymptotic bound.
It can be more.
Theta
Denotes the asymptotically tight bound on the growth rate of an algorithm's runtime.
This means that it takes exactly that asymptotic time:
No more than it (the Big-O bound).
And no less than it (the Big-Omega bound).
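For reference, the standard formal definitions of the three bounds:

```latex
% Standard formal definitions of the asymptotic bounds.
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c \cdot g(n) \ \text{for all } n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
```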