Complexity or Asymptotic Behavior
Describes the effort required for an input of size n.
It is used to analyze an algorithm's efficiency as the input grows toward infinity. (Growth Rate)
In other words: how drastically the time grows as the input grows.
In the real world, memory growth is not computationally free either. In languages like Go or JavaScript you pay even heavier penalties, because memory can be kept around, grows faster, and causes complete halts of the program for cleanup (garbage collection).
But these things are not considered in complexity calculations.
Rules:
Ignore constants:
Constants are any values that do not scale with the input; they are ignored.
Ex.: O(2n) becomes O(n); the 2 is ignored.
Ignore lower-order terms:
Ex.: O(n² + n) becomes O(n²); the n is ignored.
Ex.: O(n³ + n² + n) becomes O(n³); n² + n is also ignored.
Big-O
Describes the upper bound, or the worst case, for a given function.
When evaluating code, you must always consider the costliest path.
The actual cost can be less than that.
Constant O(1) - Excellent
Given an input n, the cost is always a constant k, no matter how big n is.
As if it were executed k times.
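A minimal sketch in Go (illustrative, not from these notes): the lookup below costs the same no matter how large the slice is.

```go
package main

import "fmt"

// first returns the first element of a slice.
// O(1): the cost does not depend on len(items).
func first(items []int) int {
	return items[0]
}

func main() {
	small := []int{1, 2, 3}
	big := make([]int, 1_000_000)
	fmt.Println(first(small)) // same cost...
	fmt.Println(first(big))   // ...regardless of size
}
```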
Logarithmic O(log n) - Excellent
It is when the algorithm behaves like "divide and conquer": you keep dividing the input until there is just 1 element left, and you never linearly scan the divided parts.
You divide the data and analyze only the part you need.
It is much closer to constant time O(1) than to linear O(n).
The larger the input, the more "constant-like" it gets.
Ex.:
Searching for a name in a sorted list: you don't have to look through the entire list, you can start from the names whose first letter matches the one you want. Ex.: Binary Search.
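A minimal Go sketch of Binary Search (an assumed implementation, not taken from these notes):

```go
package main

import "fmt"

// binarySearch returns the index of target in a sorted slice, or -1.
// Each step halves the search space, so the cost is O(log n).
func binarySearch(sorted []int, target int) int {
	lo, hi := 0, len(sorted)-1
	for lo <= hi {
		mid := lo + (hi-lo)/2
		switch {
		case sorted[mid] == target:
			return mid
		case sorted[mid] < target:
			lo = mid + 1 // discard the left half
		default:
			hi = mid - 1 // discard the right half
		}
	}
	return -1
}

func main() {
	fmt.Println(binarySearch([]int{1, 3, 5, 7, 9, 11}, 7)) // 3
}
```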
Linear O(n) - Good
The cost is linearly proportional to the input.
Recursions end up costing just like regular loops.
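A minimal Go sketch of a linear scan (the contains function is an illustrative example):

```go
package main

import "fmt"

// contains scans every element once, so the cost grows
// linearly with the input: O(n).
func contains(items []int, target int) bool {
	for _, v := range items {
		if v == target {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(contains([]int{4, 8, 15, 16, 23, 42}, 23)) // true
}
```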
Linearithmic or Quasilinear O(n log n) - Bad
It happens when you keep dividing the input at each step, as in O(log n), but you also scan the divided parts each time.
Ex.:
Sorting algorithms like Merge Sort.
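A minimal Go sketch of Merge Sort (one common formulation; the details are an assumption, not from these notes):

```go
package main

import "fmt"

// mergeSort splits the input in halves (log n levels of splitting)
// and merges each level in linear time, giving O(n log n) overall.
func mergeSort(items []int) []int {
	if len(items) <= 1 {
		return items
	}
	mid := len(items) / 2
	return merge(mergeSort(items[:mid]), mergeSort(items[mid:]))
}

// merge combines two already-sorted slices in O(n).
func merge(a, b []int) []int {
	out := make([]int, 0, len(a)+len(b))
	i, j := 0, 0
	for i < len(a) && j < len(b) {
		if a[i] <= b[j] {
			out = append(out, a[i])
			i++
		} else {
			out = append(out, b[j])
			j++
		}
	}
	out = append(out, a[i:]...)
	return append(out, b[j:]...)
}

func main() {
	fmt.Println(mergeSort([]int{5, 2, 9, 1, 5, 6})) // [1 2 5 5 6 9]
}
```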
Quadratic O(n²) - Horrible
The cost rises quadratically as n grows.
Ex.:
Nested loops over the same input.
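A minimal Go sketch of a nested loop (the hasDuplicate function is a made-up example):

```go
package main

import "fmt"

// hasDuplicate compares every element against every other element:
// a nested loop over the same input, so the cost is O(n²).
func hasDuplicate(items []int) bool {
	for i := 0; i < len(items); i++ {
		for j := i + 1; j < len(items); j++ {
			if items[i] == items[j] {
				return true
			}
		}
	}
	return false
}

func main() {
	fmt.Println(hasDuplicate([]int{3, 1, 4, 1, 5})) // true
}
```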
Exponential O(2ⁿ) - Horrible
The cost rises exponentially as n grows.
Ex.:
Tower of Hanoi.
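A minimal Go sketch of the Tower of Hanoi recursion (illustrative):

```go
package main

import "fmt"

// hanoi moves n disks from one peg to another. Each call spawns two
// recursive calls, so the total number of moves is 2ⁿ - 1: O(2ⁿ).
func hanoi(n int, from, to, via string) {
	if n == 0 {
		return
	}
	hanoi(n-1, from, via, to)
	fmt.Printf("move disk %d: %s -> %s\n", n, from, to)
	hanoi(n-1, via, to, from)
}

func main() {
	hanoi(3, "A", "C", "B") // prints 7 moves
}
```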
Factorial O(n!) - Horrible
Ex.:
Travelling Salesman Problem (brute force).
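A minimal Go sketch of a brute-force tour search (the shortestTour function and the distance matrix are hypothetical examples, not from these notes):

```go
package main

import "fmt"

// shortestTour tries every ordering of the cities (all n! permutations)
// to find the cheapest round trip: O(n!).
func shortestTour(dist [][]int) int {
	n := len(dist)
	cities := make([]int, n)
	for i := range cities {
		cities[i] = i
	}
	best := -1
	var permute func(k int)
	permute = func(k int) {
		if k == n {
			cost := dist[cities[n-1]][cities[0]] // close the loop
			for i := 0; i < n-1; i++ {
				cost += dist[cities[i]][cities[i+1]]
			}
			if best < 0 || cost < best {
				best = cost
			}
			return
		}
		for i := k; i < n; i++ {
			cities[k], cities[i] = cities[i], cities[k]
			permute(k + 1)
			cities[k], cities[i] = cities[i], cities[k]
		}
	}
	permute(0)
	return best
}

func main() {
	dist := [][]int{
		{0, 2, 9, 10},
		{2, 0, 6, 4},
		{9, 6, 0, 3},
		{10, 4, 3, 0},
	}
	fmt.Println(shortestTour(dist)) // 18: 0 -> 1 -> 3 -> 2 -> 0
}
```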
Big-Ω (Omega)
Describes the lower bound, or the best-case scenario.
The actual cost can be more than that.
Big-Θ (Theta)
Denotes the asymptotically tight bound on the growth rate of the runtime of an algorithm.
This means that it takes exactly the asymptotic time: Θ(f(n)).
Recall: Big-O states that a given function has a complexity of NO MORE THAN some asymptotic notation, while Omega states that it has a complexity of AT LEAST some asymptotic notation.
Theta states both at once: no more than O(f(n)), and no less than Ω(f(n)).
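For reference, the standard formal definitions of the three bounds (not spelled out in these notes), written in LaTeX:

```latex
f(n) = O(g(n))      \iff \exists\, c, n_0 > 0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c, n_0 > 0 \text{ such that } f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
```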