Complexity or Asymptotic Behavior

Describes the effort required given an input of size N.

  • This is used to analyze an algorithm's efficiency as the input grows toward ∞. (Growth Rate)

  • How drastically the time grows as the input grows.

In the real world

Memory growth is not computationally free. In languages like Go or JavaScript you pay even heavier penalties, because memory can be kept around longer, grows faster, and garbage collection can cause complete halts in the program for cleanup.

But these costs are not considered in complexity calculations.

Rules:

  • Ignore constants:

    • Constants are values that do not scale with the input.

    • Constant values are ignored.

    • Ex.: O(2N), the 2 is ignored.

  • Ignore lower order terms:

    • Ex.: O(N) + 10, the 10 is ignored.

    • Ex.: O(N²) + 10N, the 10N is also ignored.
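
For instance, here is a minimal sketch in Go (the function name is illustrative): a function that walks its input twice does roughly 2N work, but the constant 2 is dropped and the complexity is still O(N).

```go
package main

import "fmt"

// sumTwice walks the slice twice, so it does roughly 2N operations.
// The constant factor 2 is dropped: the complexity is still O(N).
func sumTwice(xs []int) int {
	total := 0
	for _, x := range xs { // first pass: N operations
		total += x
	}
	for _, x := range xs { // second pass: another N operations
		total += x
	}
	return total
}

func main() {
	fmt.Println(sumTwice([]int{1, 2, 3})) // 12
}
```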

Big-O (O)

Describes the upper bound, or the worst case, for a given function.

Also, Big-O states that a given function f() will have a complexity of NO MORE THAN some asymptotic bound.

It can be less.

O(1)

Given an input N, the cost is always constant (1), no matter how big N is.
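
A minimal Go sketch (names illustrative): indexing into a slice is a single operation, so it costs the same whether the slice holds three elements or a million.

```go
package main

import "fmt"

// first returns the first element of a slice. Indexing is a single
// operation, so the cost is O(1) regardless of len(xs).
func first(xs []int) int {
	return xs[0]
}

func main() {
	small := []int{1, 2, 3}
	large := make([]int, 1_000_000)
	fmt.Println(first(small), first(large)) // same cost for both
}
```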

O(log N)

This is when the algorithm behaves like "Divide and Conquer": you keep dividing until there is just 1 element left. (You do not linearly scan the divided parts.)

You divide the data and analyze only the part you need.

It is much closer to Constant Time than Linear.

The larger the input, the closer to constant the behavior gets.

Ex.:

Searching for a name in a list: you don't have to look through the entire list, but can start from the names whose first letter matches the one you want. Ex.: Binary Search.
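
A minimal sketch of Binary Search in Go: the sorted input is halved on every step, so at most log₂(N) elements are inspected.

```go
package main

import "fmt"

// binarySearch halves the sorted slice on every step, so it inspects
// at most log2(N) elements: O(log N).
func binarySearch(xs []int, target int) int {
	lo, hi := 0, len(xs)-1
	for lo <= hi {
		mid := lo + (hi-lo)/2
		switch {
		case xs[mid] == target:
			return mid
		case xs[mid] < target:
			lo = mid + 1 // discard the left half
		default:
			hi = mid - 1 // discard the right half
		}
	}
	return -1 // not found
}

func main() {
	xs := []int{1, 3, 5, 7, 9, 11}
	fmt.Println(binarySearch(xs, 9)) // 4
}
```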

O(N)

The cost is linearly proportional to the input.

Recursion over the input ends up costing the same as a regular loop.
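
A Go sketch (names illustrative) showing a loop and a recursion that both touch each element once, so both are O(N).

```go
package main

import "fmt"

// sumLoop touches each element once: O(N).
func sumLoop(xs []int) int {
	total := 0
	for _, x := range xs {
		total += x
	}
	return total
}

// sumRecursive makes one call per element, so it is also O(N),
// just like the loop above.
func sumRecursive(xs []int) int {
	if len(xs) == 0 {
		return 0
	}
	return xs[0] + sumRecursive(xs[1:])
}

func main() {
	xs := []int{1, 2, 3, 4}
	fmt.Println(sumLoop(xs), sumRecursive(xs)) // 10 10
}
```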

O(N log N)

As if it were log N executed N times.

It can occur when you keep dividing your input at each step, like in O(log N), but you also scan each part every time.

Ex.:

Some sorting algorithms, like Merge Sort.
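
A minimal Merge Sort sketch in Go: there are log N levels of splitting, and each level scans all N elements during the merge.

```go
package main

import "fmt"

// mergeSort splits the input in half log N times, and each level of
// splitting scans all N elements during the merge: O(N log N).
func mergeSort(xs []int) []int {
	if len(xs) <= 1 {
		return xs
	}
	mid := len(xs) / 2
	return merge(mergeSort(xs[:mid]), mergeSort(xs[mid:]))
}

// merge combines two sorted slices, scanning every element once.
func merge(a, b []int) []int {
	out := make([]int, 0, len(a)+len(b))
	i, j := 0, 0
	for i < len(a) && j < len(b) {
		if a[i] <= b[j] {
			out = append(out, a[i])
			i++
		} else {
			out = append(out, b[j])
			j++
		}
	}
	out = append(out, a[i:]...)
	out = append(out, b[j:]...)
	return out
}

func main() {
	fmt.Println(mergeSort([]int{5, 2, 4, 1, 3})) // [1 2 3 4 5]
}
```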

O(N²)

The cost rises quadratically with N.

Ex.:

Nested loops
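
A Go sketch (name illustrative): comparing every pair of elements with nested loops runs the inner loop up to N times for each of the N outer iterations, giving O(N²).

```go
package main

import "fmt"

// hasDuplicate compares every pair of elements. The outer loop runs N
// times and the inner loop up to N times for each: O(N^2).
func hasDuplicate(xs []int) bool {
	for i := 0; i < len(xs); i++ {
		for j := i + 1; j < len(xs); j++ {
			if xs[i] == xs[j] {
				return true
			}
		}
	}
	return false
}

func main() {
	fmt.Println(hasDuplicate([]int{1, 2, 3, 2})) // true
}
```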

O(2^N)

The cost rises exponentially with N.

Ex.:

Tower of Hanoi.
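
A minimal Tower of Hanoi sketch in Go: each call spawns two recursive calls, so moving n disks takes 2^n - 1 moves.

```go
package main

import "fmt"

// hanoi moves n disks from peg `from` to peg `to`. Each call spawns
// two recursive calls, so moving n disks takes 2^n - 1 moves: O(2^N).
func hanoi(n int, from, to, via string) int {
	if n == 0 {
		return 0
	}
	moves := hanoi(n-1, from, via, to) // move n-1 disks out of the way
	fmt.Printf("move disk %d: %s -> %s\n", n, from, to)
	moves++
	moves += hanoi(n-1, via, to, from) // move them back on top
	return moves
}

func main() {
	fmt.Println("total moves:", hanoi(3, "A", "C", "B")) // 7
}
```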

O(N!)

The cost rises factorially with N.

Ex.:

Travelling Salesman Problem (solved by brute force).
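
A brute-force sketch in Go (the names and distance matrix are illustrative): trying every ordering of the cities means N! permutations, each costing O(N) to evaluate.

```go
package main

import "fmt"

// shortestTour tries every ordering of the cities, which means N!
// permutations, each costing O(N) to evaluate: O(N!).
// dist is a symmetric distance matrix; the tour starts and ends at city 0.
func shortestTour(dist [][]int) int {
	n := len(dist)
	cities := make([]int, 0, n-1)
	for i := 1; i < n; i++ {
		cities = append(cities, i)
	}
	best := -1
	permute(cities, 0, func(order []int) {
		cost, prev := 0, 0
		for _, c := range order {
			cost += dist[prev][c]
			prev = c
		}
		cost += dist[prev][0] // return home
		if best < 0 || cost < best {
			best = cost
		}
	})
	return best
}

// permute generates every permutation of xs[k:] in place.
func permute(xs []int, k int, visit func([]int)) {
	if k == len(xs) {
		visit(xs)
		return
	}
	for i := k; i < len(xs); i++ {
		xs[k], xs[i] = xs[i], xs[k]
		permute(xs, k+1, visit)
		xs[k], xs[i] = xs[i], xs[k]
	}
}

func main() {
	dist := [][]int{
		{0, 10, 15, 20},
		{10, 0, 35, 25},
		{15, 35, 0, 30},
		{20, 25, 30, 0},
	}
	fmt.Println(shortestTour(dist)) // 80
}
```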

Big-Omega (Ω)

Describes the lower bound, or the best-case scenario.

Also, Omega states that a given function f() will have a complexity of AT LEAST some asymptotic bound.

It can be more.

Theta (Θ)

Denotes the asymptotically tight bound on the growth rate of an algorithm's runtime.

This means that it takes exactly the asymptotic time:

  • No more than O().

  • And no less than Ω().

Ω() = Θ() = O()
